
FDA Regulation of Instagram’s Algorithm - A Call for Action

September 15, 2021

Instagram's Detrimental Impact on Teen Mental Health

Recent reporting by the Wall Street Journal has confirmed long-suspected concerns regarding Instagram’s effects on adolescent well-being. The platform is demonstrably linked to negative mental health outcomes, even contributing to suicidal ideation in vulnerable young people.

Data Reveals a Troubling Trend

Specifically, 32% of teenage girls experiencing body image issues report that Instagram exacerbates these feelings. Furthermore, the WSJ report indicates that 13% of British and 6% of American teens with suicidal thoughts attribute those thoughts, at least in part, to their Instagram usage.

It’s important to note that these figures represent Facebook’s own internal data, suggesting the actual impact may be even more severe.

A Call for Regulation: The FDA's Role

Drawing Parallels to the Food and Drug Administration

The regulatory authority that became the Food and Drug Administration originated with the Pure Food and Drug Act of 1906, championed by President Theodore Roosevelt and Congress after industry failed to prioritize public welfare. Similarly, Instagram’s promotion of unattainable lifestyles, showcased at events like the Met Gala, necessitates regulatory intervention.

The time has come for the FDA to assert its authority over the algorithms that drive Instagram’s addictive and potentially harmful features.

Algorithms as Drugs: A Novel Approach

The FDA should treat algorithms that affect mental health the way it treats pharmaceutical drugs. The Federal Food, Drug, and Cosmetic Act defines drugs, in part, as articles intended to affect the structure or any function of the body. By Facebook’s own research, Instagram’s technology alters how the brain functions.

Should this approach prove unsuccessful, Congress and President Biden should consider establishing a dedicated FDA for mental health.

Understanding Facebook's Prioritization

The Need for Transparency

The public deserves insight into the priorities guiding Facebook and Instagram’s algorithms. Our government already requires clinical trials to assess the safety of products that can cause physical harm.

Researchers should be empowered to study Facebook’s algorithmic choices and their impact on mental well-being. This is feasible because Facebook already conducts such research; it simply chooses to suppress the findings.

The News Feed Experiment

As detailed in “An Ugly Truth,” Facebook briefly prioritized “News Ecosystem Quality” (NEQ) scores in November 2020, favoring trustworthy sources over unreliable ones. This resulted in a “nicer News Feed” with reduced misinformation. However, Mark Zuckerberg reversed this change due to concerns about decreased engagement and potential conservative backlash, ultimately prioritizing profits over public benefit.

"Good for the World" vs. Engagement

Facebook has also investigated the effects of prioritizing content deemed “good for the world” over content designed for maximum engagement. Predictably, engagement declined. This demonstrates Facebook’s awareness of its algorithm’s profound influence on the public’s mindset.

Historical Precedent and the Need for Action

Echoes of "The Jungle"

Upton Sinclair’s “The Jungle” exposed dangerous practices in the food industry, leading to public outrage and the passage of the 1906 Pure Food and Drug Act. The free market failed to protect consumers, necessitating government regulation.

Today, we must regulate the algorithms that impact our mental health. Teen depression rates have risen sharply since 2007, and suicide rates among those aged 10-24 have increased by nearly 60% between 2007 and 2018.

Correlation and Causation

While establishing a direct causal link between social media and these trends is difficult, it is unreasonable to deny that social media is a contributing factor. Filter bubbles, online bullying, and constant connectivity all take a toll on mental health.

Navigating the Complexities of Regulation

Section 230 and the First Amendment

Section 230 of the Communications Decency Act rightly protects internet platforms from liability for user-generated content, and as a private company, Facebook is not bound by the First Amendment. Maintaining public trust, however, requires that content moderation be perceived as fair.

Zuckerberg's Equivocation

Mark Zuckerberg has historically hesitated to take decisive action against harmful content, banning Holocaust deniers, Donald Trump, and anti-vaccine activists only after considerable pressure. Facebook’s reactive, cautious approach is insufficient. The company’s primary focus remains engagement and growth, often at the expense of user well-being.

The "Ugly Truth" Memo

Bosworth's Candid Assessment

Andrew “Boz” Bosworth’s 2016 memo, titled “The Ugly,” offered a starkly honest assessment of Facebook’s impact.

Bosworth acknowledged that Facebook’s tools can contribute to suicides and be used to coordinate terrorist attacks, yet he argued that growth should be prioritized above all else.

Concentrated Power and its Risks

This concentration of power in a single corporation, controlled by one individual, poses a significant threat to democracy and our way of life.

Addressing Concerns and Moving Forward

The Argument Against Regulation

Critics of FDA regulation will likely argue it represents an overreach of government authority and an infringement on personal liberties. However, what is the alternative to demanding transparency and accountability from Facebook?

Is it acceptable for a company to prioritize sessions, time spent, and revenue growth over the collective mental health of its users?

The Cost of Inaction

Ignoring the problem will not make it disappear. Allowing a single individual to determine what is “right” based on business imperatives is unacceptable. The FDA must step in and decide.

Tags: Instagram algorithm, FDA regulation, social media, mental health, algorithm regulation, social media addiction