
Adam Mosseri Defends Instagram's Teen Safety Record to Congress

December 8, 2021

Instagram Under Scrutiny: Testimony Before Congress

Adam Mosseri, the head of Instagram, appeared before Congress on Wednesday for the first time, addressing concerns regarding the app's influence on adolescents and its plans for expanding to a younger demographic.

Leaked documents originating from Facebook whistleblower Frances Haugen revealed internal awareness of the platform’s potential negative effects on vulnerable users. These disclosures sparked the congressional hearing.

Internal research found that 32% of teenage girls said that when they felt bad about their bodies, Instagram made them feel worse. The studies also examined suicidal ideation: among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced those thoughts to the platform.

This research, initially conducted internally by Meta (formerly Facebook), came to light through reports published in The Wall Street Journal. Lawmakers repeatedly referenced these documents during Wednesday’s hearing, seeking further access to Instagram’s internal assessments of its impact on young people.

Concerning Discoveries

Richard Blumenthal, the subcommittee chair (D-CT), stated in his opening remarks that a recent test by his staff quickly uncovered algorithmic recommendations promoting harmful content. Specifically, pro-anorexia and eating disorder content was readily accessible.

Marsha Blackburn’s (R-TN) office also conducted a test, creating a teen account that defaulted to “public” visibility, contrary to Instagram’s stated policy for users under 16. Mosseri acknowledged a failure to implement that safety measure for accounts created on the web.

Blackburn expressed frustration, noting this was the fourth time in two years that Meta representatives had been questioned, with little discernible change. She said the conversations felt repetitive and unproductive.

Mosseri defended the platform, echoing Meta’s response to previous criticism. He disputed some of the findings, even those that seemed intuitively valid. When questioned about Instagram’s potentially addictive nature, he asserted that research did not support the claim.

Antigone Davis, Facebook’s Global Head of Safety, previously testified before the subcommittee, emphasizing the safety measures in place for users aged 13 to 17. She argued the company was actively working to create a secure environment.

Meta consistently defended its practices, asserting that existing precautions were sufficient and that the leaked research had been misinterpreted. A Facebook research leader maintained that the research did not demonstrate Instagram was inherently “toxic” for teenage girls.

In response to the growing criticism, Mosseri announced a pause on the development of Instagram Kids, an app designed for children under 13. The company continues to face scrutiny from mental health professionals and lawmakers who question its responsibility regarding the well-being of children and teenagers.

Mosseri reiterated the argument that children are already using the platform despite age restrictions. He suggested a dedicated app for younger users would provide a safer, more controlled experience. He acknowledged that Instagram was not originally intended for this age group.

Proposed Self-Regulation by Meta

Mosseri proposed the creation of an “industry body” to establish best practices for age verification, parental controls, and product design for young users. He also indicated Instagram’s willingness to adhere to rules set by this body, potentially forfeiting some Section 230 protections.

Blumenthal criticized the self-regulation proposal, questioning the enforcement mechanisms. Mosseri hesitated to endorse Blumenthal’s suggestion of U.S. Attorney General oversight. Blumenthal concluded that relying on trust was no longer a viable solution.

Representatives from YouTube, Snap, and TikTok testified before Congress in October, focusing on contrasting their policies with those of Facebook. Blumenthal dismissed comparisons, stating that simply being different from Facebook was not a sufficient defense.

Instagram recently began testing “Take a Break,” a feature that reminds users to step away from the app after extended scrolling sessions. Alongside this, the company announced that its first parental controls would arrive in March 2022, allowing parents to monitor and limit app usage, though these controls are less comprehensive than those offered by competitors such as TikTok.

Tags: Adam Mosseri, Instagram, teen safety, Congress, social media, hearing