TikTok, Snapchat, YouTube Face Lawmaker Scrutiny Over Eating Disorder Content

Protecting Children Online: Social Media Companies Face Senate Scrutiny
Today, representatives from TikTok, Snapchat, and YouTube appeared before the Senate Subcommittee on Consumer Protection, Product Safety, and Data Security for a hearing on how to keep children safe online.
The inquiry was prompted by internal documents that former Facebook employee Frances Haugen leaked to the Wall Street Journal. Those documents revealed that Facebook knew Instagram can harm the well-being of teenage girls: internal research indicated that 32% of teen girls reported feeling worse about their bodies after using Instagram.
Problem Extends Beyond Facebook
While the Senate is focused on holding Facebook accountable for its impact on young girls, legislators recognize that the problem is not limited to Mark Zuckerberg’s company. Each of the companies present at the hearing maintains policies against content promoting eating disorders. However, Senators presented evidence from their constituents detailing instances of teenagers on these platforms suffering from conditions like anorexia and bulimia.
Senator Blumenthal, chair of the subcommittee, highlighted this issue in his opening remarks. His office created a test account posing as a teenager on YouTube and easily found videos about extreme dieting and eating disorders. The account was then recommended more content on those topics, which he described as a “rabbit hole” with no easy escape.
Investigations Reveal Troubling Content
Similar findings were reported regarding TikTok. The Wall Street Journal conducted an investigation using 31 bot accounts, simulating users aged 13 to 15. Despite TikTok’s ban on content glorifying eating disorders, the investigation revealed that these accounts were still shown several such videos.
Senator Klobuchar questioned Michael Beckerman, TikTok’s head of Public Policy for the Americas, regarding the platform’s efforts to prevent the promotion of content related to eating disorders, drugs, and violence to teenagers.
TikTok's Response and Transparency
Beckerman expressed disagreement with the Wall Street Journal’s methodology, noting the use of bots programmed to seek out specific content. However, he affirmed that TikTok has improved user controls for algorithm management and age-appropriate content visibility.
He stated that content concerning drugs violates community guidelines and that 97% of content breaching minor safety policies is proactively removed. A recently published transparency report supports these figures, detailing content removal between April and June 2021. The report indicated proactive removal of 97.6% of content violating minor safety policies, with 93.9% of those videos having zero views. For content related to “suicide, self-harm and dangerous acts” – including eating disorder promotion – 94.2% was proactively removed, and 81.8% of videos had zero views.
Research and Internal Studies
Senator Klobuchar further inquired whether TikTok had conducted research on the platform’s potential to promote eating disorder content to teens, and if Beckerman had requested any internal studies on the topic prior to the hearing. Beckerman answered negatively to both questions, but reiterated TikTok’s collaboration with external experts.
Company Efforts to Address the Issue
Senator Baldwin asked each company to detail their strategies for removing content that promotes unhealthy body image and eating disorders, and for directing users to supportive resources. Her focus was specifically on younger users.
Beckerman emphasized TikTok’s “aggressive” removal of such content and its collaboration with organizations to support users in need. He referenced TikTok’s recent expansion of mental health resources, initiated after Instagram faced criticism for its impact on teen girls. This included a memo within TikTok’s Safety Center addressing the impact of eating disorders, developed in partnership with the National Eating Disorders Association (NEDA).
TikTok also restricts weight loss advertising. In September 2020, it updated its policies to ban ads for fasting apps and weight loss supplements and to tighten restrictions on ads promoting negative body image. The change followed a Rolling Stone report about TikTok advertising fasting apps to teenage girls. However, weight management product ads are still permitted for users over 18.
Snapchat and YouTube's Policies
Jennifer Stout, Snapchat’s Vice President of Global Public Policy, stated that content promoting eating disorders violates their community guidelines. Snapchat directs users searching for terms like “anorexia” or “eating disorder” to relevant expert resources.
Snapchat’s ad policies do not ban diet and weight loss ads outright, but certain content is prohibited. Ads cannot promote weight loss supplements, make exaggerated claims, or display “before and after” weight loss images.
Leslie Miller, YouTube’s Vice President of Government Affairs and Public Policy, also confirmed that YouTube prohibits content glorifying eating disorders. YouTube’s ad policy allows weight loss ads as long as the imagery is not disturbing.
Potential for Positive Content
Representatives from both TikTok and YouTube highlighted the potential for social media to provide support, such as videos documenting recovery from eating disorders. This content can be empowering and help teens feel less alone.
Miller stated that YouTube’s algorithms prioritize content offering positive support when users search for eating disorder-related terms. She noted that over 90% of guideline violations are detected through technology, with human moderators also playing a role.
Concerns Remain
Toward the hearing’s conclusion, Senator Blumenthal revisited his opening statement, referencing his office’s ability to quickly find banned content on TikTok using fake teenage accounts.
“How do you explain to parents why TikTok is inundating their kids with these kinds of videos of suicide, self-injury and eating disorders?” Senator Blumenthal asked.
Beckerman responded that he could not comment on the specific examples from the Senator’s staff, but assured that this was not the typical user experience on TikTok.
Skepticism and Future Legislation
Despite the companies’ accounts of their ad policies and content moderation efforts, Senators remained skeptical that the platforms would cooperate with legislation aimed at making social media safer for children.
Senator Blumenthal concluded the hearing by stating that he would not accept the day’s testimony at face value. “The time for platitudes and bromides is over,” he declared.