Parents Protest Meta Over Online Harms to Children

Families Demand Accountability from Meta Following Online Harms
Meta has worked to thwart bipartisan legislation aimed at bolstering online child protection, but pressure from parents affected by online harms continues to mount against social media companies.
Vigil Held to Honor Lost Children
On Thursday, 45 families who have lost children to online harms, ranging from cyberbullying to sextortion, held a vigil outside a Meta office in Manhattan. The event served as both a memorial for their children and a plea for action and accountability from the company.
Attendees, many clad in white and carrying roses, displayed signs proclaiming “Meta profits, kids pay the price” alongside framed photographs of their deceased children. This poignant scene stood in stark contrast to the pleasant spring weather in New York City.
Shared Experiences of Being Ignored
While each family’s circumstances differ, a commonality unites them: a feeling of being ignored by technology companies when they tried to report incidents involving their children. That observation comes from Sarah Gardner, CEO of the child safety advocacy group Heat Initiative and one of the event’s organizers, who spoke to TechCrunch.
Personal Stories of Loss
Perla Mendoza shared the heartbreaking story of her son, who died of fentanyl poisoning after acquiring drugs from a dealer on Snapchat. She is among numerous parents who have taken legal action against Snap, alleging the company did too little to prevent illicit drug sales on the platform, both before and after her son’s death.
Mendoza said she reported a dealer advertising hundreds of pills on Snapchat, but it took the company eight months to flag the account. She added that the same dealer was also active on Facebook and Instagram, operating multiple accounts.
Whistleblower Testimony and Past Concerns
The vigil occurred following recent disclosures from whistleblower Sarah Wynn-Williams, detailing Meta’s practice of targeting advertisements towards individuals aged 13 to 17 who were experiencing feelings of sadness or depression. It also follows the 2021 publication of The Facebook Files by The Wall Street Journal, which revealed the company’s awareness of Instagram’s detrimental effects on the mental health of teenage girls, despite public downplaying of the issue.
Demands for Change
Organizers, including ParentsTogether Action and Design It for Us, presented an open letter to Meta CEO Mark Zuckerberg bearing more than 10,000 signatures. The letter calls on Meta to stop promoting harmful content to children, including sexualizing material, racism, hate speech, and content promoting eating disorders.
It also demands the prevention of sexual predators and other malicious actors from exploiting Meta platforms to target children, alongside the provision of transparent and swift resolutions to reports of problematic content or interactions.
The letter was placed among rose bouquets outside Meta’s Wanamaker Place office as protesters chanted, “Build a future where children are respected.”
Meta’s Existing Safety Measures
Over the past year, Meta has implemented new safety protocols for children and teenagers on Facebook and Instagram. These include collaboration with law enforcement and other tech companies to combat child exploitation. The introduction of Teen Accounts on Instagram, Facebook, and Messenger limits contact options and content visibility for younger users.
Instagram has also begun utilizing artificial intelligence to identify teenagers who misrepresent their age to circumvent safety features.
Meta’s Response
Sophie Vogel, a Meta spokesperson, stated to TechCrunch that the company recognizes parental concerns regarding online safety. She highlighted the significant changes made to the Instagram experience for teens with Teen Accounts, which address key parental concerns. Vogel noted that 94% of parents find these features helpful.
She also mentioned the development of safety features to prevent abuse, such as warnings when teens interact with individuals from other countries, and a recent partnership with Childhelp to launch an online safety curriculum for middle school students.
Concerns About Adequacy of Safeguards
Gardner contends that Meta’s current measures do not go far enough to close existing safety gaps. She points out that, despite stricter messaging policies for teens, adults can still contact minors through comments on their posts and send them friend requests.
“Researchers have been able to create profiles posing as 12- or 13-year-olds and quickly encounter extremist, violent, or sexualized content,” Gardner explained. “This demonstrates that the current measures are ineffective and inadequate.”
Meta Disputes Claims
Vogel countered that Teen Accounts are private by default, and users under 16 need parental permission to change that setting, which prevents unsolicited contact from adults. She also asserted that anyone entering a birthdate indicating they are under 13 cannot create an account, which she said would prevent researchers from setting up profiles posing as young teens.
Shifting Content Moderation Policies
Gardner also criticized Meta’s recent rollback of its fact-checking and content moderation policies in favor of community notes, calling it a sign that the company is taking on less responsibility.
Opposition to the Kids Online Safety Act
Meta actively lobbied against the Kids Online Safety Act, which failed to pass Congress in late 2024 after having been approved by the Senate. The bill would have established rules requiring social media platforms to mitigate harms related to addiction and mental health.
Call for Parental Involvement
“The purpose of today is to demonstrate to Mark Zuckerberg that parents are deeply concerned about this issue, not only those who have lost children, but also other Americans who are becoming aware of this reality and questioning whether he should be making decisions about their children’s online safety,” Gardner concluded.
This story has been updated with additional information from Meta.