
Meta Drops Fact-Checking, Changes Content Moderation

January 7, 2025

Meta Announces Significant Changes to Content Moderation Policies

Meta, the parent company of Facebook, Instagram, and WhatsApp, has revealed a substantial revision of its content moderation policies. These adjustments involve removing certain safeguards implemented over the past several years.

Those safeguards were originally put in place after criticism that the company had inadvertently facilitated the spread of political and health-related misinformation.

Key Changes Outlined by Meta

Joel Kaplan, Meta’s newly appointed chief global affairs officer, detailed the changes in a blog post titled “More Speech and Fewer Mistakes.” He described the alterations as an effort to “undo the mission creep” in three primary areas:

  • Meta is discontinuing its collaboration with third-party fact-checkers. The company will transition to a Community Notes system in the coming months.
  • Restrictions on “topics that are part of mainstream discourse” are being lifted. Enforcement will now concentrate on “illegal and high-severity violations,” including terrorism, child sexual exploitation, fraud, drug-related content, and scams.
  • Users will be encouraged to personalize their experience with political content. This will likely result in increased exposure to opinions and perspectives aligned with individual preferences, effectively fostering echo chambers.

Timing and Political Implications

These changes are particularly noteworthy as they precede the inauguration of a new presidential administration in the United States.

Donald Trump and his allies have advocated a broader interpretation of free speech, emphasizing the importance of diverse viewpoints.

Facebook has faced scrutiny from these critics in recent years, most notably during the period when Trump himself was banned from the platform over content moderation decisions.

Evolution of Meta’s Content Moderation

Meta’s content moderation procedures were developed and refined over several years, in response to public and political concern about the spread of election misinformation and inaccurate COVID-19 advice.

The initial fact-checking program was launched in 2016, responding to accusations that Facebook was being exploited to disseminate false information during the U.S. presidential election.

This led to the establishment of an Oversight Board and the rollout of additional moderation tools, which aimed to give users greater control over the content they encountered and to flag potentially harmful or misleading material to Meta.

Criticism of Existing Policies

However, these policies have not been universally accepted. Some argue they are insufficient, while others contend they result in excessive errors and perceived political bias.

Kaplan acknowledged that “experts…have their own biases and perspectives,” which influenced their fact-checking choices. He further stated that “over-enforcing” rules limited legitimate political debate and censored trivial content.

Meta estimates that one to two out of every ten censored items were “mistakes” that did not actually violate established policies.

Shifting Priorities and Internal Changes

Some observers suggest these changes are intended to align Meta with the incoming administration. However, the shift in approach has been developing for some time.

In the past year, even Meta’s commitment to its own rules began to waver. Nick Clegg, the company’s outgoing policy chief, recently admitted to overzealous moderation in an interview.

Furthermore, the Oversight Board has not proven as effective as initially anticipated.

With accountability potentially shifting alongside political tides, Meta appears to be adopting a more hands-off approach.

Emphasis on Free Expression

“Meta’s platforms are built to be places where people can express themselves freely. That can be messy,” Kaplan wrote. “On platforms where billions of people can have a voice, all the good, bad and ugly is on display. But that’s free expression.”

The Oversight Board welcomed Meta’s revision of its fact-checking approach, saying it aims to find a scalable way to enhance trust, free speech, and user voice on the platforms.

The board also said it intends to work with Meta on shaping its approach to “free speech in 2025.”

Internal Restructuring at Meta

These developments coincide with broader changes within Meta itself.

CEO Mark Zuckerberg has indicated a desire to collaborate with, rather than oppose, the incoming administration. The company recently appointed three new board members, including Dana White, a supporter of the incoming president.

Additionally, Meta replaced its longtime public affairs head, Nick Clegg, with Kaplan, a prominent Republican within the company.

Kaplan also announced that the trust and safety teams responsible for content policies and review will relocate from California to Texas and other U.S. locations.


#Meta #Facebook #fact-checking #content moderation #social media #news