
Facebook Documents Reveal Moderation & Misinformation Issues

October 25, 2021

The Fallout from the Facebook Papers: A Deep Dive

A substantial collection of internal documents, known as the Facebook Papers, has been released to the public through a collaborative effort involving multiple news organizations. These documents, provided by former employee Frances Haugen, reveal a company allegedly prioritizing market dominance and profitability over the well-being of its users. According to the documents, internal warnings about the real-world harm stemming from the company’s engagement-focused strategies were consistently downplayed.

Concerns Over Amplified Hate Speech

The Washington Post reports that, during congressional testimony, Facebook CEO Mark Zuckerberg seemingly minimized the extent to which the platform amplified hate speech. Internal documentation, however, suggests he was aware of a far larger problem than he publicly acknowledged: the social network reportedly removed less than 5 percent of hate speech on the platform, and executives – including Zuckerberg – understood Facebook’s role in increasing societal polarization. Facebook has countered these assertions, stating the documents are being misinterpreted.

Suppressed Voter Registration and COVID-19 Misinformation

Zuckerberg is also facing accusations of blocking a Spanish-language voter registration initiative ahead of the 2020 US elections, citing concerns that it could be perceived as partisan. Staff members within WhatsApp subsequently proposed a scaled-back version, which was implemented in partnership with external organizations. The CEO also allegedly opposed stricter measures against COVID-19 misinformation early in the pandemic, fearing a negative impact on “MSI [Meaningful Social Interaction]” – a key internal metric. Facebook disputes this claim, asserting the documents have been mischaracterized.

Neglect of Developing Nations

According to Reuters, Facebook has consistently under-resourced content moderation efforts in numerous developing countries, leading to the proliferation of hate speech and extremist content. This shortfall includes insufficient hiring of moderators proficient in local languages and possessing a nuanced understanding of cultural contexts. Consequently, the company places undue reliance on automated moderation systems that prove ineffective outside of English-speaking regions. Facebook refutes this accusation, maintaining its commitment to users globally.

Myanmar and Ethiopia: Case Studies in Moderation Failures

Myanmar is specifically highlighted as a region where Facebook’s actions have exacerbated existing tensions. A 2020 internal document indicated that the platform’s automated systems were unable to identify problematic terms in Burmese. That failure echoes a 2018 report by Business for Social Responsibility, which found Facebook had not done enough to prevent its platform from being used to foment civil unrest in Myanmar.

Similarly, Facebook reportedly lacked the necessary tools to detect hate speech in the Ethiopian languages of Oromo and Amharic. The company states it is actively expanding its content moderation team and has recently added speakers of Oromo, Amharic, Burmese, and other languages.

The Role of "Likes" and "Shares"

The New York Times reports that Facebook’s internal research recognized the role of core platform features – the Like and Share buttons – in accelerating the spread of hate speech. An internal document titled “What Is Collateral Damage” warned that if the company failed to act, it would be “actively (if not necessarily consciously) promoting these types of activities.” Facebook contends that these statements rest on flawed assumptions and that deliberately harming users would be illogical.

Declining Engagement and Metric Misrepresentation

Bloomberg has reported on a potential decline in Facebook’s engagement metrics. Younger demographics, crucial for attracting advertisers, are reportedly spending less time on the platform, and fewer teenagers are registering. Furthermore, the number of users in these age groups may be inflated due to the creation of multiple accounts – often referred to as “Finstas” – used to maintain separate online personas. Haugen alleges that Facebook “has misrepresented core metrics to investors and advertisers,” and that these duplicate accounts contribute to “extensive fraud” against advertisers. Facebook states it already informs advertisers about the possibility of reaching duplicate accounts and discloses this issue in its SEC filings.

Looking Ahead: More Scrutiny Expected

Sir Nick Clegg, Facebook’s head of global affairs, reportedly warned that the company should anticipate “more bad headlines” in the coming weeks. With the ongoing release of documents from the Facebook Papers, further testimony from Frances Haugen in the UK, and the potential emergence of additional whistleblowers, Facebook is likely to remain under intense scrutiny for the foreseeable future.

Editor’s note: This article originally appeared on Engadget.

Tags: Facebook, misinformation, content moderation, internal documents, social media, leaks