Oversight Board Urges Meta to Address Ethiopia Violence Spread

December 14, 2021
Facebook's Oversight Board Addresses Misinformation in Ethiopia

The Oversight Board, established by Facebook to scrutinize its policy choices, recently issued a statement regarding a case involving misinformation originating from Ethiopia. The board cautioned the company about the potential dangers of allowing unchecked hate speech and unsubstantiated claims to proliferate within active conflict zones.

Details of the Case

The case centered on a post written in Amharic by a Facebook user based in Ethiopia. The post accused the Tigray People’s Liberation Front (TPLF) of perpetrating murder, rape, and looting in Raya Kobo and other populated areas of the Amhara region. It further alleged that Tigrayan civilians were complicit in these acts.

The Oversight Board’s assessment highlighted the lack of supporting evidence. “Despite the user’s claim of relying on prior, unnamed reports and on-the-ground sources, no concrete evidence was presented to substantiate these allegations,” the board stated.

The Dangers of Unverified Claims

The board emphasized the perilous nature of such rumors. “Assertions that implicate an ethnic group in widespread atrocities, as seen in this instance, are inherently dangerous and substantially elevate the risk of immediate violence.”

Initially, Facebook’s automated systems flagged and removed the post. The platform’s Amharic-speaking content review team determined it breached the platform’s guidelines concerning hate speech. However, Facebook subsequently reversed this decision and reinstated the content following an appeal to the Oversight Board.

Oversight Board's Decision and Concerns

The Oversight Board overturned Facebook’s reinstatement of the post. It based its decision on a violation of Facebook’s policies against inciting violence, rather than the hate speech rules initially cited by the platform.

The board voiced significant concern that the dissemination of unverified information in areas experiencing violence, such as Ethiopia, could “result in severe atrocities, mirroring the situation observed in Myanmar.”

Parallels to Myanmar and the Rohingya Crisis

This case echoes concerns raised by a recent lawsuit. A group of Rohingya refugees in the U.S. filed a $150 billion class-action suit against Meta, asserting that Facebook’s introduction to Myanmar was a “critical turning point” in the genocide against the Rohingya people.

Misinformation fueling ethnic violence in Myanmar spread extensively on Facebook, often originating from military sources. This is widely believed to have intensified the violence targeting the country’s Muslim minority and led to widespread displacement.

Frances Haugen's Warnings

Frances Haugen, a former Facebook whistleblower, has repeatedly pointed to algorithmically amplified ethnic violence in countries like Myanmar and Ethiopia – and Meta’s insufficient response – as a major threat. She warned Congress in October that “the events in Myanmar and the current situation in Ethiopia represent only the initial stages of a narrative with a potentially devastating conclusion.”

Recommendations for Meta

The Oversight Board directed Meta to commission an independent human rights assessment. This assessment should focus on Facebook and Instagram’s contribution to escalating the risk of ethnic violence in Ethiopia.

Furthermore, the board requested an evaluation of Meta’s capacity to effectively moderate content in the country’s various languages.

Meta's Response

Last month, Meta defended its existing safety measures in Ethiopia. The company highlighted the expanded application of its rules against misinformation and hate speech.

Meta also stated that it has enhanced its enforcement capabilities over the past two years, and that it can now review content in Amharic, Oromo, Somali, and Tigrinya — the four most widely spoken languages in Ethiopia.
