Facebook’s self-styled ‘oversight’ board selects first cases, most dealing with hate speech

A body established and financed by Facebook, intended to put distance between the technology company and its most contentious content moderation decisions, has revealed the initial set of cases it will review.
According to a press release published on its website, the Facebook Oversight Board (FOB) evaluated over 20,000 submissions before selecting six cases for consideration, one of which was directly submitted by Facebook itself.
The six cases selected for initial review are:
Facebook submission: 2020-006-FB-FBR
This case originates from France, involving a user’s post within a COVID-19 Facebook group. The post, consisting of a video and accompanying text, concerned assertions regarding the French agency responsible for regulating health products, specifically allegations of refusing authorization for hydroxychloroquine and azithromycin while approving promotional material for remdesivir. The user also expressed criticism of France’s public health strategy and referenced “[Didier] Raoult’s cure” as a successful treatment elsewhere. Facebook removed the content, citing a violation of its policy regarding violence and incitement. The video in question received at least 50,000 views and 1,000 shares.
Facebook explained in its referral that this case exemplifies the difficulties encountered when addressing the potential for real-world harm stemming from misinformation related to the COVID-19 pandemic.
User submissions:
Of the five cases submitted by users that the FOB has chosen to examine, the majority—three in total—pertain to the removal of content flagged as hate speech.
One case concerns Facebook’s policies on nudity and adult content, while another relates to its guidelines regarding dangerous individuals and organizations.
The Board’s descriptions of the five user-submitted cases are provided below:
- 2020-001-FB-UA: A user shared a screenshot of two tweets by former Malaysian Prime Minister Dr Mahathir Mohamad, in which he stated that “Muslims have a right to be angry and kill millions of French people for the massacres of the past” and “[b]ut by and large the Muslims have not applied the ‘eye for an eye’ law. Muslims don’t. The French shouldn’t. Instead the French should teach their people to respect other people’s feelings.” The user did not include a caption with the screenshots. Facebook removed the post for violating its policy on hate speech. The user appealed to the Oversight Board, stating their intention was to raise awareness of the former Prime Minister’s statements.
- 2020-002-FB-UA: A user posted two widely recognized photographs of a deceased child lying on a beach. The accompanying text, written in Burmese, questioned why there was no response to China’s treatment of Uyghur Muslims, contrasting it with recent events in France involving cartoons. The post also mentioned the Syrian refugee crisis. Facebook removed the content for violating its hate speech policy. The user’s appeal to the Oversight Board indicated the post was intended to challenge those who supported the perpetrator and to emphasize the value of human life over religious beliefs.
- 2020-003-FB-UA: A user posted alleged historical photographs of churches in Baku, Azerbaijan, accompanied by text claiming Baku was founded by Armenians and questioning the fate of those churches. The user asserted that Armenians were restoring mosques on their land, while accusing others of destroying churches and lacking historical roots. The user expressed opposition to “Azerbaijani aggression” and “vandalism.” The content was removed for violating Facebook’s hate speech policy. The user explained in their appeal that their intention was to highlight the destruction of cultural and religious monuments.
- 2020-004-IG-UA: A user in Brazil posted a picture on Instagram, intending to raise awareness about the signs of breast cancer. The picture contained eight photographs illustrating breast cancer symptoms, each with an explanation. Five of the photographs showed visible female nipples, while the remaining three depicted female breasts with nipples either out of frame or covered by a hand. Facebook removed the post for violating its policy on adult nudity and sexual activity. The post featured a pink background, and the user informed the Oversight Board that it was shared as part of the national “Pink October” campaign for breast cancer prevention.
- 2020-005-FB-UA: A user in the US was prompted by Facebook’s “On This Day” feature to reshare a post from two years prior. The user reshared the content. The post, written in English, contains an alleged quote from Joseph Goebbels, the Reich Minister of Propaganda in Nazi Germany, emphasizing the importance of appealing to emotions and instincts over intellect and dismissing the significance of truth. Facebook removed the content for violating its policy on dangerous individuals and organizations. The user appealed to the Oversight Board, arguing that the quote was relevant due to their belief that the current US presidency was exhibiting fascist tendencies.
The FOB’s website allows for public comments on these cases, but submissions will only be accepted for a period of seven days, concluding at 8:00 Eastern Standard Time on Tuesday, December 8, 2020.
The FOB anticipates reaching a decision on each case—and having Facebook implement that decision—within 90 days. Consequently, the first outcomes from the FOB, which began reviewing cases in October, are unlikely to be available before 2021.
Panels consisting of five FOB members—including at least one member from the region affected by the content—will determine whether the content in question should remain removed or be reinstated.
Facebook’s delegation of a limited set of content moderation decisions to its so-called ‘Oversight Board’ has drawn considerable criticism, including the launch of a parallel, unofficial body that calls itself the Real Facebook Oversight Board, along with a healthy degree of skepticism.
This skepticism is fueled by the fact that the board is entirely funded by Facebook, structured to Facebook’s own design, and made up of members chosen through a process that Facebook itself established.
If substantial change is the goal, the FOB is unlikely to deliver it.
Furthermore, the entity lacks the authority to modify Facebook’s policies; it can only offer recommendations that Facebook is free to disregard.
Its scope does not extend to investigating how Facebook’s business model, focused on maximizing user attention, influences the content that is amplified or suppressed by its algorithms.
The board’s concentration on content removals—rather than content that is already permitted on the platform—limits its perspective, as has been previously noted.
Therefore, the board will not be examining why hate groups continue to thrive and recruit on Facebook, or thoroughly investigating the extent to which its algorithmic amplification has supported the anti-vaccination movement. By its very nature, the FOB addresses symptoms, not the fundamental issues within Facebook itself. Outsourcing a small fraction of content moderation decisions can never amount to anything more than that.
Through this Facebook-sponsored display of accountability, the tech giant hopes to generate positive publicity centered on specific and ‘complex’ content decisions. The aim is to divert attention from more direct and awkward questions about the exploitative and harmful nature of Facebook’s business practices, and about the legality of its extensive surveillance of Internet users, at a moment when legislators around the world are seeking to regulate technology companies.
The company wants the FOB to reframe the culture wars (and related problems) that its business model exacerbates as an issue for society at large, while proposing a self-serving ‘solution’ to algorithmically driven societal division: a select group of experts offering opinions on individual pieces of content, leaving Facebook free to continue shaping the attention economy on a global scale.