Facebook content moderators demand safer working conditions

Over 200 Facebook content moderators, joined by some full-time Facebook employees, are demanding that the company stop putting their health and safety at risk, in an open letter addressed to Facebook and to Accenture and Covalen, the companies that manage its outsourced content moderation. The letter follows a report by The Intercept detailing how some Facebook content moderators, who review disturbing material such as sexual abuse and violent imagery, were required to return to the office during the pandemic. Shortly after that return, one moderator tested positive for COVID-19.
The group stated, “Having permitted content moderators to work remotely for several months, and under considerable pressure to maintain a Facebook platform free from harmful and misleading information, you have mandated our return to the office.” The letter notes that moderators who provided medical documentation of their own COVID-19 risk were excused from in-person attendance, but that those living with family members vulnerable to severe illness from COVID-19 were not granted the same consideration.
The moderators are now asking Facebook to let anyone identified as high-risk, or anyone living with a high-risk person, keep working from home indefinitely. More broadly, they want Facebook to maximize remote work across its workforce.
“You have previously stated that content moderation cannot be effectively performed remotely due to security concerns,” they explained. “If this is the case, a fundamental restructuring of the work process is necessary. A culture of unnecessary secrecy currently exists within Facebook. While some content, such as that pertaining to criminal activity, may require moderation within Facebook offices, the remainder should be completed remotely.”
They are also seeking hazard pay, comprehensive healthcare including psychiatric support, and direct employment by Facebook rather than through outsourcing arrangements.
A Facebook spokesperson told TechCrunch in a statement, “We recognize the valuable contributions of our content reviewers and prioritize their health and safety.” The spokesperson continued, “While we value open internal communication, these discussions must be based on factual information. The majority of our 15,000 global content reviewers are currently working from home and will continue to do so throughout the pandemic. All reviewers have access to healthcare and confidential wellbeing resources from their initial day of employment, and Facebook has consistently surpassed health guidelines in ensuring the safety of its facilities for any required in-office work.”
Update 11/19: Facebook has clarified that it is “unable to direct certain highly sensitive and graphic content to outsourced reviewers working from home,” as stated by its VP of Integrity, Guy Rosen, during a press conference. “This type of content is extremely sensitive and not appropriate for review in a home environment with family members present.”
In their letter, the moderators contend that Facebook’s automated systems are not yet capable of moderating content effectively, and that human moderators remain essential to the platform’s functioning.
“Without our efforts, Facebook would be unusable,” the moderators wrote. “Its entire operation would be compromised. Your algorithms are unable to recognize satire. They cannot distinguish legitimate journalism from disinformation. They cannot react quickly enough to instances of self-harm or child abuse. We are capable of doing so.”
The group represents content moderators across the U.S. and Europe and is supported by the legal advocacy organization Foxglove, which said on Twitter that this is the “largest collective international effort by Facebook content moderators to date.”
This article has been updated to note that full-time Facebook employees have joined the content moderators in calling for these changes.