In Expanded Crackdown, Facebook Increases Penalties for Rule-Breaking Groups and Their Members

Facebook announced this morning that it is escalating penalties for Facebook Groups, and for group members, that break its rules. The changes are intended to reduce the visibility of potentially harmful content coming from groups.
The company will now stop recommending civic and political groups to users outside the United States, and it will limit the reach of groups, and of individual members, that repeatedly violate its guidelines.
A Gradual Crackdown on Harmful Content
These adjustments represent a continuation of Facebook’s ongoing, though often slow and inconsistent, efforts to combat harmful, divisive, or dangerous content shared within Facebook Groups.
Ahead of the U.S. elections, Facebook introduced new rules to penalize those who violated its Community Standards or spread misinformation via Groups. The rules shifted more responsibility to group admins and imposed consequences on individual rule-breakers.
Expanding Restrictions and Policy Evolution
Facebook also ceased recommending health groups, directing users to authoritative sources for health information, including updates regarding Covid-19. In January, a more substantial action was taken against potentially dangerous groups.
The platform announced the removal of civic and political groups, as well as newly established groups, from recommendations within the U.S., following the events at the U.S. Capitol on January 6, 2021. Previously, these groups had experienced temporary limitations leading up to the elections.
Internal Research Highlighted Polarization
As The Wall Street Journal reported, Facebook’s internal research found that U.S. Facebook groups were polarizing users and fueling calls for violence after the elections. Approximately 70% of the 100 most active civic groups in the U.S. had problems with hate speech, misinformation, bullying, and harassment that made them unsuitable for recommendation.
This policy, initially implemented in the U.S., is now being extended to Facebook’s global user base.
Impact on Group Discovery
Consequently, users worldwide will no longer see recommendations for civic or political groups while browsing Facebook. Recommendations, however, are only one of the ways people discover Facebook Groups.
Users can still locate groups through search, shared links, invitations, and private messages from friends.
Downranking and Increased Enforcement
Groups with a history of violating Facebook’s rules will now appear lower in recommendations, a demotion penalty the company also uses to reduce the visibility of rule-breaking News Feed content. Facebook will also step up penalties against rule-violating groups and their members through a range of enforcement measures.
Warning Messages and Content Visibility
For instance, users attempting to join groups with a history of Community Standards violations will receive a warning message detailing the group’s infractions, potentially discouraging them from joining.
Rule-violating groups will experience limitations on invite notifications, and existing members will see reduced visibility of the group’s content in their News Feeds. These groups will also be demoted in Facebook’s recommendations.
Temporary Post Approval and Group Shutdowns
Groups with a significant number of members who have violated Facebook’s policies, or who belonged to other groups that were shut down for violations, will be required to have admins and moderators temporarily approve all posts. If those admins or moderators repeatedly approve rule-breaking content, Facebook will remove the entire group.
This measure addresses the issue of groups reforming after being banned, only to resume their problematic behavior.
Penalties for Repeat Offenders
The final change announced today concerns individual group members.
Users with repeated violations within Facebook Groups will be temporarily prohibited from posting or commenting in any group, inviting others to join groups, or creating new groups. Facebook states this aims to curtail the reach of malicious actors.
Transparency and Addressing Bias Concerns
The new policies give Facebook a way to document a group’s problematic behavior before shutting it down. That “paper trail” also helps the company counter the accusations of biased enforcement it frequently faces from critics.
Limitations and Ongoing Challenges
However, many of these measures amount to the lighter penalties users colloquially call “Facebook jail”: temporary restrictions on interactions or features, now adapted for Facebook Groups and their members.
Further challenges remain. Effective enforcement of these rules is uncertain, and the policies do not address one of the primary methods of group discovery: search. While Facebook claims to downrank low-quality results in search, the effectiveness of these efforts is debatable.
Despite Facebook’s platform-wide ban on QAnon, QAnon-related content remains discoverable through search, often hiding in groups that do not explicitly identify as QAnon-affiliated.
Search Results and Misinformation
Searches for terms like “antivax” or “covid hoax” can also lead users to problematic groups, such as those claiming to be “not anti-vax in general, but just anti-RNA.”
These groups, while not official health resources, are easily accessible through Facebook search.
Potential for Stronger Technical Measures
Facebook possesses the technical capability to implement more robust content blocking measures. The platform banned “stop the steal” and related conspiracies following the U.S. elections, resulting in no search results for those terms.
The question arises: why does a search for a banned topic like “QAnon” yield any results at all?
Why should searches for “covid hoax” direct users to problematic groups?
Expanding Block Lists and Blocking Harmful URLs
Facebook could broaden its list of problematic search terms and return blank pages for other types of harmful content. It could also maintain a block list of URLs known to spread false information, preventing users from re-sharing posts containing those links. Posts could be defaulted to non-public, and repeat violators could have their posts permanently set to non-public.
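To make that suggestion concrete, here is a minimal sketch of how a URL block list could gate re-sharing, written in Python. The domain list, function names, and normalization logic are hypothetical illustrations, not Facebook’s actual implementation.

```python
from urllib.parse import urlparse

# Hypothetical block list of domains known to spread false information.
# In practice this would be a large, continuously updated dataset.
BLOCKED_DOMAINS = {
    "covid-hoax-news.example",
    "stopthesteal.example",
}

def normalize_domain(url: str) -> str:
    """Extract the host from a URL, lowercased, with any 'www.' prefix removed."""
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host

def can_share(post_urls: list[str]) -> bool:
    """Allow a share only if no URL in the post resolves to a blocked domain."""
    return not any(normalize_domain(u) in BLOCKED_DOMAINS for u in post_urls)

# A re-share containing a blocked link would be rejected before posting.
print(can_share(["https://www.covid-hoax-news.example/article"]))  # False
print(can_share(["https://example.org/ordinary-story"]))           # True
```

A real system would also need to handle link shorteners and redirects, which is part of why block lists alone are an incomplete defense.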
In essence, Facebook could take more decisive action to combat misinformation, toxicity, and harmful content. However, it continues to adopt incremental changes and temporary punishments, which may not be sufficient.
Sarah Perez
Sarah Perez has been a reporter at TechCrunch since August 2011. Before joining TechCrunch, she worked at ReadWriteWeb for more than three years, and before that she held I.T. positions in the banking, retail, and software industries.
To verify emails or other outreach from Sarah, or to contact her directly, email sarahp@techcrunch.com or message her on Signal at sarahperez.01.