Bluesky’s 2024 Moderation Report: Growth and Increased Challenges
Bluesky released its annual moderation report on Friday, detailing the platform’s substantial user growth throughout 2024 and the corresponding impact on its Trust & Safety team. The report highlights a significant increase in the volume of user reports received.
Report Volume and Key Issues
The most frequent reports submitted by users concerned instances of harassment, trolling, and intolerance. These issues have presented ongoing challenges for Bluesky as its user base expands, occasionally sparking widespread debate regarding specific moderation choices.
The company declined to share what actions it took, or did not take, on individual user reports, including those concerning accounts that appear on widely shared block lists.
User Growth and Team Expansion
Bluesky experienced considerable growth in 2024, adding over 23 million new users. This influx was driven by various factors, including an exodus from X (formerly Twitter).
Several changes at X, such as alterations to its blocking functionality and the utilization of user data for AI training, prompted users to seek alternative platforms. Political shifts and the preferences of X’s owner, Elon Musk, also contributed to user departures. A temporary ban of X in Brazil during September further boosted Bluesky’s user numbers.
To address the increased demands, Bluesky expanded its moderation team to approximately 100 moderators and continues active recruitment. Recognizing the emotional toll of reviewing graphic content, the company also began offering psychological support to its team members.
Report Statistics and Processing Improvements
In 2024, Bluesky received a total of 6.48 million moderation reports, representing a 17-fold increase compared to the 358,000 reports received in 2023.
Beginning this year, Bluesky will accept moderation reports directly in the app, giving users greater transparency into the status of their reports and making updates easier to track. The platform also plans to bring its appeals process in-app.
A surge in users from Brazil in August resulted in a peak of 50,000 daily reports. This created a processing backlog, prompting Bluesky to hire additional Portuguese-speaking staff, including through a contract vendor.
Automation and Efficiency
Bluesky implemented automation for several report categories, beyond just spam, to manage the increased volume. While this initially resulted in some false positives, it significantly reduced processing times for “high-certainty” accounts to mere “seconds.” Previously, reports typically took around 40 minutes to process.
Human moderators continue to review false positives and handle appeals, ensuring a balance between automation and human oversight.
User Reporting Behavior
In 2024, 4.57% of active Bluesky users (1.19 million) submitted at least one moderation report, a decrease from 5.6% in 2023. The majority of these reports – 3.5 million – targeted individual posts.
Account profiles were reported 47,000 times, often due to inappropriate profile pictures or banner images. Lists received 45,000 reports, while Direct Messages (DMs) were reported 17,700 times. Feeds and Starter Packs garnered 5,300 and 1,900 reports, respectively.
Reports predominantly centered on anti-social conduct, such as trolling and harassment, indicating a user desire for a less toxic social environment compared to X.
Report Category Breakdown
Bluesky categorized reports as follows:
- Misleading content (impersonation, misinformation, or false claims): 1.20 million
- Spam (excessive mentions, replies, or repetitive content): 1.40 million
- Unwanted sexual content (nudity or adult content not properly labeled): 630,000
- Illegal or urgent issues (clear violations of the law or Bluesky’s terms of service): 933,000
- Other (issues not fitting the above categories): 726,000
Labeling and Appeals
Human labelers applied 55,422 “sexual figure” labels, 22,412 “rude” labels, 13,201 “spam” labels, 11,341 “intolerant” labels, and 3,046 “threat” labels to posts and accounts.
A total of 205,000 appeals were submitted by 93,076 users regarding Bluesky’s moderation decisions in 2024.
Account Takedowns and Legal Requests
Moderators initiated 66,308 account takedowns, while automated systems removed 35,842 accounts. Bluesky received 238 requests from law enforcement, governments, and law firms, responding to 182 and complying with 146. The majority of these requests originated from Germany, the U.S., Brazil, and Japan.
The report also covered trademark, copyright, and child safety concerns. Bluesky submitted 1,154 confirmed CSAM (Child Sexual Abuse Material) reports to the National Center for Missing & Exploited Children (NCMEC).