TikTok expands community guidelines, rolls out new ‘well-being’ features

TikTok announced a revised set of Community Guidelines today, designed to reinforce its current policies concerning issues such as harassment, risky activities, self-injury, and violent content. Alongside these updates, the platform is introducing four new features specifically intended to support the well-being of its user base. These additions include expanded resources for individuals experiencing challenges with self-harm or suicidal thoughts, optional viewing screens to shield users from potentially disturbing content, a text-to-speech function to enhance accessibility, and a broadened collection of information related to COVID-19.
The company explained that while many of these subjects were already addressed in TikTok’s Community Guidelines, the current changes provide greater detail in each area, informed by observed platform activity, user feedback, and insights from experts including academics, civil society groups, and TikTok’s Content Advisory Council.
Updates to the guidelines regarding suicide and self-harm now reflect input and terminology recommended by mental health professionals, aiming to avoid the normalization of self-injurious actions. In addition, the policy concerning content related to eating disorders has been refined to prohibit the promotion or glorification of dangerous weight loss practices.
Strengthened policies addressing bullying and harassment now offer a more comprehensive description of unacceptable content and behaviors on TikTok, encompassing doxxing, cyberstalking, and an expanded definition of sexual harassment. This is particularly relevant considering instances where TikTok users have identified the workplaces of individuals expressing views on controversial topics – in some cases leading to employment consequences. The approach TikTok will take to address such “doxxing” instances remains to be seen, as some cases have not involved the publication of private home addresses, but rather the notification of an employer.

The guidelines concerning dangerous acts have also been expanded to more clearly define, label, or remove content that depicts hazardous activities and challenges. A new “harmful activities” section within the minor safety policy reiterates the prohibition of content that encourages dangerous dares, games, or actions that could compromise the safety of young people.
TikTok has also revised its policy on dangerous individuals and organizations to concentrate on the issue of violent extremism. The updated guidelines provide a more detailed explanation of what constitutes a threat or incitement to violence, and the types of content that will be prohibited. This update is particularly timely, given recent expressions of support for violence or civil unrest by some individuals following the U.S. presidential election.
Regarding new features, TikTok collaborated with behavioral psychologists and suicide prevention specialists – including Providence, Samaritans of Singapore, and members of its U.S. Content Advisory Council – to develop new resources offering evidence-based support for users searching for information related to self-harm. These resources will be displayed when users search for terms such as “selfharm” or “hatemyself.” Continued access to the National Suicide Prevention Lifeline and Crisis Text Line will remain available for immediate assistance.
TikTok will also introduce optional viewing screens that will appear before videos containing content some may find graphic or upsetting. While such videos are already excluded from the For You feed, they may not necessarily be removed entirely. These screens might cover depictions of violence or conflict presented for documentary purposes, natural animal behavior that some may find disturbing, or other potentially frightening content like horror film clips.
When disturbing content is identified through user reports, TikTok will apply these screens to the videos, giving users the option to “skip video” or “watch anyway.”
Furthermore, a new text-to-speech feature has been added to improve accessibility, allowing users to convert typed text into spoken audio within their videos. This follows the recent introduction of a feature designed to support individuals with photosensitive epilepsy.
TikTok is also incorporating questions and answers about COVID-19 vaccines into its in-app coronavirus resource hub. This information, provided by public health authorities such as the Centers for Disease Control and Prevention (CDC), will be accessible from the Discover page, search results, and banners on videos related to COVID-19 and vaccines. The company reports that its COVID-19 hub has already received over 2 billion views in the past six months. TikTok is also partnering with Team Halo, enabling scientists worldwide to share updates on vaccine progress through video content.

TikTok has demonstrated a proactive approach to content moderation on its platform. Users frequently report videos being removed for violating platform policies, and instances of users re-uploading deleted content to respond to it are common. The platform also swiftly addressed the spread of misinformation surrounding the recent U.S. election by blocking relevant hashtags like #RiggedElection and #SharpieGate.
The newly released policies also address more recent user behaviors, such as calls for violence following the election.
“Keeping our community safe is a commitment with no finish line,” TikTok stated in its announcement regarding these updates. “We recognize the responsibility we have to our users to be nimble in our detection and response when new kinds of content and behaviors emerge. To that end, we’ll keep advancing our policies, developing technology to automatically detect violative content, building features that help people manage their TikTok presence and content choices, and empowering our community to help us foster a trustworthy environment. Ultimately, we hope these updates enable people to have a positive and meaningful TikTok experience,” the company concluded.