TikTok has announced it will introduce a set of new updates to its community guidelines to promote the safety, security, and integrity of its platform.
The new updates will be implemented over the next few weeks and will target safety concerns around dangerous acts, eating disorders, harmful ideologies, and overall platform security.
These upcoming changes were prompted by community feedback, recommendations from TikTok's Safety Advisory Council, and advice from experts in areas such as digital safety and security, content moderation, health and wellbeing, and adolescent development.
The first of the major updates applies to its 'dangerous acts and challenges' policy, which aims to prevent the spread of suicide hoaxes, an issue that previously sat within the platform's suicide and self-harm policies.
Additionally, the social media platform is working with experts to launch new videos from various creators that call on the community to follow four steps when assessing content online: stop, think, decide, and act.
Community members can view these videos at the #SaferTogether hub on the Discover page over the next week.
TikTok will also extend its policies around eating disorders: in addition to removing content that promotes eating disorders, the platform will now remove content that promotes disordered eating, such as over-exercise and short-term fasting. It will also work with eating disorder experts, researchers, and physicians to address this.
Another update clarifies the types of hateful ideologies prohibited on the platform. In its announcement, TikTok cited 'deadnaming, misgendering, or misogyny as well as content that supports or promotes conversion therapy programs.'
Lastly, the platform will expand its policy to 'protect the security, integrity, availability, and reliability of our platform' by prohibiting unauthorised access to TikTok and banning the use of TikTok to perpetrate criminal activity.
The company released its newest Community Guidelines Enforcement Report Tuesday, covering the third quarter of 2021. The report revealed that more than 91 million videos were removed from the platform during the period, or roughly 1% of all videos uploaded.