TikTok is becoming more transparent with its users by telling them the reason behind video removals on the platform.
When a video is removed, its creator will be notified which of TikTok’s policies they violated and given the opportunity to appeal the decision. Any content flagged as self-harm- or suicide-related will also trigger a second notification giving the creator access to expert resources.
TikTok has been experimenting with the new notification system for the “past few months” and says it has helped reduce repeat policy infringements and almost tripled visits to its community guidelines. The ByteDance-owned app has also seen a 14 per cent reduction in appeal requests.
“Being transparent with our community is key to continuing to earn and maintain trust,” said TikTok in a blog post. “We’re glad to be able to bring this new notification system to all our users, and we’ll keep working to improve the ways we help our community understand our policies as we continue to build a safe and supportive platform.”