The video-sharing application TikTok recently announced in its transparency update that it removed a large number of videos between April and September 2021. More than 81.5 million videos violated the platform's community guidelines, prompting TikTok to remove them.
TikTok introduced a new moderation system under which videos showing nudity, sexual violence, illegal activities, or content endangering minor safety are automatically removed from the app.
According to the reports, around 41% of the deleted videos contained content violating minor-safety policies. The automated system also mistakenly removed videos that were entirely within the guidelines; such videos were later reinstated. About 5% of the wrongly removed videos came from profiles using hashtags such as ‘Black Lives Matter’ or ‘I am a black man’. Meanwhile, profiles that actually fell within the restricted-content criteria, for instance those supporting white supremacy, were left uncensored.