In its Q2 Community Guidelines Enforcement Report, TikTok
has shared data on the number of videos and accounts it
removed or took action against in the second quarter of this year.
The number of videos removed from TikTok for policy violations has risen steadily. The company revealed that it removed over 113 million videos between April and June this year, an 11% increase over the previous quarter.
The company highlighted its improvement in proactive removal,
meaning problematic content is taken down before anyone views it. Proactive
removals rose from 83.6% in Q1 to 89.1% in Q2, while removals
within 24 hours improved from 71.9% to 83.9%.
More specifically, nudity and sexual activity involving minors was a major driver of the rise in video removals. What is concerning, however, is that TikTok arguably incentivizes such content even as the company claims to protect minors.
The spread of fake accounts is the primary reason TikTok
removed 62% more accounts from its platform compared to Q1. This is,
of course, tied to the app's rising popularity: the more TikTok accounts
are created, the greater the number of fake accounts among them.
In light of these statistics, TikTok has acknowledged
the importance of implementing better systems, including technology-based
flagging, moderation, and fact-checking. The company claims to have “more than a
dozen fact-checking partners around the world that review content in over 30
languages.” Its new proactive fact-checking program is designed to flag
new and evolving claims spreading across the internet, thereby speeding up
action against violations. “Since starting this program last quarter, we
identified 33 new misinformation claims, resulting in the removal of 58,000
videos from the platform,” says TikTok.