TikTok took down more than 49 million videos from users across the globe during the second half of 2019, the company revealed in a transparency report released this morning.
The videos represented less than 1 percent of all videos uploaded to TikTok during that time period. TikTok says they were pulled for violations of either the app’s community guidelines or terms of service. For December, the company says a quarter of the removals were for “adult nudity and sexual activities,” and another quarter were for “depicting harmful, dangerous, or illegal behavior by minors,” such as drug use. Harassment and hate speech made up a small share of takedowns, at just 3 percent and 1 percent of removed videos, respectively.
More than 16 million of the pulled videos came from users in India. The second biggest market for pulled videos was the United States, with 4.6 million videos pulled.
These are huge numbers compared to YouTube, which reported removing around 14.7 million videos over the same period in 2019. The United States and India were also YouTube’s top markets for video removals.
TikTok’s video removals by and large do not stem from government requests or copyright complaints. TikTok says it received only around 1,300 copyright removal requests and 45 government takedown requests (mostly from India), and it did not comply with all of those requests. As in the first half of 2019, the report indicates that TikTok did not receive any takedown requests or user information requests from China, where its parent company, ByteDance, is based.
“We do not and have not removed any content at the request of the Chinese government, and would not do so if asked,” a TikTok spokesperson said. The spokesperson also said TikTok has “never provided user data to the Chinese government, nor would we do so if asked.”
The details come from TikTok’s second transparency report, which covers July 1st through December 31st, 2019. The company released its first transparency report, covering the first half of 2019, in late December.
TikTok has been under intense scrutiny from US politicians concerned about the immense popularity of an app owned by a Chinese company. Earlier this week, Secretary of State Mike Pompeo even floated the idea of banning TikTok over concerns that it could funnel users’ private information to the Chinese government. TikTok has denied ever providing information to China.
The transparency reports are in part meant to help build trust in TikTok. This report is the first to expand on TikTok’s moderation policies, and TikTok says it’ll offer more detailed information in the future. Late last year, the company switched to a “new content moderation infrastructure” that catalogs specific reasons for video removals, like those shared for December.
Moderation across social platforms has been a huge area of concern in recent years, as platforms like Facebook and Twitter have struggled to deal with hate speech, harassment, and misinformation. TikTok’s transparency report suggests that TikTok is mostly dealing with other issues — or, at least, that other issues like nudity are the bulk of what it’s spotting — and that it’s generally catching problematic videos before users flag them. The company says it automatically caught more than 98 percent of removed videos without them being reported, and close to 90 percent of those videos received no views.