TikTok disclosed the scale of its content moderation in the third quarter of 2023, reporting that 136 million videos were removed for violating platform rules. The figures appeared in the company's quarterly transparency report, which details enforcement actions and policy-compliance outcomes. The 136 million removals represent just under 1 percent of all videos uploaded between July and September of that year, underscoring the platform's ongoing effort to police content and maintain safety standards for users. The total rose by roughly 30 percent from the previous quarter (April through June 2023), when 106 million clips were taken down for rule breaches.
The report identifies the third quarter of 2023 as a record quarter for removals in the platform's history, surpassing the previous peak in the second quarter of 2022, when just over 112 million videos were deleted. The new record reflects intensified moderation efforts and evolving enforcement thresholds across a broad spectrum of policy violations. The breakdown shows that sensitive and adult content accounted for the largest share of deletions, at 39.8 percent. Content promoting or depicting regulated products and services, including alcohol, tobacco, drugs, and gambling, was responsible for about 26.3 percent of removals. The remaining categories were violence and criminal activity (14 percent), mental-health-related content (10.8 percent), privacy violations (7.8 percent), and impersonation of other users (1.3 percent). These figures illustrate the breadth of content types the platform monitors and the varied grounds on which videos may be removed.
The report also notes that moderators restored just over 7 million videos that had been removed in initial reviews, reflecting a governance process that allows moderation decisions to be appealed and reconsidered. This restoration activity signals an emphasis on accuracy and fairness, with enforcement actions revisited and corrected where warranted. The overall picture in the quarterly data is one of rigorous enforcement paired with mechanisms for review.
Recent regulatory actions in some jurisdictions have also affected access to TikTok, illustrating how policy and governance dynamics can shape platform availability as part of broader digital-safety and information controls. The quarterly findings thus offer stakeholders a snapshot of how content standards are applied across a large global user base and how moderation policies evolve in response to emerging trends and concerns.