TikTok took down more than 25.4 million videos in Pakistan between April and June 2025 for violating its Community Guidelines, according to the platform’s latest transparency report. The move reflects the company’s ongoing efforts to make the app safer and to curb the spread of harmful or misleading content.
What’s striking is that 99.7% of these videos were identified and removed proactively, before any user even reported them. Thanks to TikTok’s advanced detection systems, most of the content was taken down within 24 hours of being uploaded, showing how fast the platform now acts against rule-breaking material.
On a global scale, TikTok removed nearly 190 million videos during the same three-month period, a clear indication of how seriously the company is treating content moderation across the world.
The report highlights misinformation as the leading reason for removals, accounting for about 45% of the total. This includes everything from false news stories to misleading health-related claims. Next came sensitive and harmful themes, such as hate speech and graphic visuals, making up around 30.6%, while safety-related concerns, including content that promotes self-harm or dangerous challenges, represented another 14%.
In addition to content removals, TikTok also deleted over 102 million fake or underage accounts, reinforcing its efforts to maintain authenticity and comply with age regulations.
These large-scale actions underline TikTok’s push toward building a safer and more trustworthy online community. While no platform can completely eliminate harmful content, the speed and scope of these measures suggest that TikTok is making steady progress.
For both creators and everyday users, it ultimately means a cleaner, more positive environment, one where genuine creativity can thrive without being overshadowed by misleading or unsafe material.