TikTok removed over 450,000 videos and banned 43,000 accounts in Kenya in Q1 2025.
TikTok removed more than 450,000 videos and banned over 43,000 accounts in Kenya between January and March 2025 as part of its ongoing efforts to keep the platform safe.
According to its latest Community Guidelines Enforcement Report, the short-form video platform says 92.1% of the removed videos were pulled before anyone viewed them. Additionally, 94.3% were deleted within 24 hours of being posted.
“By integrating advanced automated moderation technologies with the expertise of thousands of trust and safety professionals, TikTok enables faster and consistent removal of content that violates our Community Guidelines,” the company said.
Globally, TikTok has achieved a 99% proactive detection rate, meaning most harmful content is flagged and removed before users see it. This includes content related to hate speech, misinformation, and abuse.
The platform also revealed that over 19 million LIVE rooms were shut down globally during the same period, a 50% increase over the previous quarter.
TikTok has also tightened its LIVE Monetisation Guidelines to clarify which content is eligible to earn revenue, aiming to prevent abuse of the system.
Beyond moderation, TikTok has taken steps to support users’ mental health in Kenya. The app now offers direct in-app access to Childline Kenya’s helpline, allowing users to report sensitive issues such as suicide, harassment, and self-harm.
The company has also teamed up with Mental360 to develop locally relevant mental health content based on evidence and research.
“This comes at a critical time in Kenya, where there is a growing need to bring mental health resources closer to those who need it the most, especially online,” TikTok said.





