Scrolling through TikTok’s endless stream of videos, users rarely notice the content that never reaches their screens. In Kenya, more than 334,000 videos were removed between July and September 2024 for violating the platform’s Community Guidelines.

According to TikTok’s latest Community Guidelines Enforcement Report, 88.9 per cent of these videos were taken down before they received a single view, and 93 per cent were removed within 24 hours of being posted.

A major focus was on Integrity & Authenticity, a category designed to combat misinformation and fraud. TikTok removed 99.7 per cent of violating videos in this category before users reported them, reinforcing its efforts to curb misleading content.

Mental health protection was also a priority, with 99.9 per cent of harmful posts under Mental & Behavioural Health removed proactively and 96.4 per cent deleted within a day.

The same vigilance applied to Youth Safety & Well-Being, where 99.7 per cent of policy-violating content was taken down before any views.

Additionally, Sensitive & Mature Themes were strictly monitored, with 99.5 per cent of inappropriate videos erased before being reported and 95.8 per cent removed within 24 hours.

Beyond Kenya, TikTok’s global enforcement was just as stringent.

The platform deleted over 147 million videos in the same period, with 118 million automatically flagged by its systems.

This pushed its proactive detection rate to 98.2 per cent, reflecting its heavy investment in AI-driven content moderation.

TikTok continues to adjust its enforcement strategies to suit Kenya’s digital landscape, reinforcing its safety measures through advanced technology, trust and safety teams, and local partnerships.

As the platform expands, it remains focused on ensuring that users can create, share, and interact in an environment free from harmful content.