Whistleblowers Reveal How Meta, TikTok Cut Safety to Chase Engagement


TL;DR

  • Whistleblower Allegations: More than a dozen former Meta and TikTok employees told the BBC that both companies deliberately weakened content moderation to chase engagement.
  • Safety Trade-offs: Meta assigned 700 staff to grow Reels while refusing key child-protection and election-integrity hires, according to the investigation.
  • Child Safety Failures: TikTok’s internal dashboards prioritised cases involving political figures over reports of harm affecting teenagers, former staff revealed.
  • Regulatory Response: The European Commission has opened proceedings against both companies over suspected breaches of Digital Services Act transparency rules.

More than a dozen former employees from Meta and TikTok told the BBC that both companies deliberately weakened content moderation to compete for users during the short-form video boom.

Internal documents show the platforms, which together reach more than three billion users, prioritised engagement metrics over protections against violence, sexual exploitation of minors, and extremist content. Meta denied the allegations. “Any suggestion that we deliberately amplify harmful content for financial gain is wrong,” a spokesperson said.

The investigation, presented in the BBC documentary “Inside the Rage Machine,” draws on testimony from current and former staff at both companies. Among its starkest revelations: TikTok’s internal moderation dashboards gave higher priority to a case in which a political figure was mocked by comparison to a chicken than to a report from a 16-year-old girl in Iraq about sexualised images of herself. Former employees say the disparity reflected deliberate policy choices, not oversight.

Meta’s Engagement Race

At Meta, competitive pressure to match TikTok drove decisions that insiders say directly eroded safety. The company launched Instagram Reels in 2020 as its answer to TikTok. According to the BBC investigation, Meta assigned 700 staff to grow Reels while refusing safety teams’ requests for just two specialist child-protection roles and 10 additional election-integrity positions.

Matt Motyl, a former senior Meta researcher who ran experiments on hundreds of millions of users between 2019 and 2023, warned that when safety goes wrong at a platform serving billions, the consequences are severe.