Meta acknowledges moderation failures and frequent errors in content and account removals

The news: Meta said it is over-enforcing its content moderation policies, obstructing free speech in the process.

  • Innocent or innocuous posts on its platforms, including Facebook, Threads, and Instagram, are being erroneously removed, the company acknowledged.
  • “When enforcing our policies, our error rates are too high, … harmless content gets taken down or restricted, and too many people get penalized unfairly,” Meta wrote in a blog post.

Meta’s concession comes after Threads and Instagram users’ accounts were wrongly deactivated or restricted, either for sharing content on controversial topics or after being mistakenly flagged as falling under the platforms’ minimum age requirement.

Deepfake threats: Meta, TikTok, Snap, X, and other major social platforms geared up for an influx of AI-generated misinformation ahead of the US election, but that threat may not have affected Meta users as deeply as was feared.

  • Meta stated that during this year’s election season, AI-generated content accounted for less than 1% of misinformation related to politics, elections, and social issues.
  • It added that the 20 covert influence networks it took down globally had struggled to build audiences online.

What’s next? Nick Clegg, Meta’s president of global affairs, described the company’s policies as a “living, breathing document” but didn’t clarify how content moderation operations will change going forward.

  • Meta is unlikely to entirely revamp its policies but will instead find ways to fine-tune how it manages posts and user security.
  • Supplementing automated moderation with more human moderators could reduce error rates but would also be costly.

Political influence: Meta’s statements come a month before the US presidential transition. President-elect Donald Trump has repeatedly criticized moderation policies on social media platforms.

  • On Monday, Brendan Carr, Trump’s pick to run the Federal Communications Commission (FCC), said he will “smash the censorship cartel” of social media platforms.
  • Meta’s renewed interest in refining content moderation practices could be an attempt to align the company with the incoming administration.

Our take: Meta’s incremental adjustments to improve moderation could ease tension with frustrated users, and its approach may reflect the challenge of balancing automation and accountability.

This article is part of EMARKETER’s client-only subscription Briefings—daily newsletters authored by industry analysts who are experts in marketing, advertising, media, and tech trends. To help you finish 2024 strong, and start 2025 off on the right foot, articles like this one—delivering the latest news and insights—are completely free through January 31, 2025. If you want to learn how to get insights like these delivered to your inbox every day, and get access to our data-driven forecasts, reports, and industry benchmarks, schedule a demo with our sales team.