Pay-to-Play: Meta's Community (double) Standards on Pornographic Ads

Our investigation reveals that Meta approves and distributes pornographic ads while removing identical content from regular user posts

08-01-2025

Project overview


Our investigation exposes a significant double standard in Meta's content moderation practices. Despite Meta's strict Community Standards prohibiting adult content, we identified over 3,000 pornographic ads that were approved and distributed through Meta's advertising system in the past year.

Key findings:

  • Meta's algorithms successfully identify and remove sexually explicit content when posted from regular user accounts, yet approve and distribute identical content when it is submitted through its advertising system
  • These pornographic ads generated over 8 million impressions in the EU last year
  • Many ads featured AI-generated pornographic media, including images, video, and audio
  • Some ads contained celebrity deepfakes, including deepfakes of French actor Vincent Cassel
  • Ads promoted dubious sexual enhancement products and hook-up websites

Evidence suggests that this isn't a temporary bug: Meta has the technology to detect pornographic content but applies it selectively, exempting paid advertisers from the rules that regular users must follow.
