Pay-to-Play: Meta's Community (Double) Standards on Pornographic Ads

Exposing Pornographic Ads on Meta Despite Content Moderation Claims

Project Overview

The Pay-to-Play: Meta’s Community (Double) Standards on Pornographic Ads report uncovers a glaring double standard in Meta’s content moderation practices. Despite the platform’s strict Community Standards, over 3,000 pornographic advertisements featuring explicit adult content were approved and distributed through Meta’s advertising system in the past year.

  • The same pornographic visuals are removed by Meta when uploaded from a user account: its algorithms can identify sexually explicit content and take it down, yet identical content is approved and actively distributed to millions of users when it goes through the advertising system.
  • The ads promoted dubious sexual enhancement products and hook-up websites, and were illustrated with AI-generated media such as audio, images, and video, including a celebrity deepfake of French actor Vincent Cassel.
  • The ads generated over 8 million impressions in the European Union last year, targeting a predominantly male demographic aged 44 and over, including with incestuous pornographic WhatsApp conversations and images.

Our findings suggest that although Meta has the technology to automatically detect pornographic content, it does not use it to enforce its Community Standards on advertisements as it does for non-sponsored content. This double standard is not a temporary bug: it has persisted since at least December 2023. This contradicts the statements made in the Instagram and Facebook Risk Assessment Reports mandated by the DSA, which claim a “proactive review” of advertisements to enforce the platform’s Advertising Standards, a review that in practice largely fails to take place.