
The Tech Oversight Project Issues Statement on Meta’s Cover-Up Exposed by Whistleblower Testimony


Nov 07, 2023

Read the Wall Street Journal story here.

WASHINGTON, DC – Today, the Tech Oversight Project issued the following statement after explosive and damning testimony from former Meta employee Arturo Béjar. Last week, Béjar came forward with a disturbing new whistleblower report revealing that Mark Zuckerberg and Meta’s executive team engaged in a cover-up after being presented with evidence of egregious and obvious harm to minors. The warnings showed that children were widely exposed to sexual harassment from adults; in response, Meta made it more difficult for minors to report abuse and developed its own deceptive metrics to obscure the real scope of the harm young people were experiencing.

“Today’s brave and moving testimony from Arturo Béjar proves Meta executives have engaged in a massive cover-up that traps kids in a torturous hell of exploitation, harassment, bullying, and shame. And the rot goes all the way to the top,” said Sacha Haworth, Executive Director of the Tech Oversight Project. “Mark Zuckerberg, Sheryl Sandberg, and Adam Mosseri have blood on their hands because they decided to prioritize advertising revenue over the lives of their youngest users. This cold-blooded greed is why we cannot trust social media companies to design safe platforms and protect our children and teens. We need to force them to by passing legislation like the Kids Online Safety Act. The time has come for the Senate to bring this bipartisan bill to the floor for a vote.”

Highlights from the Whistleblower Report:

  • In fall 2021, Meta consultant Arturo Béjar sent an email to Meta CEO Mark Zuckerberg, Chief Operating Officer Sheryl Sandberg, Chief Product Officer Chris Cox, and Instagram head Adam Mosseri detailing the harms, including rampant sexual harassment, that young users (1 in 8 on Instagram) were experiencing on Meta’s platforms. [View the email here.]
  • Internal documents from 2019 show that Meta added extra steps to the reporting process – a dark pattern – to discourage users from filing reports of abuse such as sexual harassment. [View the document here.]
  • The story details that Meta then shifted resources away from monitoring and reviewing reports of unacceptable content such as pornography, terrorism, bullying, and excessive gore, relying instead mostly on machine-learning models.
  • The shift to machine learning as the primary means of screening made the problem worse. A data scientist warned Guy Rosen, Facebook’s head of integrity at the time, that Meta’s classifiers were reliable enough to remove only a low single-digit percentage of hate speech with any degree of precision.
  • Meta’s rules didn’t clearly prohibit adults from sexually harassing minors – such as flooding the comments section of a teenager’s posts with kiss emojis, or posting pictures of kids in their underwear and inviting followers to “see more” in a private Facebook Messenger group.
  • To show Zuckerberg and Meta’s leadership the flaws in the company’s reliance on machine-learning models and internal “prevalence” statistics, Béjar and members of the Well-Being team built a new system called “Bad Emotional Experience Feedback.”
  • The effort identified problems with prevalence from the start: users were 100 times more likely to tell Instagram they’d witnessed bullying in the last week than Meta’s bullying-prevalence statistics suggested they would be.
  • To address the negative experiences of Instagram’s young users, team members proposed capping the amount of beauty- and fashion-influencer content, reconsidering AI-generated “beauty filters,” and building better ways for users to report unwanted contact – all of which fell on deaf ears.
  • Instead, following Frances Haugen’s whistleblower report, Meta formalized new rules for employees’ internal communications. Among the mandates for achieving what the company called “Narrative Excellence” were keeping research data tight and never asserting a moral or legal duty to fix a problem.
  • After leaving Meta, Béjar began consulting with a coalition of state attorneys general who sued the company late last month, alleging that it had built its products to maximize engagement at the expense of young users’ physical and mental health.

Additional Background:
