
RESOURCE GUIDE: What to Know: New Facebook Whistleblower Testifies Before Senate Judiciary Committee


Nov 07, 2023

Read the Wall Street Journal story here.

Watch the Senate Judiciary Committee hearing here.


WASHINGTON, DC – In advance of today’s Senate Judiciary Committee hearing featuring the new Facebook whistleblower, The Tech Oversight Project released the following resource guide for lawmakers, reporters, and allies tracking the event. Late last week, the Wall Street Journal reported on a disturbing new whistleblower account revealing that, despite employee warnings, Mark Zuckerberg and Meta’s executive team ignored egregious and obvious harm to minors. The warnings showed that children were exposed to widespread sexual harassment from adults; rather than address the problem, Meta made it more difficult for minors to report abuse and developed its own deceptive metrics to obscure the real scope of the harm young people were experiencing.

“Everything in the new whistleblower report should make your stomach turn and make you mad as hell. These new revelations underscore Meta’s blatant disregard for the health and well-being of children who use their platforms, and they show that Mark Zuckerberg and Meta’s senior executive team were warned about the real-world harm kids were experiencing and let their cries for help fall on deaf ears,” said Sacha Haworth, Executive Director of the Tech Oversight Project. “It’s a political reality that Congress often needs a galvanizing moment fueled by public outrage to jolt lawmakers, break through the logjam, and push legislation over the finish line – even if it has broad bipartisan support and 50 cosponsors. That moment is now, and that bill is the Kids Online Safety Act. It’s been over 90 days since KOSA sailed out of committee unanimously, and it’s time to bring the bill to the floor and finally protect children online.”

The resource guide below includes highlights from the whistleblower report, additional background, and talking points.

Highlights from the Whistleblower Report:

  • In fall 2021, Meta consultant Arturo Bejar sent an email to Meta CEO Mark Zuckerberg, Chief Operating Officer Sheryl Sandberg, Chief Product Officer Chris Cox, and Instagram head Adam Mosseri detailing the harms, including rampant sexual harassment, that young users (1 in 8 on Instagram) were experiencing on Meta’s platforms. [View the email here.]
  • Internal documents from 2019 show that Meta added extra steps to the reporting process – a dark pattern – to discourage users from filing reports of harms like sexual harassment. [View the document here.]
  • The story details how Meta then shifted resources away from human monitoring and review of reports of unacceptable content like pornography, terrorism, bullying, and excessive gore, relying instead mostly on machine-learning models.
  • The shift to machine-learning screening made the problem worse. A data scientist warned Guy Rosen, Facebook’s head of integrity at the time, that Meta’s classifiers were reliable enough to remove only a low single-digit percentage of hate speech with any degree of precision.
  • Meta’s rules didn’t clearly prohibit adults from engaging in sexual harassment of minors, such as flooding the comments on a teenager’s posts with kiss emojis or posting pictures of kids in their underwear and inviting followers to “see more” in a private Facebook Messenger group.
  • To prove to Zuckerberg and Meta’s leadership that the company’s reliance on machine-learning models and internal “prevalence” statistics obscured the harm users were experiencing, Bejar and members of the Well-Being team built a new system called “Bad Emotional Experience Feedback.”
  • The effort identified problems with the prevalence metrics from the start: users were 100 times more likely to tell Instagram they’d witnessed bullying in the last week than Meta’s bullying-prevalence statistics indicated they should be.
  • To address the negative experiences of Instagram’s young users, team members proposed capping the amount of beauty- and fashion-influencer content, reconsidering AI-generated “beauty filters,” and building better ways for users to report unwanted contact – all of which fell on deaf ears.
  • Instead, following Frances Haugen’s whistleblower disclosures, Meta formalized new rules for employees’ internal communication. Among the mandates for achieving what the company called “Narrative Excellence” were keeping research data tight and never asserting a moral or legal duty to fix a problem.
  • After leaving Meta, Bejar began consulting with a coalition of state attorneys general who filed suit against the company late last month, alleging that it had built its products to maximize engagement at the expense of young users’ physical and mental health.

Additional Background:

Talking Points for Allies and Partners:

  • Mark Zuckerberg and Meta were repeatedly warned that children were in harm’s way. Instead of protecting young people, they made it harder to report sexual harassment and unwanted contact.
  • While Meta says it supports age-appropriate design bills, nothing could be further from the truth.
  • Meta repeatedly ignored warnings from employees and internal data, and it crusaded against legislation that would protect kids online, like KOSA and Age-Appropriate Design Code model bills in states across the country.
  • Social media platforms will never protect children and teens unless we pass legislation forcing them to, which is why Congress needs to step up.
  • KOSA has nearly 50 co-sponsors, broad bipartisan support, and the backing of President Biden, and it sailed out of committee unanimously.
  • The Senate needs to act: bring KOSA to the Senate floor for a vote and finally force social media companies to prioritize kids’ health and well-being over corporate profit.