Meta accused of burying internal research showing Facebook, Instagram harm users’ mental health

Meta shut down internal research after finding causal evidence that Facebook and Instagram negatively affected users’ mental health, according to newly unredacted court filings in a major class-action lawsuit brought by U.S. school districts.

The filings describe a 2020 Meta research initiative called “Project Mercury,” in which company scientists — working with survey firm Nielsen — studied the effects of users deactivating Facebook and Instagram.

According to internal documents revealed in discovery:

  • Users who stopped using Facebook for one week reported lower depression, anxiety, loneliness, and social comparison.
  • Meta executives were disappointed by the results and deemed them tainted by “existing media narratives.”
  • The company stopped the research instead of publishing the findings or conducting further studies.

Privately, however, Meta’s internal researchers acknowledged to then–global public policy head Nick Clegg that the findings were valid.

One staffer wrote that the data “does show causal impact on social comparison,” adding a sad emoji. Another compared withholding the research to the tobacco industry hiding evidence of harm.

Despite these findings, Meta told Congress on multiple occasions that it had no way to quantify mental health harm — especially for teenage girls.

Meta Responds

Meta spokesperson Andy Stone said the study was halted due to flawed methodology, adding that the company has spent over a decade improving safety features.

“The full record will show that… we have listened to parents, researched issues that matter most, and made real changes to protect teens,” Stone said.

Meta has also moved to strike the unredacted documents, arguing that plaintiffs are seeking too broad a disclosure.

Broader Allegations Against Meta, Google, TikTok, and Snapchat

The claims appear in a comprehensive filing submitted Friday by the law firm Motley Rice, which represents school districts nationwide. The lawsuit alleges that major social media platforms:

  • Intentionally hid known risks of their products.
  • Failed to protect minors from sexual predators and harmful content.
  • Encouraged under-13 use of their platforms.
  • Sought to expand teen engagement, even during school hours.

Other platforms offered limited or no comment.

Among the more striking allegations:

Meta-specific claims from internal documents:

  • Safety features for youth were designed to be ineffective or rarely used.
  • Meta blocked A/B testing of safety tools that might hurt growth metrics.
  • Meta allegedly allowed users to be caught attempting human trafficking 17 times before their accounts were removed, a threshold staff called “very, very, very high.”
  • Meta knew its engagement-boosting algorithms exposed teens to more harmful content.
  • Efforts to curb child predation were delayed due to growth concerns.
  • Mark Zuckerberg allegedly said in 2021 he would not describe child safety as his “top concern,” listing the metaverse as a higher priority.
  • Requests from Clegg for more funding for child safety were allegedly ignored or dismissed.

Stone strongly denied these claims, saying Meta removes sex-trafficking accounts immediately and that its teen safety measures are “broadly effective.”

TikTok’s PTA Influence Claim

One section of the filing alleges TikTok tried to influence the National PTA by sponsoring the group and later boasting internally that the PTA would “do whatever we want” regarding public messaging.

TikTok, Google, and Snapchat have not responded to these allegations.