NEW YORK (Reuters) – A former employee of Meta, the parent company of Facebook and Instagram, testified before a U.S. Senate subcommittee on Tuesday, alleging that the company was aware of the harassment and other harms faced by teenagers on its platforms but failed to address these issues.
Arturo Bejar, the former employee, worked on well-being for Instagram from 2019 to 2021 and previously served as a director of engineering for Facebook’s Protect and Care team from 2009 to 2015.
Bejar provided his testimony before the Senate Judiciary Subcommittee on Privacy, Technology, and the Law during a hearing focused on social media’s impact on the mental health of teenagers.
In written remarks made available before the hearing, Bejar stated, “It’s time that the public and parents understand the true level of harm posed by these ‘products,’ and it’s time that young users have the tools to report and suppress online abuse.”
Bejar’s testimony aligns with a bipartisan effort in Congress to pass legislation that would require social media platforms to offer parents tools to protect their children online.
Bejar told the hearing that his goal while at Meta was to influence the design of Facebook and Instagram to encourage positive behavior among users and to give young people tools to manage unpleasant online experiences.
Meta said in a statement that it is committed to protecting young people online, pointing to its backing of the user surveys cited in Bejar’s testimony and its creation of tools such as anonymous notifications of potentially hurtful content. “Every day countless people inside and outside of Meta are working on how to help keep young people safe online,” the statement read. “All of this work continues.”
Bejar informed senators that he had regular meetings with senior executives at the company, including Chief Executive Mark Zuckerberg, and initially found them supportive of the work being done. However, he later concluded that the executives had consistently chosen not to address the issue.
Bejar cited a 2021 email in which he flagged to Zuckerberg and other top executives internal data showing that 51% of Instagram users had reported a bad or harmful experience on the platform within the previous seven days. Among users aged 13-15, 24.4% reported receiving unwanted sexual advances.
A separate document showed that 13% of all 13-to-15-year-old Instagram users surveyed had experienced unwanted advances.
Bejar also told the executives that his own 16-year-old daughter had received misogynistic comments and obscene photos on the platform without adequate tools to report them. The email’s existence was first reported by the Wall Street Journal.
During his testimony, Bejar recounted a meeting at which Meta Chief Product Officer Chris Cox was able to cite precise statistics on teen harm off the top of his head. Bejar said he found this heartbreaking because it suggested the company knew about the issues but had not taken appropriate action.

Last week, Bejar met with two senators who sponsor the Kids Online Safety Act, sharing evidence that Meta executives were aware of harm to young people on the company’s platforms but failed to act on it.