Have social media giants been ignoring the warning signs of social media harm?

Meta CEO Mark Zuckerberg testifies at a House Financial Services Committee hearing in Washington, U.S., October 23, 2019. REUTERS/Erin Scott/File Photo

The backstory: Back in 2021, Frances Haugen, an ex-employee of Meta Platforms (the company that owns Instagram and Facebook), came forward with shocking internal documents revealing that the company was knowingly choosing profits over community safety. She also confirmed that Facebook was aware of the harm its app Instagram was causing vulnerable young people. Specifically, an internal study found that many teenage girls using Instagram were struggling with depression and anxiety because of body image issues.

More recently: In 2022, over 80 cases were brought together in a federal court in California against social media giants – primarily naming Meta and Instagram. The cases were filed on behalf of young people who say these platforms caused their anxiety, depression and eating disorders and even contributed to some suicides. These cases are still pending.

The development: But here's where it gets even crazier – an unredacted version of the court filing just revealed how much Meta and TikTok's parent company, ByteDance, allegedly knew about the dangerous effects their platforms have on minors. The filing says Meta CEO Mark Zuckerberg was personally warned and that, rather than taking proactive measures to protect children on the app, Meta defunded its mental health team (which a Meta spokesperson has said isn't true). It also says ByteDance's internal documents showed the company knew young people were more prone to falling for dangerous social media challenges.

The social media giants may rely on a 1996 law – Section 230 of the Communications Decency Act – that shields internet platforms from liability for what users post. They also have their eye on an ongoing Supreme Court case over just how culpable these companies are for hosting harmful content.

Key comments:

"In fact because this so important our company, we actually increased funding, shown by the over 30 tools we offer to support teens and families," said a Meta spokesperson regarding the allegation that the company cut mental health funding. "Today, there are hundreds of employees working across the company to build features to this effect."

"No one wakes up thinking they want to maximize the number of times they open Instagram that day," wrote one Meta employee in 2021, according to the filing. "But that's exactly what our product teams are trying to do."

"These never-before-seen documents show that social media companies treat the crisis in youth mental health as a public relations issue rather than an urgent societal problem brought on by their products," said a statement by three lawyers leading the lawsuit, Lexi Hazam, Previn Warren and Chris Seeger. "This includes burying internal research documenting these harms, blocking safety measures because they decrease 'engagement,' and defunding teams focused on protecting youth mental health."

"All sorts of services may be 'addictive' in the habit-forming sense—from television to video games to shopping for clothes—but the law does not impose liability simply for creating an activity which some may do too much, even where, as here, it allegedly results in tragedy," said Snap Inc., arguing the dismissal of a case where a mom said her daughter developed sleep deprivation and depression while using Instagram, eventually committing suicide.

"If anything, the argument that the algorithms are to blame really highlights what the plaintiffs are suing about is third-party content," said Eric Goldman, a Santa Clara University law professor.