Dozens of US states, including California and New York, are suing Meta Platforms Inc., alleging that deliberately designed features on Instagram and Facebook lure children onto its platforms, harm young people's mental health and have contributed to a youth mental health crisis.
The lawsuit, filed in federal court in California, also alleges that Meta routinely collects data on children under 13 without parental consent, in violation of federal law.
“Meta has used powerful and unprecedented technologies to lure, engage, and ultimately ensnare youth and adults. With its motive for profit and drive to maximize financial gain, Meta has repeatedly misled the public about the inherent dangers of its social media platforms,” the complaint states, concealing the ways in which these platforms exploit and manipulate their most vulnerable users: teenagers and children.
In addition to the 33 states joining the federal lawsuit, nine other attorneys general are filing suits in their own states, bringing the total number of states taking action to 42.
“Children and young adults are experiencing record levels of poor mental health, and social media companies like Meta are to blame,” New York Attorney General Letitia James said in a statement. “Meta has taken advantage of children's pain by deliberately designing its platforms with manipulative features that make children addicted to their platforms while lowering their self-esteem.”
In a statement, Meta said it shares “the Attorney General's commitment to ensuring teens have a safe, positive experience online and has already implemented more than 30 tools to support teens and their families.”
“We are disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path,” the company added.
The wide-ranging lawsuit grew out of an investigation led by a bipartisan coalition of attorneys general from California, Florida, Kentucky, Massachusetts, Nebraska, New Jersey, Tennessee and Vermont. It follows damning newspaper reports, first published by The Wall Street Journal in the fall of 2021, based on Meta's own research showing the company was aware of the harm Instagram could cause to teenagers, especially teenage girls, with regard to mental health and body image. In one internal study, 13.5 percent of teenage girls said Instagram worsened suicidal thoughts, and 17 percent of teenage girls said it worsened eating disorders.
Following those initial reports, a consortium of news organizations, including The Associated Press, published its own findings based on documents leaked by whistleblower Frances Haugen, who testified before Congress and a British parliamentary committee about what she had found.
Social media use among teenagers is almost universal in the US and many other parts of the world. According to the Pew Research Center, 95 percent of 13- to 17-year-olds in the U.S. report using a social media platform, with more than a third saying they use social media “almost all the time.”
To comply with federal regulations, social media companies prohibit children under 13 from signing up on their platforms — but children have been shown to easily circumvent the bans, with or without parental consent, and many young children have social media accounts.
Other measures that social platforms have taken to address concerns about children's mental health are also easily circumvented. For example, TikTok recently introduced a default 60-minute time limit for users under 18. But once the limit is reached, minors can simply enter a passcode to continue watching.
In May, U.S. Surgeon General Dr. Vivek Murthy urged tech companies, parents and educators to take “immediate action to protect children now” from the dangers of social media.
Associated Press writers Maisun Khan in New York and Ashraf Khalil in Washington contributed to this story.