States Unite in Lawsuit Against Meta, Claiming the Addictive Nature of Its Platforms Is Triggering a Mental Health Crisis Among Younger Users

Taylor Gilbertson
News & Insights

In a case that could reshape the digital landscape, Meta, the parent company of Facebook and Instagram, is caught in a contentious legal battle against several states. In People of the State of California v. Meta Platforms, Inc. et al., more than twenty-five attorneys general allege that Meta intentionally engineered addictive features on its platforms to maximize young users’ time on its applications––fueling a mounting mental health crisis across the United States.[1] The attorneys general also claim that Meta violated the Children’s Online Privacy Protection Act (“COPPA”) and engaged in various state-specific unfair business practices.[2]

According to the lawsuit, Meta devised a four-part scheme to exploit younger users for profit on its Facebook and Instagram platforms.[3] First, Meta created a business model targeting young users to increase their time spent on its social media applications.[4] For example, Meta tracks users’ data and activity to provide consumers with tailored recommendations.[5] These recommendations are displayed on the home or explore page from the moment a user accesses the platform to encourage compulsive use.[6] Second, Meta deliberately constructed psychologically harmful features to fuel compulsive and extended use among its younger users.[7] For example, Meta’s “filters,” which alter the appearance of users’ faces, have been linked to body dysmorphia, eating disorders, anxiety, and depression.[8] Additionally, Meta’s “like” feature, which tracks the number of people who favor a user’s post, can be particularly harmful to younger users because their developing brains are more susceptible to the resulting dopamine rush, which induces the desire for prolonged interaction with the platforms.[9] Third, Meta routinely publishes misleading reports that tout low rates of user harm and thereby misrepresent the safety of its sites.[10] Finally, Meta has refused to abandon these practices despite an abundance of research and expert analysis suggesting that these features cause real harm to young users.[11]

Additionally, the attorneys general claim that Meta breached its duties under COPPA.[12] Meta’s alleged noncompliance stems from a federal requirement that companies obtain parents’ informed consent before collecting their children’s data and personal information.[13] The complaint alleges that Meta remains non-compliant on both its Instagram and Facebook platforms.[14] The attorneys general argue that Meta’s conduct constitutes deceptive and unfair or unconscionable practices in violation of various state laws, and the states are seeking relief declaring Meta’s design features unlawful under state consumer protection laws[15] and mandating Meta’s COPPA compliance.[16]

Meta challenges these allegations, stating that it takes measures to ensure platform safety for young users.[17] Meta’s Public Affairs Director Nkechi Nneji said that the company collaborated with both parents and experts to develop more than thirty tools designed to safeguard teens using its apps.[18] She added that Meta employs countless individuals who are solely dedicated to keeping young people safe online.[19] In a statement, the company said, “[w]e [are] disappointed that instead of working productively with companies across the industry to create clear, age-appropriate standards for the many apps teens use, the attorneys general have chosen this path.”[20]

Conversely, Arturo Bejar, who was part of Facebook’s Protect and Care Team and Instagram’s Well-Being Team––both focused on safeguarding young people––recently disputed Meta’s safety claims at a Senate Judiciary subcommittee hearing. Bejar stated that the algorithms for Facebook and Instagram promote bullying, drug abuse, eating disorders, and self-harm, and that Meta’s top executives failed to respond when he raised these alarms.[21]

If Meta is unable to dismiss the case, it is likely to invoke Section 230 as a defense against liability. Section 230 of the 1996 Communications Decency Act states that “no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”[22] This provision typically shields major companies from lawsuits over their content moderation decisions.[23] Nevertheless, Section 230 may not be a sufficient defense here because this case concerns algorithmic design features––not content moderation. Jeff Kosseff, a law professor and the author of Liar in a Crowded Theater: Freedom of Speech in a World of Misinformation, recently commented on the possibility of a Section 230 defense.[24]

“I think that it’s a very close call as to whether Meta would succeed with a Section 230 defense,” said Kosseff.[25] “Courts are increasingly willing to conclude that Section 230 is not a defense in lawsuits arising from claims about product design, though the line is not always clear.”[26]

Indeed, U.S. District Court Judge Yvonne Gonzalez Rogers rejected arguments from Meta and Alphabet (which operates Google and YouTube) that Section 230 established immunity from similar lawsuits.[27] Judge Rogers explained that the companies could be held liable for failing to design reasonably safe products and to warn users of potentially harmful defects.[28]

Aside from this case, Meta faces additional legal scrutiny from eight more attorneys general who are bringing similar suits, as well as an individual suit from the state of Florida, for a combined total of forty-one lawsuits against Meta filed by attorneys general.[29]

Regardless of the outcome, this case is saturated with complexity and raises important questions about consumers’ future relationship with Big Tech companies like Meta. In a world where companies can create addictive algorithms with impunity, the ramifications, especially for minors, could be devastating. Whether innovation interests and concerns about overregulation can be balanced against the interests of young users remains to be seen.

[1] See generally Pl.’s Compl., People of the State of California v. Meta Platforms, Inc. et al., Case No. 4:23-cv-05448 (N.D. Cal. 2023) [ECF No. 1].

[2] Id. at 2.

[3] Id. at 1.

[4] Id. 

[5] Id. at 28.

[6] Id.

[7] Id. at 1.

[8] Id. at 23, 56-57.

[9] Mike Snider, 41 States Sue Meta, alleging that Instagram and Facebook are harmful, addictive for kids, USA Today (Nov. 17, 2023).

[10] Compl., supra note 1, at 1.

[11] Id.

[12] Id. at 3.

[13] Id.

[14] Id. at 4.

[15] Bobby Allyn, States sue Meta, claiming Instagram, Facebook fueled youth mental health crisis, NPR (Nov. 10, 2023).

[16] Compl., supra note 1, at 145.

[17] Nick Barclay, Dozens of States Sue Meta over youth mental health crisis, The Verge (Nov. 17, 2023).

[18] Dara Kerr, Meta failed to address harm to teens, whistleblower testifies as senators vow action, NPR (Nov. 10, 2023).

[19] Id.

[20] Allison Morrow, Meta has managed to get 33 states to agree on something, CNN (Nov. 10, 2023).

[21] Kerr, supra note 18.

[22] Barbara Ortutay, What you should know about section 230, the rule that shaped today’s internet, PBS News (Nov. 10, 2023).

[23] Brian Fung, Federal judge hints that Big Tech companies may have to face consumer allegations of mental health harm, CNN (Nov. 10, 2023).

[24] Allyn, supra note 15.

[25] Id.

[26] Id.

[27] Nate Raymond and Jonathan Stempel, Social media companies must face youth addiction lawsuits, US judge rules, Reuters (Nov. 17, 2023).

[28] Id.

[29] Ortutay, supra note 22.