As the 2023 legislative season draws to a close, Congress appears no closer to passing comprehensive federal privacy legislation: no comprehensive privacy bill has managed to gain the momentum that the American Data Privacy and Protection Act (the “ADPPA”) did last year, and neither has the major child-specific privacy legislation currently under consideration (most prominently, the Kids Online Safety Act (“KOSA”) and the Children’s Online Privacy Protection Act 2.0 (“COPPA 2.0”)). Efforts to establish new online safety protections for children and teens have likewise faced roadblocks: a federal judge blocked California’s novel Age-Appropriate Design Code (the “CA AADC”) from coming into effect pending the resolution of a lawsuit filed by NetChoice asserting that the CA AADC violates the First Amendment.
In the face of this legislative stalemate, state Attorneys General (“state AGs” or “AGs”) are seeking to protect kids and teens online by testing a new legal strategy: suing social media companies for allegedly designing their platforms to be intentionally addictive for young users, causing those users to spend more time on social media sites or to use them in ways that run against their best interests or intentions. On October 24, 2023, 41 state AGs sued Meta, alleging that the company represented its platforms as safe for young users even while intentionally designing them to be addictive for those same users. These claims are likely inspired by widespread reporting on addictive user experience (“UX”) design, as well as revelations from whistleblowers who have accused Meta of looking the other way in the face of evidence about the negative impact its platforms can have on children and teens.
The lawsuit specifically asserts that the following features, many of which are commonly used across apps and platforms, help foster addiction in young people:
- “Algorithmic recommendation and sequencing;”
- “Public display and quantification of engagement metrics such as Likes;”
- “Face and body image manipulation filters;”
- “Disruptive audiovisual and haptic alerts;”
- “Infinite scroll and autoplay formats;”
- “Permitting and encouraging users to create multiple accounts; and”
- “‘Ephemeral’ presentation of social content.”
The lawsuits also accuse the company of collecting children’s data without first obtaining parental consent in violation of the existing Children’s Online Privacy Protection Act (“COPPA”).
Critics contest many of the claims made in the lawsuits, arguing that there is “no evidence of any actual causal connection” between social media use and mental health issues among young people. They also suggest that the complaint goes too far in asserting that algorithmic feeds are a form of manipulative or addictive design, arguing that a wide range of users generally prefer algorithmically sorted feeds over their chronological counterparts, which can be overwhelmed by spam and low-quality information.