A jury in Los Angeles has found Meta and Google liable in a lawsuit filed by a woman identified in court as Kaley. She claims she developed an addiction to Instagram and YouTube after using the platforms since childhood. This case is among the first to hold social media companies responsible based on how their platforms are designed, rather than on user-generated content.
Jurors concluded that both companies acted negligently and did not warn users about the risks of long-term platform use. They also determined that certain design features, such as recommendation systems, push notifications, and auto-play options, played a role in the mental health issues the plaintiff reported.
Damages Awarded In The Meta And Google Social Media Addiction Case
The jury awarded three million dollars in compensatory damages and found that punitive damages were also warranted. Responsibility was shared between the two companies, with Meta assigned the larger portion. A final punitive damages figure has not yet been confirmed.
In a separate case, a jury in New Mexico ordered Meta to pay 375 million dollars after finding violations related to child safety protections. Mark Zuckerberg testified during the Los Angeles proceedings, and internal company documents were presented as evidence.
Legal Arguments From Both Sides In The Los Angeles Trial
The plaintiff’s legal team argued that the platforms were designed to promote compulsive use, which made it difficult for younger users to disconnect. Kaley shared her experience of developing body dysmorphia, depression, and suicidal thoughts after years of nearly constant use of both platforms.
Lawyers representing Meta and Google countered that the plaintiff’s mental health issues stemmed from her personal circumstances rather than platform use. They also questioned whether social media addiction is an officially recognized medical condition.
Broader Legal Context And What The Verdict Could Mean Next
The case is part of a broader legal strategy focused on how platforms are designed, rather than on what users post. This approach appears to circumvent legal protections under Section 230 of the Communications Decency Act, which generally shields platforms from liability for third-party content.
Hundreds of similar cases are pending throughout the United States, filed by parents, school districts, and state officials. Some earlier cases involving TikTok and Snap were settled before this trial concluded.
The verdict in Los Angeles could influence how courts in other ongoing cases view the connection between platform design choices and user harm. Meta and Google have not yet stated whether they plan to appeal the ruling.