A Los Angeles jury has ruled that Meta and Google deliberately engineered their platforms to be addictive, finding both companies liable for the mental health harm suffered by a young woman who began using Instagram at the age of nine and YouTube at six.
The plaintiff, identified only as Kaley and now aged 20, was awarded three million dollars — a verdict her legal team described as sending an unmistakable message that no company stands above accountability when children are involved. Snap and TikTok, which were originally named as defendants, reached undisclosed settlements with Kaley before the case went to trial.
Jurors apportioned responsibility unevenly between the two remaining defendants, holding Meta 70 per cent liable and Google's YouTube 30 per cent. A separate determination on punitive damages, which under California state law could amount to as much as thirty million dollars, is still to be made by the court.
Kaley told the jury she had encountered no age-verification barriers when she first accessed either platform as a young child. She described withdrawing from family life as her usage intensified, and said she began experiencing anxiety and depression from the age of ten — conditions she was formally diagnosed with years later. She has since also been diagnosed with body dysmorphia, which her lawyers linked in part to her early and prolonged exposure to Instagram’s image-altering filters.
Central to Kaley’s case was the argument that features such as infinite scroll were not incidental but were deliberately constructed to maximise engagement and retain younger users, who were seen as more likely to remain on platforms long-term. Lawyers drew on testimony from former Meta executives and expert witnesses to support that claim.
Mark Zuckerberg appeared before the jury in February and pointed to Meta’s stated policy of barring users under the age of 13. When confronted with internal research suggesting the company was aware that younger children were nonetheless using its platforms, he said he had always wanted faster progress on age identification, and maintained the company had reached the right position over time.
Adam Mosseri, head of Instagram, was also questioned during proceedings. When told that Kaley’s longest single session on the platform had lasted sixteen hours, he declined to characterise it as evidence of addiction, describing it instead as “problematic”.
In a statement following the verdict, Meta said it disagreed with the outcome and was considering its legal options. Google had not responded to requests for comment at the time of publication.
The ruling came a day after a separate jury in New Mexico found Meta liable for exposing children to explicit content and contact with predators through its platforms — a sequence of decisions that analysts say reflects mounting public frustration with the social media industry. Research director Mike Proulx of Forrester described the back-to-back outcomes as evidence of a “breaking point” between these companies and wider society.
A further case against Meta and other platforms is scheduled to begin in a California federal court in June.
