A landmark trial begins
For the first time, major social media companies are being forced to defend themselves before a jury over claims that their platforms harm young people’s mental health. Starting Tuesday in Los Angeles, TikTok, Meta and Google’s YouTube will face a lawsuit brought by a 19-year-old identified as KGM and her mother, Karen Glenn.
The case alleges that the companies knowingly designed addictive features that damaged KGM’s mental health and contributed to self-harm and suicidal thoughts. Snap, originally named as a defendant, settled last week on undisclosed terms.
Why this case matters
The lawsuit seeks unspecified monetary damages, but its implications extend far beyond a single family. KGM’s case is one of several bellwether trials in a multi-district litigation involving roughly 1,500 similar personal injury claims against TikTok, Meta, YouTube and Snap.
The verdict could shape how these cases are resolved nationwide, expose the companies to billions of dollars in damages and force changes to how their platforms are designed.
Claims of addictive design and harm
According to court filings, KGM began using social media at age 10, despite her mother’s efforts to block access. The lawsuit alleges that platform designs enabled minors to bypass parental controls and encouraged compulsive use through features such as endless scrolling, frequent notifications and algorithmic recommendations.
The complaint argues that these design choices coincided with a deterioration in KGM’s mental health, including anxiety, depression, body image issues and self-harm. It also alleges that Instagram and Snapchat facilitated interactions with strangers, including predatory adults.
KGM further claims she was bullied and sextorted on Instagram, and that Meta was slow to respond even after repeated reports from her family and friends.
Industry response and legal defenses
Tech companies have long rejected claims that social media directly harms youth mental health, citing a lack of definitive causal research and pointing to the benefits of connection and entertainment. They have also relied heavily on Section 230, a federal law that shields platforms from liability over user-generated content.
However, the judge overseeing the case has indicated that jurors should focus not just on content, but on whether product design features themselves contributed to harm.
Safety features under scrutiny
In recent years, the companies involved have rolled out a range of youth safety and parental control tools. Meta highlights its teen accounts, default privacy settings and AI-based age detection. YouTube points to content restrictions, parental controls and new options to limit short-form video scrolling. TikTok has introduced default privacy settings, late-night notification limits and features aimed at reducing excessive use.
Despite these measures, parents and advocates argue the changes fall short and came only after years of warnings and mounting evidence of harm.
A potential turning point
Advocates compare the proceedings to the historic tobacco trials, framing them as a long-awaited moment of accountability for Big Tech. Over the coming weeks, top executives from Meta, TikTok and YouTube are expected to testify, placing corporate decisions under direct public scrutiny.
As the trial unfolds, a jury will decide whether these platforms merely hosted harmful experiences or actively engineered them. The outcome could mark a turning point in how social media companies are regulated and held responsible for the impact of their products on young users.
