Yesterday, a Los Angeles jury delivered a verdict that parents, advocates, and child safety organizations have been waiting years to see: Meta (Instagram) and Google (YouTube) were found liable for the harm their platforms caused a young woman who became addicted to social media as a child. This is historic. And if you're a parent who has ever worried about what your child is experiencing on their phone — this one's for you.
What Was the Trial About, and What Happened?
The case centered on a young woman, now 20, identified as Kaley, who began using YouTube at age 6 and Instagram at age 9. By the time she was a teenager, she was on social media "all day long," and she says the addiction exacerbated her depression and suicidal thoughts. Her legal team argued that Meta and YouTube deliberately engineered their platforms with features designed to hook young users — think infinite scroll, autoplay, and algorithmic notifications — and that the companies knew they were causing harm and did it anyway.
The trial wasn't about what kids see on social media. It was about how these platforms were built. That's a crucial distinction. And the jury agreed.
After more than 40 hours of deliberation across nine days, the jury found both companies negligent. It ordered Meta and YouTube to pay $6 million total in damages, including punitive damages, after finding that the companies had acted with "malice, oppression, or fraud."
Internal documents presented during the trial were damning. One Meta memo read: "If we wanna win big with teens, we must bring them in as tweens." Another showed 11-year-olds returning to Instagram again and again, despite the platform's own rules requiring users to be at least 13.
Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri both testified. TikTok and Snap, originally named in the suit, settled before the trial began.
This is the first time in U.S. history that social media companies have faced trial and been found liable for addictive platform design. It won't be the last.
Why This Matters — Like, Really Matters
We know $6 million is a rounding error for companies worth hundreds of billions of dollars. But that's not the point.
The point is that the dam is breaking.
This case was selected as a bellwether trial, meaning its outcome is designed to guide the resolution of more than 1,600 similar lawsuits currently pending against social media companies. Losses in those cases could put the tech giants on the hook for billions of dollars and, more importantly, force them to fundamentally change how their platforms are built.
Some are already comparing this to the Big Tobacco litigation of the 1990s: a legal crusade that eventually forced an entire industry to stop targeting children with advertising. That fight took decades. But once the first domino fell, everything changed.
And the dominoes are falling fast. Just one day before the LA verdict, a separate jury in New Mexico ordered Meta to pay $375 million for failing to protect children from predators on Instagram and Facebook. California's Attorney General is preparing for his own trial in August. A federal trial involving school districts and parents from across the country is set to begin this summer.
For every parent who was told they were overreacting. For every family who watched their child struggle with anxiety, with body image, with the impossible pull of a screen they can't put down, and couldn't get anyone to listen. For every family that lost a child to the harms of these platforms and had to fight for justice in the middle of unimaginable grief. This verdict is for you. You were right. And the courts are finally starting to agree.
So What's Next? What Do Parents Need to Know?
Here's the honest truth: this is the beginning of accountability, not the end of the fight. Meta and YouTube have both said they plan to appeal. More trials are coming. The legal process is slow. And in the meantime, kids are still scrolling.
So while we celebrate this moment, parents shouldn't wait for the courts to protect their kids. Here's what you can do right now:
1. Have the conversation. Use this verdict as an opening. Tell your kids what happened and why it matters. Explain that the very features that make these apps feel impossible to put down were designed that way on purpose — it's not a willpower issue; it's an engineering one. That reframe can be really powerful for kids who feel ashamed of how much time they spend online.
2. Set real limits. Screen time limits and app restrictions aren't just good parenting; they're now court-validated common sense. The jury heard extensive testimony about infinite scroll, autoplay, and notification design. If those features are harmful enough to argue about in court, they're worth limiting at home.
3. Know what your child is actually experiencing. These platforms are designed to keep kids engaged at all costs. That means the content that gets surfaced isn't neutral — it's optimized to provoke emotion. Depression, anxiety, body image struggles, and radicalization can all be accelerated by algorithmic feeds. Staying involved in your child's digital life isn't helicopter parenting; it's protection.
4. Stay informed. More verdicts are coming. More legislative changes are likely. Follow organizations that are actively working on this so you know what's changing and how it affects your family.
How Bark Helps
Bark exists because we believe parents shouldn't have to choose between giving their kids the independence they need and keeping them safer online. The trial made clear that these platforms were designed to be addictive — and that the companies knew children were being harmed. Bark helps bridge the gap between what platforms should be doing (and aren't) and what families need to help keep their kids safe online today.
Our monitoring technology works in the background across texts, email, and 30+ social media platforms and apps, alerting you only when something concerning is detected, like signs of depression, anxiety, cyberbullying, sexual content, or predatory behavior. This way, you don't have to read every message, but you're getting notified when it matters.
We also offer screen time management, website and app blocking, and location sharing tools, so you can build healthy digital habits alongside your kids, not just react when things go wrong.
The verdict doesn't make social media safe. But Bark helps make it more manageable for your family, right now, while the legal and legislative battles continue to play out. Check out our suite of parental control products to learn how Bark can help your family.
The fight isn't over, but this verdict is a huge step in the right direction. And we're with you in this, every step of the way.
Bark helps families manage and protect their children’s digital lives.
