Earlier this week, news broke that Meta, TikTok, and YouTube are all in court fighting claims that their platforms cause mental health issues in young people. That motivated me to write a piece for today’s Binary Response on the topic, because I think social media causes mental health issues for plenty of adults too. Please sign up to get our Binary Response articles directly in your inbox!
If you’ve spent years watching teenagers disappear into endless scrolls and muttering someone should do something, this week delivers. Meta, TikTok, and YouTube are on trial in Los Angeles over claims they deliberately fueled a youth mental health crisis through addictive design — think cigarettes, but digital.
The star plaintiff is K.G.M., a 19-year-old Californian who alleges these platforms hooked her as a kid, worsening her depression and suicidal thoughts through constant engagement. Her case targets the products themselves: infinite scroll that never lets you stop, autoplay queuing the next hit, push notifications that demand attention, and algorithms prioritizing session time and ad views over user sanity. Framing it as defective product design rather than a speech claim sidesteps Section 230’s free-speech protections — straight out of Big Tobacco’s playbook, which finally cracked under similar pressure.
But here’s the real story: this isn’t just a kids-these-days crisis, and pretending it is misses the bigger picture. Those same mechanics shred adult brains too. Picture a 40-year-old doomscrolling politics at 2 a.m., heart racing as they chase validation from strangers in comment threads, or spiraling into group-chat outrage that keeps them up all night. Or a 35-year-old measuring their real life — messy job, average house, normal family — against curated feeds of influencers living impossible dream lives, feeling that constant inadequacy gnaw away.
The trial spotlights youth for legal leverage because minors make sympathetic plaintiffs, but the core tricks — variable rewards like slot-machine pulls, endless social comparison, outrage amplification for stickiness — are universal, nudging adults toward the same insomnia, anxiety, depression, and low-grade panic of always falling behind in some invisible race. We’ve all felt that compulsion to refresh one more time, even when we know it’s poison.
That’s what should terrify tech execs most. The complaint invokes slot machines and nicotine explicitly, claiming tech companies embedded features to exploit kids’ vulnerabilities not despite their developing brains, but because of them — for pure profit through maximized engagement. Pull-to-refresh mimics the lever pull, autoplay delivers the next drag without pause, recommendations dial in just the right hit of dopamine to keep you locked in. It hands jurors a simple, visceral question: is the business model itself the defect? No wonder Meta tapped lawyers who battled opioid litigation, and TikTok brought in video-game addiction specialists — they see this as existential, not just a PR headache.

Here’s the core contradiction platforms can’t escape. They can’t pitch advertisers on industry-leading teen engagement and unmatched session lengths in sales decks, then stroll into court and pretend to be shocked at claims that their product is addictive. You can’t boast about keeping young eyes glued to the feed for revenue, then hide behind we’re just a neutral pipe for user speech. The lawsuit hammers this, arguing kids like K.G.M. weren’t accidental casualties but intended targets of these predatory design choices. If jurors buy that narrative, Section 230’s shield thins dramatically — not over posted content, but over how the damn thing is wired to hook you.
The companies’ playbook is in full swing: mental health is complex with many factors, we’ve added safety tools like parental controls, we care deeply about young people. Meta funds screen-smart workshops in high schools; YouTube insists it’s fundamentally different from TikTok or Instagram and shouldn’t even share the defendants’ table. They’re not entirely wrong — real-life pressures like school stress, family issues, and economic woes play roles, and no single algorithm ruins a kid on its own. But complexity isn’t a get-out-of-jail-free card after years of ruthless engagement optimization, with meaningful guardrails tacked on only when lawsuits and headlines made them unavoidable.
And honestly, the adult angle makes their it’s complicated defense look even flimsier. We grown-ups are supposed to know better — have impulse control, life experience, the ability to log off — yet we still scroll through work meetings, refresh notifications obsessively, or let one viral post turn our group chats into echo chambers of rage. The same feeds driving teen anxiety fuel our burnout, political radicalization, and endless performative outrage. When middle-aged professionals admit they can’t escape the pull, it’s clear this goes beyond bad parenting or Gen Z screen addiction — it’s a design philosophy treating human attention, at every age, as raw material to mine for ads.

The settlements tell the real story. TikTok and Snap paid undisclosed sums to bow out pre-trial, dodging the spotlight. Innocent players don’t hush up to avoid jurors seeing internal emails plotting daily active minutes spikes in the 13-17 cohort. They settle when discovery could expose a template for thousands more suits — from individuals and school districts to state AGs, and likely adults piling on next. This bellwether’s outcome shapes them all.
Tech’s favorite line? We’re mirrors reflecting society back at you. Nonsense — a mirror doesn’t ping you at midnight, auto-queue the next clip, or tweak its reflection based on what kept your eyes longest. Jurors will dissect kid-focused research, design documents, and metrics celebrating how hard it is to quit. That’s not reflection; it’s a slot machine with comments, working its magic from 16 to 46.
Parents and schools aren’t off the hook — academics, safety, finances all factor in. But tobacco pulled the same complexity dodge, blaming genes and lifestyle over their engineered product. Design for addiction, flood the world with it, profit from the vulnerable — eventually, juries tune out the safety-blog noise.
Here’s the thing, folks: This trial will test if engagement at any cost is neutral innovation or liable design. Twelve everyday people decide if K.G.M.’s suffering was random bad luck or the predictable output of systems tuned to trap her. The same systems quietly warping adult sleep, self-image, and worldview.
With that, social media is truly on trial — not in vague debates, but over its business model: maximize engagement everywhere, then feign surprise at the mental health wreckage. A verdict echoing the 90s tobacco cases? Expect code rewrites — not just for kids’ accounts, but for everyone’s.
When you and the people you know use this technology every day, your opinion might actually be worth sharing.