Arguments in a Landmark Social Media Addiction Trial Start Next Week. This Is What’s at Stake

Annalee Schott used to live in rural Colorado where the farm, the barn, and the horses were her happy place. But online she was drawn into a dark world. The 18-year-old’s TikTok algorithm allegedly presented her with content—including a live suicide on her “For You” page—that impacted her self-worth and exacerbated her anxiety and depression. She was so addicted to social media that her mother, Lori Schott, says she would have to lock her daughter’s smartphone in the car.

In 2020, Annalee died by suicide. Six years later, Lori is one of the approximately 1,600 plaintiffs from all over the country who have filed lawsuits alleging that Meta, Snap, TikTok, and YouTube built addictive products that led children into depression, self-harm, and other mental health issues. The cases have been filed by over 350 families and 250 school districts. The first of them—that of a 20-year-old woman who goes by the identifier K.G.M.—is expected to go to trial next week, with opening statements scheduled in front of a jury in Los Angeles. The trial may last six to eight weeks.

“It is a time that we have all been fighting for, and it’s a time that is owed to us to get answers from these companies on how they designed these platforms to addict our kids,” Schott told WIRED, echoing what is alleged in these lawsuits. “This trial isn’t just about Annalee. It’s about every child that was lost or harmed, and these companies knew the decisions they made put our kids’ lives at risk every single day.”

This is the first time major social media companies will face a jury trial for the alleged impact of their design on users—in this case, young ones. Legal experts say that similar cases have often been dismissed at early stages because of Section 230, a law that offers social media companies immunity from liability related to the user-generated content posted on their platforms.

“The fact that we are simply able to start a trial is a monumental victory on behalf of families,” Matthew Bergman, founder of the Social Media Victims Law Center and an attorney representing around 1,200 plaintiffs, told WIRED as he stood outside the Los Angeles courthouse. “We will expect testimony from the corporate executives at the highest level, we will expect documents that have never seen the light of day to be made public, we will expect the social media companies to blame everybody except themselves.”

K.G.M.’s is the first lawsuit picked by the court as a so-called “bellwether” trial. Bellwether trials typically occur when a large number of plaintiffs have filed lawsuits against the same defendant (or defendants) for harm from the same products. A small number of cases are handpicked as test cases to be representative of the larger pool of plaintiffs. The goal of such trials is to help forecast how the litigation of the remaining cases might unfold.

This case has gotten this far because it’s built on an argument that tries to sidestep Section 230. The plaintiffs’ focus is not liability for the content itself, but the alleged business decisions that shape these platforms. If the legal argument in this trial proves successful, experts believe it could force social media companies to prioritize safety in a way they have not to this point.

“This is going to be the first time a jury is going to hear arguments about what the social media companies knew about the risks of the design of their platforms and how they acted on the types of information they had,” says Haley Hinkle, policy counsel at Fairplay, an organization that works to protect kids from Big Tech. The jury will ultimately decide, she says, whether the companies were negligent, if they contributed to mental health harms, and if they should have warned young users about the risks.

Google and Meta both deny the allegations in the complaint. “Providing young people with a safer, healthier experience has always been core to our work,” said Google spokesperson José Castañeda in a statement. “In collaboration with youth, mental health, and parenting experts, we built services and policies to provide young people with age-appropriate experiences, and parents with robust controls.”

“For over a decade, we’ve listened to parents, worked with experts and law enforcement, and conducted in-depth research to understand the issues that matter most,” said Meta spokesperson Stephanie Otway in a statement. “We use these insights to make meaningful changes—like introducing Teen Accounts with built-in protections and providing parents with tools to manage their teens’ experiences.”

The Bellwether Case

K.G.M. started watching YouTube at the age of six, had an Instagram account at 11, got on Snapchat at 13, and joined TikTok a year later—with each app allegedly furthering “her spiral into anxiety and depression, fueled by low self-esteem and body dysmorphia,” according to her attorney Joseph VanZandt. She and her mother, Karen Glenn, filed a lawsuit against Meta, Google’s YouTube, Snap, and TikTok alleging that features such as “autoplay” and “infinite scroll” contributed to her social media addiction, and that her social media use contributed to her anxiety and depression, making her feel more insecure about herself. (Snap and TikTok settled the case with K.G.M. before the trial. Terms were not disclosed.)

Glenn testified last year that she did not realize the harm these platforms could do to her daughter, and that she would not have given her a phone had she known about these harms. Bergman says K.G.M.’s lawsuit was chosen as the bellwether case because she is “representative of so many other young women who have suffered serious mental health harms and emotional ailments and disturbances as a consequence of social media.”

“The goal of the attorneys bringing these cases is not just to prevail and receive compensation for their individual clients,” says Benjamin Zipursky, a law professor at Fordham University School of Law. “They aim to get a series of victories in this sampling of so-called ‘bellwether trials.’ Then they will try to pressure the companies into a mass settlement in which they pay out potentially billions of dollars and also agree to change their practices.”

K.G.M.’s is the first of 22 such bellwether trials to be held in the Superior Court of Los Angeles. An outcome in favor of the plaintiff could give the remaining roughly 1,600 litigants significant leverage—and potentially force tech companies to embrace new safeguards. The trial also promises to raise broader awareness about social media business models and practices. “If the public has a very negative reaction to what emerges, or what a jury finds, then this could affect legislation at the state or federal level,” Zipursky adds.

Bergman, who spent 25 years representing asbestos victims, says this trial feels like history repeating itself. “When Frances Haugen testified in front of Congress and for the first time revealed what social media companies know their platforms are doing to vulnerable young people, I realized that this was asbestos all over again,” says Bergman.

Dividing Lines

Drawing parallels to product liability cases against Big Tobacco and the automotive industry, the plaintiffs’ principal argument is that major tech companies designed their social media platforms in a negligent manner, meaning they did not take reasonable steps to avoid causing harm. “Specifically, the plaintiffs are arguing that design features such as infinite scroll and autoplay caused certain injuries to minors, including disordered eating, self-harm, and suicide,” says Mary Anne Franks, a law professor at George Washington University.

On the other side, the tech companies will likely focus on causation and free speech defenses. “The defendants will argue that it was third-party content that caused the plaintiffs’ injuries, not the access to this content that was provided by the platforms,” says Franks. The companies will also likely argue, she says, “that to the extent the companies’ decision-making about content moderation is implicated, that decision-making is protected by the First Amendment,” citing the US Supreme Court’s 2024 ruling in Moody v. NetChoice.

One of the major arguments between the two parties will likely be over the term “social media addiction” itself, says Eric Goldman, a law professor at Santa Clara University. “There is no medical or psychological definition of social media addiction that has been widely recognized, and there’s no legal standard that recognizes the addiction,” says Goldman. “And so the parties are going to fight fiercely over whether or not there is even a thing called social media addiction.”

He further claims that even if social media addiction exists, in this case the users were “addicted to talking to each other online” and to content not created by the platforms themselves—which is why he believes that “Section 230 casts a very long shadow.” “The plaintiffs have tried to structure claims as if they were suing based on the social media services’ design choices,” he says. “But those design choices were about how to gather, organize, and disseminate third-party content, and that’s why Section 230 may very well be in play.”

Bergman, however, argues that Section 230 is outdated and doesn’t cover the complexities of today’s online world. “Section 230 was enacted when Netscape was the largest browser, Google didn’t exist, and Mark Zuckerberg was in junior high school,” says Bergman. “So it has unfortunately given rise to this sense of impunity within the social media industry, which is the only way one can understand how they ever would have designed such dangerous platforms knowingly.”

Experts say that while there isn’t clear evidence that social media is a causal factor in mental health problems, that doesn’t mean the platforms don’t need reform. “The actual scientific literature from the mental health world is quite nuanced and complicated, and doesn’t necessarily say that, on average, there is an addictive quality or that it’s causing mental health conditions,” says Jessica Schleider, founding director of the Lab for Scalable Mental Health at Northwestern University. “But that doesn’t discount the experiences of the individuals in this case.”

For Lori Schott, the fact that there is a trial feels like a victory already. “This is our day in court,” she said. “This is the court of public opinion—there’s a judge in there who has been guiding this case along; we have attorneys that have become our life to fight this. We have already won.”

If you or someone you know needs help, call 1-800-273-8255 for free, 24-hour support from the National Suicide Prevention Lifeline. You can also text HOME to 741-741 for the Crisis Text Line. Outside the US, visit the International Association for Suicide Prevention for crisis centers around the world.
