Social media companies are under fire lately, and Section 230 may no longer provide a solid defense.
This week, Meta was slapped with eight complaints, filed in as many states, alleging that its algorithms have contributed to mental health issues such as eating disorders, sleeplessness and suicidal thoughts or tendencies in younger users. The complaints allege that excessive time on Instagram and Facebook poses serious risks to mental health, with one plaintiff claiming that Meta “misrepresented the safety, utility, and non-addictive properties” of its platforms.
Rather than going after the content itself, these complaints target the algorithms that serve it to users, an approach meant to sidestep Sec. 230, which shields platforms like Meta from legal liability for third-party content posted on their services.
A federal appeals court ruled last year that Snap could be held accountable for its Speed Filter, which allegedly encouraged reckless driving and caused a fatal car crash in 2017. That ruling opened the door for lawsuits like the ones filed against Meta, said Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University. The plaintiffs in the Snap case argued that the Speed Filter wasn't third-party content but a design choice Snap itself had made. Because the court ruled that Sec. 230 didn't protect Snap in that case, others are attempting to work around the law in similar fashion.
But the cases against Meta, Goldman argued, are “qualitatively different” from the Snap ruling, because here the algorithm and the content it serves are “all the same thing.”
“This idea that we can distinguish between dangerous software and dangerous third-party content on software is in my mind an illusion,” he told Protocol. “The algorithm only directs people to see content. Ultimately, it's the content that's the problem. Then we're back to the fact that that's really a Sec. 230 lawsuit.”
Goldman also noted that the lawsuits against Meta, filed in Texas, Tennessee, Colorado, Delaware, Florida, Georgia, Illinois and Missouri, were spread out in the hopes that at least one of the eight judges overseeing the cases would side against the company. If one judge rules in Meta's favor, Goldman said, the rest are likely to follow suit. The question remains whether any judge will buy the plaintiffs' argument.
“Basically, it's like a lottery,” said Goldman. “You only really need to win one in order to open up a very, very big door for future litigation.”
Legislative moves
Regardless of whether Sec. 230 is in play, social media companies aren't off the hook for platform addiction. A bill passed by the California State Assembly in late May would give parents the right to sue social media platforms if their children become addicted.
Several platforms are also being investigated by attorneys general all over the country. In November, a coalition led by state AGs from California, Florida, Kentucky, Massachusetts, Nebraska, New Jersey, Tennessee and Vermont began investigating Instagram for the ways that it keeps young people engaged. The group extended its investigation to include TikTok in March.
Facebook whistleblower Frances Haugen also testified to Congress last year that Meta wasn’t forthcoming about Instagram’s effects on young people, even after internal research showed that the app exacerbated mental health issues for teen girls in particular.
“They include addictive design, features, algorithmic amplifications of disturbing content,” Common Sense Media CEO Jim Steyer told me. “Those are just some of the tactics that social media platforms like Meta use.”
Amid public pressure, social media companies are showing signs of trying to limit user addiction. TikTok announced Thursday that it's rolling out more screen time controls to help users limit the amount of time they spend scrolling. Instagram implemented similar daily time limits, but quietly removed mobile users' ability to set a daily reminder below 30 minutes. Though these moves look good on the surface, such time limits are easy to bypass and may be little more than face-saving gestures from companies whose business depends on keeping users on their platforms for as long as possible.
If states continue to pass legislation holding social media platforms responsible for the content posted on their services, Congress or the Supreme Court may step in to amend or clarify Sec. 230. That could reshape the way social media companies operate altogether.
“We're at a watershed moment, and in the next few years, we are finally going to see major action on multiple fronts,” Steyer said.