Why Twitch’s 'hate raid' lawsuit isn’t just about Twitch

When is it OK for tech companies to unmask their anonymous users? And when should a violation of terms of service get someone sued?


The case Twitch is bringing against two hate raiders is hardly black and white.

Photo: Caspar Camille Rubin/Unsplash

It isn't hard to figure out who the bad guys are in Twitch's latest lawsuit against two of its users. On one side are two anonymous "hate raiders" who have allegedly been bombarding the gaming platform with abhorrent attacks on Black and LGBTQ+ users, using armies of bots to do it. On the other side is Twitch, a company that, for all the lumps it's taken for ignoring harassment on its platform, is finally standing up to protect its users against persistent violators whom it's been unable to stop any other way.

But the case Twitch is bringing against these hate raiders is hardly black and white. For starters, the plaintiff here isn't an aggrieved user suing another user for defamation on the platform. The plaintiff is the platform itself. Complicating matters more is the fact that, according to a spokesperson, at least part of Twitch's goal in the case is to "shed light on the identity of the individuals behind these attacks," raising complicated questions about when tech companies should be able to use the courts to unmask their own anonymous users and, just as critically, when they should be able to actually sue them for violating their speech policies.

"Normally, what happens when someone violates your terms of service is you boot them off the service," said Aaron Mackey, senior staff attorney for the Electronic Frontier Foundation. "This lawsuit, while ostensibly good because it's aimed at bad actors who have done pretty awful things, is an escalation."

A Twitch spokesperson emphasized that the company "respect[s] anonymity" and that Twitch is using "all of our tools in the tool kit to stop the attack — including a legal claim."

"Our intent is not to reveal the actors' names but to identify these individuals for law enforcement or to compel them to cease attacking our community," the spokesperson said.

Tech companies sue their own users all the time for violating rules against spam, selling counterfeit goods and even data scraping. The Twitch suit has some of that, taking aim at the two anonymous users — known as CruzzControl and CreatineOverdose — for allegedly using bots to assault users with slurs and other offensive speech. But it's also coming after these users for the offensive speech itself, which Twitch argues violates its policies against "hateful conduct," an important but ever-evolving category of violation that companies themselves still struggle to define.

"It's not new for online service providers such as Twitch to sue their own users," said Riana Pfefferkorn, a research scholar at the Stanford Internet Observatory, who herself brought a case against spammers on behalf of Twitter while working in private practice. "What I find novel is that the purpose of the large-scale, bot-enabled violation of the platform's policies was spewing hate at other users, rather than spammy behavior or phishing."

That signals an important recognition by Twitch that hateful behavior on its platform can be as damaging to users' experience as spam or other bad behavior, if not more. And trying new ways to punish that behavior is laudable.

And yet, taking users to court over violations of terms of service has been particularly contentious in other contexts. Academics and civil rights groups, for instance, have fought back against tech platforms' crackdowns on scraping, arguing that data scraping, which lots of companies consider a privacy violation, is actually an important research method. And the Supreme Court recently ruled against the Department of Justice in a case where the DOJ had argued that breaching terms of service constituted a violation of the Computer Fraud and Abuse Act.

Suing over violations of terms of service can seem virtuous when the defendants are so — well, not. But, Mackey said, "I think there could be larger implications to what's happening here."

Twitch's spokesperson noted that "anyone is subject to a lawsuit under a Terms of Service given that it is treated as a contract." And that's true. It's just that when it comes to hate speech issues, those lawsuits by companies are vanishingly rare.

Asking a court to intervene in identifying an anonymous person is also not a trivial thing. The defendants' actions in this case may be detestable, but there are any number of other good reasons for people's anonymity to be protected both online and off. That's why courts have long recognized that the First Amendment establishes a right to anonymity.

But while that protection is strong, it's "not absolute," said Jeff Kosseff, associate professor of cybersecurity law at the United States Naval Academy, who is writing a book on anonymity. "It comes down to looking at the strength of the case," Kosseff said.

Courts have developed a framework to determine when it is reasonable to issue subpoenas to unmask anonymous figures. That framework depends on things like whether the plaintiffs have a legitimate claim and whether they can show a necessity for the information. Other tests courts have used include assessing whether the information can be obtained anywhere else and whether the harm of unmasking outweighs the need to unmask.

"The test is designed to not be absolute, but to be flexible, to allow for certain situations where perhaps someone doesn't have to be publicly identified in a court filing, but you could use their information," Mackey said.

Typically the plaintiffs in these cases are users seeking subpoenas that could compel online service providers to give up other users' identities. "The twist here is that the service provider is also the plaintiff," Pfefferkorn said.

Of course, there's also a case to be made that the speech these users have engaged in, which included "racial slurs and descriptions of violent acts against racial minorities and members of the LGBTQIA+ community," according to the complaint, is harassment, which is not protected by the First Amendment. "I think those are going to be tricky questions," Mackey said. "Do these folks have a First Amendment right to engage in their expression?"

Twitch argues they do not. "In this case, the actors are not using anonymity to conduct speech or expression activity or to participate in our community, but rather to deliberately attack our community, break our terms of service, and obfuscate their methods," the spokesperson said.

According to what information Twitch does have about the users, they appear to be outside of the United States. Foreign defendants are entitled to First Amendment protections in U.S. courts, but as Pfefferkorn put it, "the wheels of justice against overseas defendants move even more slowly than they do when everyone involved is located in the U.S."

The big question is what kind of precedent this case could set for other tech platforms that are also trying to figure out how to stop targeted harassment from their users. Hate raids may be a phenomenon associated with Twitch, but targeted hate speech is a problem all over the internet. Kosseff is dubious the case will open the floodgates to mass litigation, if it even goes forward. For one thing, it's expensive, and for another, it's not a great look for platforms to sue average users for slight offenses. The defendants in this case, Kosseff argues, happen to be particularly odious.

There's also no guarantee these defendants' identities can even be found, if they were really diligent about covering their tracks. "[Twitch] presumably has some sort of IP address, it could be a Tor exit node, in which case, it's useless," Kosseff said. He also pointed out Twitch filed another lawsuit against anonymous trolls in 2019, which it voluntarily dismissed a year later.

Even if the case doesn't go forward, though, the threat of this type of suit is, on its own, a signaling exercise. "Twitch is ready, willing and able to expend significant resources against hate-spewing abusers on its service," Pfefferkorn said. "If having an account terminated for terms of service violations doesn't scare the people who harass and victimize others on Twitch, perhaps the specter of being sued in federal court will."


2- and 3-wheelers dominate oil displacement by EVs

Increasingly widespread EV adoption is starting to displace the use of oil, but there's still a lot of work to do.

More electric mopeds on the road could be an oil demand game-changer.

Photo: Humphrey Muleba/Unsplash

Electric vehicles are starting to make a serious dent in oil use.

Last year, EVs displaced roughly 1.5 million barrels per day, according to a new analysis from BloombergNEF. That is more than double what EVs displaced in 2015. The majority of the displacement is coming from an unlikely source.

Lisa Martine Jenkins



The limits of AI and automation for digital accessibility

AI and automated software that promise to make the web more accessible abound, but people with disabilities and those who regularly test for digital accessibility problems say such tools can only go so far.

The everyday obstacles blocking people with disabilities from a satisfying digital experience are immense.

Image: alexsl/Getty Images

“It’s a lot to listen to a robot all day long,” said Tina Pinedo, communications director at Disability Rights Oregon, a group that works to promote and defend the rights of people with disabilities.

But listening to a machine is exactly what many people with visual impairments do while using screen reading tools to accomplish everyday online tasks such as paying bills or ordering groceries from an ecommerce site.

Kate Kaye



The crypto crash's violence shocked Circle's CEO

Jeremy Allaire remains upbeat about stablecoins despite the UST wipeout, he told Protocol in an interview.

Allaire said what really caught him by surprise was “how fast the death spiral happened and how violent of a value destruction it was.”

Photo: Heidi Gutman/CNBC/NBCU Photo Bank/NBCUniversal via Getty Images

Circle CEO Jeremy Allaire said he saw the UST meltdown coming about six months ago, long before the stablecoin crash rocked the crypto world.

“This was a house of cards,” he told Protocol. “It was very clear that it was unsustainable and that there would be a very high risk of a death spiral.”

Benjamin Pimentel


A DTC baby formula startup is caught in the center of a supply chain crisis

After weeks of “unprecedented growth,” Bobbie co-founder Laura Modi made a hard decision: to not accept any more new customers.

Parents unable to track down formula in stores have been turning to Facebook groups, homemade formula recipes and Bobbie, a 4-year-old subscription baby formula company.

Photo: JIM WATSON/AFP via Getty Images

The ongoing baby formula shortage has taken a toll on parents throughout the U.S. Laura Modi, co-founder of formula startup Bobbie, said she’s been “wearing the hat of a mom way more than that of a CEO” in recent weeks.

“It's scary to be a parent right now, with the uncertainty of knowing you can’t find your formula,” Modi told Protocol.

Nat Rubio-Licht

