A selection of apps as displayed on an iPhone screen against a solid yellow background.
Image: Rahul Chakraborty

Platforms’ new abortion dilemma in Texas

Protocol Policy

Hello, and welcome to Protocol Policy! Today, we’re talking about how two of Texas’ most controversial laws — its social media “censorship” law and its abortion ban — conflict. Plus, inside Elon Musk’s meetings with Twitter execs, and Mike Bloomberg spends big to stop big coal.

Double trouble in Texas

Texas’ social media “censorship” law, HB 20, is disastrous for tech companies for innumerable reasons. It’s why the tech groups fighting the legal battle against it — NetChoice and CCIA — scrambled to ask the Supreme Court to stop the law in its tracks just days after the Fifth Circuit allowed it to go into effect. Texas has to file its reply today, after which it’ll be up to the Supreme Court to decide what happens next.

But if the justices don’t side with tech companies, HB 20 could wind up running headlong into Texas’ other most controversial recent law: its de facto abortion ban, SB 8.

A little refresher on SB 8: Even before the Supreme Court’s draft decision that would overturn Roe v. Wade leaked earlier this month, Texas’ SB 8 effectively prohibited abortions in the state by allowing anyone to sue anyone else for providing or “aiding and abetting” an abortion after six weeks.

  • The broad definition of “aiding and abetting” means even something as far removed as offering information about abortion access or giving someone a ride to an abortion clinic risks violating the law.
  • Lyft and Uber take that risk so seriously that they’ve promised to cover drivers’ legal fees if they get sued.

Now, about HB 20. That law prohibits social media platforms from moderating content on the basis of “viewpoint.” But it also requires them to evaluate potentially illegal content within 48 hours of receiving notice of its existence.

  • That creates a nearly unanswerable question for platforms: Does a post about abortion access in Texas merely constitute a viewpoint that must, under HB 20, remain on the platform? Or is it, under SB 8, considered illegal content that platforms would be wise to remove?
  • “The two laws are somewhat at cross-purposes, no doubt unintentionally,” said Mary Anne Franks, a professor at the University of Miami School of Law and president of the Cyber Civil Rights Initiative. “Where SB 8 seeks to reduce speech the government of Texas doesn’t like, HB 20 seeks to increase speech that the government does like.”
  • “The services must make laser-precise determinations of whether content is legal or illegal, with potential liability for making mistakes in either direction, even if in good faith,” said Eric Goldman, a professor at Santa Clara University School of Law.

HB 20 doesn’t require social media companies to actually remove illegal content. It only requires that they set up a system to evaluate its legality. Section 230 should protect platforms from liability for anything they fail to remove. But legal experts argue that famously risk-averse major platforms may think it wiser to get rid of anything that might be perceived as illegal anyway.

  • That could manifest in a bunch of alarming ways, from platforms removing content when someone flags it, to platforms automatically filtering posts related to abortion access, to platforms actually reporting users to the police, said Daphne Keller, director of the Program on Platform Regulation at Stanford's Cyber Policy Center.
  • “That sounds full Gilead,” Keller said, “but is what happens now for [child sexual abuse material in the U.S.] and terrorism and a slough of proposed other categories outside the U.S.”

All of this could lead to what David Greene, senior staff attorney at the Electronic Frontier Foundation, called “a great information asymmetry, in which companies are required to publish anti-abortion speech but forbidden from, or at least substantially chilled from, publishing abortion rights information.”

As bad as this outcome could be for platforms, it’s something like a worst-case scenario for abortion rights advocates — and anyone in Texas who might someday seek an abortion. As if their future wasn’t bleak enough.

— Issie Lapowsky (email | twitter)

In Washington

DHS is hitting pause on its Disinformation Governance Board, according to The Washington Post. The board’s future is now in the hands of the Homeland Security Advisory Council. The board’s director, Nina Jankowicz — who has been on the receiving end of a deluge of harassment since the board was announced — has resigned.

The State Department is launching a Conflict Observatory to document the war in Ukraine. The new data-gathering body will use open-source tools and satellite imagery to document what’s happening on the ground and collect evidence of war crimes. The group plans to publicly share what it collects on an online platform “to help refute Russia’s disinformation efforts and shine a light on abuses.”

CISA has installed tools that give it visibility into hacking threats across 15 U.S. agencies. It’s now installing those tools at another 11 agencies, expanding CISA’s visibility into vulnerabilities across the government. These upgrades come in part as a response to the SolarWinds hack.

Elon Musk may be spared a fresh look by the SEC, experts predict. Even though Musk won’t stop (literally) talking shit about Twitter, his concerns about bots may not constitute “materially false and misleading statements.” “I just don’t think there’s any evidence of that,” former SEC enforcement lawyer Jacob Frenkel told The Washington Post.

On Protocol

Mike Bloomberg is spending nearly $242 million to help put an end to coal use in 10 countries. This investment follows Bloomberg’s ongoing efforts at home, including a $500 million investment aimed at closing every coal-fired power plant in the U.S.

Companies’ plans to cover abortion travel for workers if Roe is overturned raise all kinds of tricky legal and cultural questions. Not least of which: Can companies really keep that information private? “Once you criminalize the activity, it opens up so much more leeway for law enforcement and prosecutors to launch investigations,” Alejandra Caraballo of Harvard Law School’s Cyberlaw Clinic told Protocol.

A MESSAGE FROM CHAMBER OF PROGRESS

New polling shows that American voters do not see regulating tech companies as a priority. Their top concern is strengthening the national economy (38%), followed by controlling inflation (37%). By contrast, only 5% of respondents prioritized regulating tech companies.

Learn more

Around the world

Russian hackers are behind a devastating cyberattack in Costa Rica. The hacker group Conti took credit for the April attack, which recently prompted the country to declare a state of national emergency.

In the media, culture and metaverse

Marc Andreessen says we’re living in a “vetocracy” — that is, “a set of systems in which a lot of people have the ability to say no.” During a tech conference, Andreessen said this system of governance has cost the U.S. its ability to execute on audacious ideas.

Disney+ won’t accept political ads when it starts rolling out commercials. That’s bound to make it tough to reach that coveted demo of voters who are actually just three 6-year-olds in a trench coat.

Nick Clegg wrote a long (long) essay making the case for the metaverse. It covers a lot of ground, but here’s what Clegg has to say about how the rules of the metaverse ought to be written: “Collectively, we can think of this process as developing a system of governance for the metaverse. And it mustn’t be shaped by tech companies like Meta on their own. It needs to be developed openly with a spirit of cooperation between the private sector, lawmakers, civil society, academia, and the people who will use these technologies.”

In the C-suite

The great Twitter exodus continues. The company lost three more executives this week, including Ilya Brown, vice president of Product Management; Katrina Lane, vice president of Twitter Service; and Max Schmeiser, head of Data Science.

Musk met with top Twitter executives for three days before he announced his plan to buy Twitter outright. Parag Agrawal, Bret Taylor and Jack Dorsey were all part of those discussions, according to an SEC filing. And for anyone still wondering, it also lays to rest the rumor that Musk refused a background check when he was angling to join the board (not to say we told you so).

In data

92%: That’s how much insurance premiums for cyber coverage grew in 2021, compared to the year before. Insurers are also requiring companies applying for coverage to prove that they’re at least trying to practice good cyber hygiene.

A MESSAGE FROM CHAMBER OF PROGRESS

New polling shows voters' top tech policy concerns are cybersecurity and data privacy. Only 7% of respondents prioritized antitrust action, and 1% prioritized changes to app store rules. In fact, a majority (58%) believes the pending tech antitrust legislation would do consumers more harm than good.

Learn more

If it talks like a bot and tweets like a bot …

Elon Musk may not like bots, but according to Botometer, he practically is one. The online tool, which analyzes bot behavior, gives Elon’s account a high likelihood of being a bot. As far as we know, Musk is not actually a bot, but come to think of it, it would make a lot of things make a lot more sense.

Thanks for reading — see you Friday!