U.S. election infrastructure is exceedingly secure, and voter fraud here is so rare it’s comparable to your annual chances of getting struck by lightning. Despite this, former President Donald Trump and a long list of allies in the Republican Party have spent the last two years questioning the overall integrity of the U.S. election system. Many of those allies are now candidates themselves, and their coordinated attack on the country’s status as a democracy is not a relic of 2020. Some have already started repeating these “Big Lie” charges ahead of next week’s midterms. And the social platforms that help them spread their message have prepared few measures to stop it.
In short, many of the efforts from companies — including Twitter, Meta, and YouTube — to protect 2022’s elections look a lot like the measures the platforms took in 2020.
- The platforms have made (some) genuine progress against the threats of the 2016 era. Many of their policies, for instance, focus on coordinated inauthentic behavior, like foreign botnets and Chinese or Russian interference campaigns.
- Social networks have also put fairly expansive rules in place against lies that could keep people from voting, such as false claims about when polls close or who is eligible to cast a ballot.
- The services have also tried to plug some remaining holes with resource pages offering accurate information, limits on ads about political topics, and broader invocation of policies that forbid certain types of harmful misinformation.
But many lies about the security of the system as a whole and the reliability of its overall results still don’t fall under these policies, and such content often slips through moderation nets because it’s not clear which rules apply.
Twitter could be making the problem worse, especially after Elon Musk gutted half the company’s staff in the last 24 hours.
- In terms of policy, Twitter actually goes a little further than some of its peers, limiting “misleading claims intended to undermine public confidence in an election.”
- Musk also punted for at least a few weeks on bringing Trump back to the platform, pushing off any decision until after the election.
- The company, though, had its hands full enforcing its rules even before the cuts.
- Its coming changes to verification will also allow bad actors to pay just a few bucks to pose as reliable sources of information and add to the flood of lies, particularly during what’s expected to be a slow count and certification of results.
Meta seems to have mostly recycled its 2020 playbook, despite reporting suggesting that the company’s three platforms played an outsized role in supercharging the original Big Lie about Biden’s election in the lead-up to Jan. 6.
- In laying out its plans for the midterms, Meta said it might append labels to “content discussing the integrity of the election.”
- The parent company of Facebook, Instagram, and WhatsApp said users disliked the volume of labels it applied last time around, though, suggesting that any labels this cycle will go only on posts that reach a certain level of virality.
Other platforms’ approaches don’t necessarily inspire confidence, either.
- YouTube, in addition to the usual bans on incitement and lies about the voting process, prohibits “false claims that widespread fraud, errors, or glitches occurred in certain past elections to determine heads of government.” The company said it has also taken down such claims about 2020.
- With its firmness and specificity, that sounds like a policy tailor-made for confronting the new Big Lie, except it doesn’t apply to claims about 2022, which is both a current election and one in which the president isn’t on the ballot.
- TikTok actually says it will go so far as to remove misinformation that undermines “public trust in civic institutions and processes such as … elections,” although its record is thin and staff turnover may threaten its efforts even as it becomes a major news source.
- Reddit’s policies seem to focus on detecting attempts to intimidate voters or suppress turnout (like lies about when polls close), as well as the site’s usual ban on calls to violence.
Many Republicans will win election contests fairly next week. It’s quite possible there’ll be enough of them to take control of one or both chambers of Congress. There’s real risk, though, that some will seek to overturn legitimate losses — or even that a few Democrats will sense an opening for bad behavior — by fostering doubts about whether the U.S. can still pull off real elections. The social networks seem mostly to be hoping they have the tools to tackle that. It’s not clear they do.
A version of this story appeared in Protocol’s Policy newsletter.