Facebook has often been accused of failing to anticipate problems of grave international importance until it's too late. When it comes to election night 2020, it's at least trying not to repeat that mistake.
On Wednesday, the company explained how it plans to approach what is bound to be a historically chaotic election in the United States. A deluge of mail-in ballots is expected to delay the announcement of a winner, and the president of the United States is already indicating he may not accept the results anyway. Facebook's plans to address this include adding notifications at the top of users' feeds and imposing a quiet period on political ads after the election.
"This is a very unique election," Guy Rosen, Facebook's vice president of integrity, said on a call with reporters. "It's really important for us to make sure that we're putting all eyes on this."
To surface reliable information around the election, Facebook will rely heavily on its Voting Information Center, a relatively new section of the platform that includes information about the election results. When polls close, the company said it will run notifications at the top of both Instagram and Facebook, directing people to the center. If any presidential candidate declares victory before "major media outlets" do, however, Facebook plans to label their posts, specifying that votes are still being counted.
If the opposite happens, and media outlets declare a victory, but the outcome is contested by one of the candidates, the company plans to run notifications at the top of Facebook and Instagram with the winning candidate's name and a link to the Voting Information Center. That label will also be applied to posts from presidential candidates.
In addition to these warnings, the company announced it will ban all ads related to social issues, politics and elections after polls close on Nov. 3. Facebook anticipates the ban will last about a week, though that timeline is subject to change. According to Rosen, the goal of that change is to "reduce opportunities for any confusion or abuse."
Previously, Facebook said it would prohibit any new political or issue ads in the week leading up to the election, a tweak some viewed as too minor to matter. The wholesale ban after Election Day will pack more of a punch, but it could also end up privileging candidates who have a bigger organic following, like President Trump does.
Though it won't admit it outright, it's clear that much of Facebook's planning has to do with concerns over President Trump's behavior both before and after the election. During last week's debate, the president called for the white supremacist group the Proud Boys to "stand by," which the group took as a call to arms. Now, Facebook is announcing that in addition to its existing policies prohibiting calls to bring weapons to polling places or to actively interfere in voting, it will also remove posts that call for poll watching using militarized language, or in a way that would intimidate or attempt to display power over election officials and voters.
"Recently we've seen speech that has been more implicit in a number of different areas with our policies," Monika Bickert, Facebook's head of global policy management, said in response to a question about President Trump's comments. "The civil rights auditors and the civil rights community members that we talk to on a regular basis have really helped us track some of these trends. It's a very adversarial space, of course, and we anticipate that, as we have updated these lines, those who are seeking to get around them will try to use new language. So that's something we try to stay on top of."
Bickert said this policy will not be retroactive, which means it won't be applied to a recent post by the Trump campaign, in which Donald Trump Jr. calls for an "ARMY FOR TRUMP's election security operation!" If the campaign were to post a similar message going forward, however, Bickert said it would be removed.
Facebook's awareness of the role it plays in elections has evolved dramatically since the 2016 race, when the company's primary goal was to sell as many political ads as it could. Since then, it's curbed political advertising somewhat by forcing would-be advertisers to go through a verification process and created a library of ads that the company says more than 2 million people visit every month. Now, rather than touting the amount of money it's made from political ads, Facebook is taking a victory lap for how many ads it's blocked — 2.2 million — for failing to go through the verification process.
The company now meets often with its counterparts in tech, voting rights experts and government officials to game out potential threats to elections. It monitors viral posts that risk violating its policies, even if they haven't been reported by a user or flagged by its automated systems. And it regularly finds and removes the kind of coordinated campaigns that Russian trolls used to interfere with the 2016 election, an issue that was scarcely on the company's radar at the time. This, the company's head of cybersecurity policy Nathaniel Gleicher said on the call, is "one of the biggest differences between the 2016 election and today."
"In 2016, Russian actors' deception campaigns were exposed in the months after the election," Gleicher said. "Today they're getting caught and taken down months, and in some cases, more than a year in advance." So far, Facebook has removed more than 100 such campaigns.
Facebook has also undertaken a voter registration drive that the company says has helped some 2.5 million people register to vote so far this year.
But in so many other ways, critics say the company is still doing too little, too late. The labels it applies to posts questioning the legitimacy of voting and mail-in ballots have been panned as vague and, at times, confusing. (Recently, the company added a new label to one of President Trump's posts about mail-in voting, more directly contradicting his claims.) It's refused to fact-check politicians' false statements, except on a narrow set of issues. And it's slow-walked its ban on dangerous communities like QAnon, a conspiracy theory group that has gained traction on social media and is now working its way into mainstream politics. In August, the company announced it would ban violent content associated with the group. But it wasn't until just this week that Facebook announced it would ban QAnon content altogether.
Even as it addresses issues it's seen pop up in the past, it's facing brand new challenges that may prove just as complex to solve. Gleicher, for one, warned that while the company has gotten better at detecting coordinated inauthentic activity, those same bad actors are now creating their own media outlets, hiring contributors and attempting to feed their stories to "unwitting news organizations," a technique he referred to as "perception hacking."
"As it gets harder and harder to run large scale social media operation campaigns because they're getting caught, they're trying instead to play on our fears," Gleicher said. "Why run a large campaign that will get you caught when you can try and trick people into thinking such a campaign is happening?"
What seems clear is that no matter how hard Facebook tries to prepare and correct the record around the election, this year, the company will be up against a sitting president with a giant microphone and a propensity for spouting misinformation. President Trump and his fellow conservatives have already spent years convincing their supporters that Big Tech is biased against them. Slapping a warning label on his potential declaration of victory hardly seems likely to convince them otherwise.