Issie Lapowsky

Politics

Facebook’s new research project will show how it influenced the 2020 election — after it's over

To study its impact on the 2020 election, Facebook will ask some people to stop using Facebook


Facebook CEO Mark Zuckerberg once argued it was a "pretty crazy idea" to think Facebook could influence an election.

Photo: Chip Somodevilla/Getty Images

Facebook is teaming up with academics across the country to determine once and for all whether Facebook is in fact influencing the 2020 election. The only catch: They won't know the answer until well after it's over.

The new research project, which Facebook announced Monday, will study how the 2020 election is playing out on the world's largest social network, and how the platform affects things like political polarization, voter participation, trust in democracy and the spread of misinformation. A 17-person research team, which includes leading academics in the fields of media and politics, will work with some two dozen Facebook staffers to design the experiments.

Once users opt in to the study, the research team will deidentify their data, split them into groups and begin experimenting: tinkering with their News Feeds, switching up their ad experiences and, in some cases, even asking them to stop using Facebook temporarily, all while surveying participants to see how their experiences and viewpoints evolve and stack up against control groups. The findings, over which Facebook will have no veto power, will be published free to the public beginning next summer.

In some ways, the undertaking demonstrates how far Facebook has come since 2016, when it eagerly courted political clients with the promise of influence, then, following President Trump's victory, just as eagerly denied that it had any influence at all. Mark Zuckerberg himself famously dismissed the notion that Facebook swayed the election as a "pretty crazy idea." But the fact that this sort of research is only now getting underway also demonstrates just how little we actually know four years later. So much of what regulators and the public have come to believe about how social media affects elections boils down to anecdotes and assumptions, not data. This research could change that.

"We're in these really uncharted waters where you have two or three enormous companies that control the vast majority of the data we need to advance science," said Joshua Tucker, a professor of politics at New York University, who is leading the outside research team along with Talia Stroud, a professor of communications at the University of Texas at Austin. "This has been just an amazing opportunity to work with a platform, and so we have to try that."

Facebook has a spotty record when it comes to social science research, particularly as it pertains to elections. In 2010, the company performed a randomized controlled trial on 61 million users to see whether serving different iterations of its "I Voted" sticker to different groups of users would impact voter turnout. (It did, sparking outrage.) And of course there was the Cambridge Analytica scandal, in which a Cambridge University professor used a Facebook app to scrape data from tens of millions of Facebook users and then sold it to the now-defunct and disgraced data analytics firm that powered the Trump campaign.

This time around, Facebook is being significantly more cautious. Facebook employees will be the only ones with access to the raw data, for one thing, and users will have to explicitly opt in to participate. That said, Stroud, Tucker and the 15 other researchers they've selected will have maximum input on the research questions and the experiment design. They will preregister their study plans, so anyone interested can see exactly what the researchers are setting out to find before they even begin and can compare those goals to whatever they ultimately deliver. Facebook won't have any say over what the researchers do or don't publish, and the researchers won't be paid by Facebook, either.

"The academic team really thought about what the key questions might be, and then in collaboration with Facebook, we've been thinking through what the designs would be, with us bringing the expertise as far as methodology and the statistical techniques involved, and the Facebook team really bring a lot of expertise about what the platform is able to do," Stroud said.

The researchers plan to recruit about 200,000 to 400,000 people across the U.S. to participate. Those who opt in might see fewer political ads or fewer news stories about certain topics in their News Feeds. They might be asked to download apps that monitor their online behavior, though Facebook said researchers won't have access to private messages. They might even be asked to stop using Facebook altogether. Others, assigned to a control group, won't see any changes at all.

Both the researchers and the Facebook executives are confident that the experiment group is small enough that these tweaks won't have a meaningful effect on the election's outcome. Of course, that might not stop political operatives from claiming that it will; some, like President Trump's digital director, Gary Coby, have already speculated without evidence that Facebook will intentionally register more Democrats than Republicans as part of its voter registration push.

But the potential insights may be well worth the reputational risks. Researchers and regulators have been pushing Facebook and other tech companies to be more transparent for years. In 2018, Facebook helped launch an organization called Social Science One, through which it has been working to open up data to third-party researchers. In February, as part of that work, Facebook released a dataset of 38 million URLs shared on the platform between 2017 and 2019. That tranche of data was the result of a painstaking process that took years to come to fruition, as Facebook and the researchers sparred over the best way to share data without sacrificing user privacy.

With this design, the researchers are taking a less intensive route to ensure the research could actually get underway before November. "It's not the ideal system, but it is going to be unprecedented in the degree of research we will be able to accomplish," said Nate Persily, a professor at Stanford University School of Law and one of the co-chairs of Social Science One. "If we didn't do something related to the 2020 election, then that would have been a problem. This is the thing we can do."

Persily, Tucker and Stroud, who co-chair committees within Social Science One, hope that if this project is successful, it can serve as a model for other social media giants like YouTube and Twitter to open up their black boxes, too.

Of course, whatever the researchers do find about Facebook's impact on the election, it won't come in time for Facebook to act on it before November. But Facebook's head of research and transparency, Chaya Nayak, said such a huge undertaking wouldn't have been possible two years ago, because the company needed time to figure out how to make it work. "There's going to be other elections, and it's really important for us, as a company, to understand the impact of our platform on both the election coming up as well as elections moving forward," she said. "It's never too late."
