John Ackerly



We can't cure COVID-19 by giving up our right to privacy

"Having served as lead White House technology adviser on Sept. 11, 2001, I've been down this path, and I know its peril far too well."

John Ackerly at his Virtru office

Former White House technology adviser John Ackerly says that "we have an opportunity to learn from the past and make smart decisions that can both accelerate 'reopening' society and also enhance our civil liberties and the public trust."

Photo: Benjamin Lowy/Getty Images

As fast as we've gone from patient zero to pandemic in this country, so have we sped from the mere suggestion of "sentinel surveillance" to our leaders demanding what some call a Patriot Act for health care. Having served as lead White House technology adviser on Sept. 11, 2001, I've been down this path, and I know its peril far too well. But before I go there, let's first take stock of where we are.

While there are some early indications that social distancing is working, the American acceptance of isolation — and the economic uncertainty attached to it — is wearing thin. Any positive signs will likely ignite anxious anticipation of going back to life as we knew it. But when we do open the economy back up, we must do so without trading in our values.

Plans such as the "National Coronavirus Response: A Roadmap to Reopening" lay out comprehensive phased approaches to containing the pandemic and ultimately opening society back up. To move beyond this necessary first isolation phase, they also argue for the creation of "new national surveillance infrastructure" and the widespread use of individual contact tracing.

Indeed, a system that tracks individual movements and health conditions in real time would accelerate the removal of physical distancing requirements. South Korea used this kind of tracing to successfully slow the spread of coronavirus there, and China's more draconian version also appears to have worked.

However, such surveillance systems clearly conflict with American values and the right to privacy. In the U.S., numerous "smart" tracking efforts have popped up to inform the public, such as Unacast's Social Distancing Scoreboard or Kinsa's fever heat map, with new apps released almost daily. The information is valuable, but much of it is powered by our private data, which lacks critical protections and which we have no ability to control.

The question is: How do we serve the critical need while preserving the inalienable right?

In my time at the White House, I bore witness to a number of the early decisions that had significant, unintended negative consequences. For example, new FISA court processes allowed domestic wiretapping, and Patriot Act provisions enabled mass data capture of citizens who were not under any criminal investigation. The lessons many of us learned were that the benefits in the fight against terrorism were marginal, while the privacy and data security consequences were large and lasting. Further, we learned that because there was little transparency in the deliberations, people lost trust in public institutions.

Today, we have an opportunity to learn from the past and make smart decisions that can both accelerate "reopening" society and also enhance our civil liberties and the public trust. Sentinel surveillance systems and contact tracing will be mission-critical for a sustainable solution to COVID-19, but they must not evoke fears of "1984" or "The Minority Report," much less "Citizenfour."

Data privacy must be a foundational and transparent component of our response to COVID-19. We must lead the world in innovative strategies that enhance both safety and trust. The solution demands giving the American public actual technical control over their data so that people don't have to rely on the promises of technology companies or the government for how they will use (and reuse) personal data.

The good news is that recent technology advances make near-instant contact tracing possible in a way that protects privacy and enhances actionable intelligence. We have the capability today to create an open, interoperable and encrypted data-sharing and surveillance platform that empowers each of us to be a sentinel — to share our most sensitive data for both individual and societal benefit, while knowing that we can ultimately control the scope and timespan of usage of our own data. I know this because I run a company that makes such a platform; our tools are open-source, and they, like other open-source offerings, can be used for free by governments and any of the emerging contact-tracing efforts (e.g., the World Health Organization, MIT and scores of regional initiatives).

In February, Secretary of Health and Human Services Alex Azar called for "a completely different kind of health care system: where you, as the patient, are at the center and in control, with seamless access to the data you need to make decisions." This vision could not be more urgent, and I call on the White House immediately to convene the CDC, CMS and our country's leading epidemiologists, technologists and privacy experts to develop an open, secure data-sharing and surveillance framework that actually puts the public at the center of control.

In so doing, we can regain the moral authority lost after 9/11 and help lead the world out of this crisis and prevent the next. Privacy is a human right, grounded in trust, and there is no sustainable solution to COVID-19 without it. Today, only 35% of Americans are comfortable with sharing data with government to fight the virus. This makes sense, particularly given the continuing, daily examples of corporate and government data leaks, misuse and abuse.

That's why we must move quickly and aggressively to adopt open and interoperable surveillance platforms that verifiably demonstrate that data is used only for its intended purpose and that give individuals the power to approve or revoke access to their sensitive information at any time. With smart technology and policy decisions now, we will have the confidence to freely share our most sensitive data for the common good, reopen our economy more quickly, and get back to the life and freedoms that make us Americans.


Beeper built the universal messaging app the world needed

It's an app for all your social apps. And part of an entirely new way to think about chat.

Beeper is an app for all your messaging apps, including the hard-to-access ones.

Image: Beeper

Eric Migicovsky likes to tinker. And the former CEO of Pebble — he's now a partner at Y Combinator — knows a thing or two about messaging. "You remember on the Pebble," he asked me, "how we had this microphone, and on Android you could reply to all kinds of messages?" Migicovsky liked that feature, and he especially liked that it didn't care which app you used. Android-using Pebble wearers could speak their replies to texts, Messenger chats, almost any notification that popped up.

That kind of universal, non-siloed approach to messaging appealed to Migicovsky, and it didn't really exist anywhere else. "Remember Trillian from back in the day?" he asked, somewhat wistfully. "Or Adium?" They were the gold standard of universal messaging apps; users could log in to their AIM, MSN, GChat and Yahoo accounts, and chat with everyone in one place.

David Pierce

David Pierce (@pierce) is Protocol's editor at large. Prior to joining Protocol, he was a columnist at The Wall Street Journal, a senior writer with Wired, and deputy editor at The Verge. He owns all the phones.


Amazon’s head of Alexa Trust on how Big Tech should talk about data

Anne Toth, Amazon's director of Alexa Trust, explains what it takes to get people to feel comfortable using your product — and why that is work worth doing.

Anne Toth, Amazon's director of Alexa Trust, has been working on tech privacy for decades.

Photo: Amazon

Anne Toth has had a long career in the tech industry, thinking about privacy and security at companies like Yahoo, Google and Slack, working with the World Economic Forum and advising companies around Silicon Valley.

Last August she took on a new job as the director of Alexa Trust, leading a big team tackling a big question: How do you make people feel good using a product like Alexa, which is designed to be deeply ingrained in their lives? "Alexa in your home is probably the closest sort of consumer experience or manifestation of AI in your life," she said. That comes with data questions, privacy questions, ethical questions and lots more.


Why Biden needs a National Technology Council

The U.S. government needs a more tightly coordinated approach to technology, argues Jonathan Spalter.

A coordinated effort to approach tech could help the White House navigate the future more easily.

Photo: Gage Skidmore/Flickr

The White House has a National Security Council and a National Economic Council. President-elect Joe Biden should move quickly to establish a National Technology Council.

Consumers are looking to the government to set a coherent and consistent 21st century digital policy that works for them. Millions of Americans still await public investments that will help connect their remote communities to broadband, while millions more — including many families with school-age children — still struggle to afford access.

Jonathan Spalter
Jonathan Spalter is the president and CEO of USTelecom – The Broadband Association.

We need Section 230 now more than ever

For those who want to see less of the kind of content that led to the storming of the Capitol, Section 230 may be unsatisfying, but it's the most the Constitution will permit.

Even if certain forms of awful speech could be made unlawful, requiring tech sites to clean it up would be even more constitutionally difficult.

Photo: Angel Xavier Viera-Vargas

Many conservatives are outraged that Twitter has banned President Trump, calling it "censorship" and solemnly invoking the First Amendment. In fact, the First Amendment gives Twitter an absolute right to ban Trump — just as it protects Simon & Schuster's right not to publish Sen. Josh Hawley's planned book, "The Tyranny of Big Tech."

The law here is clear. In 1974, the Supreme Court said newspapers can't be forced to carry specific content in the name of "fairness," despite the alleged consolidation of "the power to inform the American people and shape public opinion." The Court had upheld such Fairness Doctrine mandates for broadcasters in 1969 only because the government licenses use of publicly owned airwaves. But since 1997, the Court has held that digital media enjoys the same complete protection of the First Amendment as newspapers. "And whatever the challenges of applying the Constitution to ever-advancing technology," wrote Justice Antonin Scalia in 2011, "'the basic principles of freedom of speech and the press, like the First Amendment's command, do not vary' when a new and different medium for communication appears."

Berin Szóka

Berin Szóka (@BerinSzoka) is president of TechFreedom (@TechFreedom), a technology policy think tank in Washington, DC.


In 2020, COVID-19 derailed the privacy debate

From biometric monitoring to unregulated contact tracing, the crisis opened up new privacy vulnerabilities that regulators did little to address.

Albert Fox Cahn, executive director of the Surveillance Technology Oversight Project, says the COVID-19 pandemic has become a "cash grab" for surveillance tech companies.

Photo: Lianhao Qu/Unsplash

As the coronavirus began its inexorable spread across the United States last spring, Adam Schwartz, senior staff attorney at the Electronic Frontier Foundation, worried the virus would bring with it another scourge: mass surveillance.

"A lot of really bad ideas were being advanced here in the U.S. and a lot of really bad ideas were being actually implemented in foreign countries," Schwartz said.

Issie Lapowsky
Issie Lapowsky (@issielapowsky) is a senior reporter at Protocol, covering the intersection of technology, politics, and national affairs. Previously, she was a senior writer at Wired, where she covered the 2016 election and the Facebook beat in its aftermath. Prior to that, Issie worked as a staff writer for Inc. magazine, writing about small business and entrepreneurship. She has also worked as an on-air contributor for CBS News and taught a graduate-level course at New York University's Center for Publishing on how tech giants have affected publishing.