Politics

How the Democrats' police reform bill would regulate facial recognition

The legislation marks the most meaningful movement yet at the federal level to clamp down on facial recognition, but it doesn't get close to the all-out ban or moratorium that many activists want.


Facial recognition systems, like this one shown at a 2017 conference in Washington, D.C., have become increasingly popular and controversial for police departments.

Photo: Saul Loeb/AFP via Getty Images

Responding to the frustration and fury unleashed by a Minneapolis police officer's killing of George Floyd, congressional Democrats on Monday released an ambitious law enforcement reform bill that would, among other major changes, impose restrictions on the use of facial recognition technology by police officers.

For the most part, the Justice in Policing Act of 2020 repackages legislation that has been introduced before, and its most far-reaching proposals may hit insurmountable roadblocks in the Senate. But the bill's release reflects the urgency of the outrage over racial injustice, and it marks the most meaningful movement yet at the federal level to clamp down on facial recognition, a technology prized by many companies but criticized for its potential to invade privacy and exacerbate racial disparities in enforcement.

The proposals are limited in scope, applying only to facial recognition software in police body cameras, and do not come close to the all-out ban or moratorium that many activists have called for at the local, state and federal level over the past year.

The bill could put tech giants in a bind. Companies including Microsoft and Amazon, as well as powerful tech trade groups, have long railed against legislation that would keep facial recognition technology away from law enforcement. But those companies in recent days have thrown their support behind the Black Lives Matter movement and could risk accusations of hypocrisy if they lobby against what would amount to the biggest federal overhaul of U.S. law enforcement in decades.

IBM on Monday set itself apart by announcing it is getting out of the facial recognition business entirely, saying it does not condone technology used for "mass surveillance, racial profiling, violations of basic human rights and freedoms."

Facial recognition regulation is one small part of the 134-page bill released Monday, which aims to crack down on police brutality by banning chokeholds and no-knock warrants, creating a national registry for police misconduct, mandating racial-bias training, and restricting the transfer of military-grade equipment to local police departments, among other proposals.

NetChoice, a tech trade group that has called facial recognition a "boon" for law enforcement, declined to comment on the facial recognition provisions. "We're not engaging on that bill at all," a spokesperson told Protocol in an email.

Tech companies may not feel compelled to lobby against the bill at all. Rashida Richardson, director of policy research at the ethics-focused AI Now Institute at New York University, called the proposed provisions "toothless," saying the legislation would create significant loopholes for police to use facial recognition in case of an "imminent threat" or a "serious crime," and does not bar the government from using facial recognition in many instances. "I don't see them putting resources into lobbying against this," Richardson said of tech companies.

The package introduced Monday includes two bills — the Federal Police Camera and Accountability Act and the Police CAMERA Act — that could pare back the use of facial recognition by law enforcement in some cases.

The Federal Police Camera and Accountability Act, originally introduced last year by Rep. Eleanor Holmes Norton, a Democrat representing Washington, D.C., would require all police officers to wear body cameras while they conduct searches and make arrests, an effort to improve oversight and accountability. But the legislation would prohibit officers from equipping those body cameras with facial recognition technology. "No camera or recording device authorized or required to be used under this part may employ facial recognition technology," the legislation reads.

Sections of the bill aim to ensure that police officers do not exploit body cameras to surveil citizens exercising First Amendment rights. "Body cameras shall not be used to gather intelligence information based on First Amendment protected speech, associations or religion, or to record activity that is unrelated to a response to a call for service or a law enforcement or investigative encounter between a law enforcement officer and a member of the public," one provision reads.

Many police departments already maintain policies against installing facial recognition software in body cameras. Axon, the country's leading maker of wearable cameras for police, has said it will not yet install facial recognition technology in its cameras because of the ethical concerns. However, police departments are increasingly deploying facial recognition in their routine work, and if passed, the legislation would force some companies and law enforcement agencies to change course.

The bill would further prevent officers from subjecting video footage to facial recognition or other forms of automated analysis without a warrant.

The Police CAMERA Act would issue government grants for police departments to buy or lease body-worn cameras, as long as they issue guidelines around protecting the "constitutional rights" of individuals who are scanned by facial recognition software. The provision would allow the use of facial recognition in instances when the officer has a warrant or there are "imminent threats or serious crimes."

The legislation reflects a balance between providing more oversight of police activity through technology and protecting citizens' privacy. It underlines the heated public debate around facial recognition, which has increasingly animated state legislatures across the country over the past year.

Cities including San Francisco, Oakland, Berkeley and Somerville, Massachusetts, have passed local ordinances banning facial recognition outright. California lawmakers last week blocked controversial legislation that would have provided some safeguards around facial recognition. In Washington state, Microsoft publicly backed a lighter-touch facial recognition bill that was signed into law in March amid protests from activists including the ACLU, who wanted to put a temporary moratorium on government uses of facial recognition.

The technology, which scans people's faces and looks for matches in databases, has swept across the country in recent years with little government regulation or oversight. There is no federal law establishing rules for face-scanning, despite efforts by Congress to develop one last year. Meanwhile, according to market research firm Grand View Research, the government "facial biometrics" market is expected to surge from $136.9 million in 2018 to $375 million by 2025.

Critics of the government's use of the technology, including privacy and civil rights activists, warn that it can give authorities unprecedented access to people's movements, particularly in heavily policed minority communities. Studies have shown that some facial recognition technology is more likely to misidentify women and people of color. A government study in December 2019 found that Asian and African American people were up to 100 times more likely to be misidentified than white men.

Concerns around facial recognition technology and law enforcement were heightened early this year by revelations about Clearview AI, which built a database of billions of photos by scraping social media and sold its product to many police departments across the country.

Richardson, of the AI Now Institute, said the legislation does not acknowledge a longtime, central drawback of facial recognition technology: It often doesn't work. A report by the National Institute of Standards and Technology found that, despite improvements in accuracy among the leading facial recognition vendors, most algorithms are "not close" to achieving perfect identification.

"When so much of the pushback on the use of facial recognition is because of the fact that it doesn't work, and it disproportionately doesn't work for communities that are already marginalized in many ways by the criminal justice system, that seems like a glaring omission," she said.

Last year, the House Oversight Committee held a string of hearings on facial recognition technology, which featured lawmakers from both parties pledging that they would imminently introduce legislation to regulate the industry. But the effort lost steam when the committee's chairman and a major champion of the issue, Democratic Rep. Elijah Cummings, died in October, and it's since fallen far below other legislative priorities, including coronavirus relief.

The Democrats' police reform bill likely won't make it to the Senate as written, and it's unclear if the facial recognition provisions have any legislative pathway. So far, it seems most likely that cities and states will continue to lead the way in regulating the sensitive technology.

Emily Birnbaum

Emily Birnbaum ( @birnbaum_e) is a tech policy reporter with Protocol. Her coverage focuses on the U.S. government's attempts to regulate one of the most powerful industries in the world, with a focus on antitrust, privacy and politics. Previously, she worked as a tech policy reporter with The Hill after spending several months as a breaking news reporter. She is a Bethesda, Maryland native and proud Kenyon College alumna.
