Politics

How the Democrats' police reform bill would regulate facial recognition

The legislation marks the most meaningful movement yet at the federal level to clamp down on facial recognition, but it doesn't get close to the all-out ban or moratorium that many activists want.


Facial recognition systems, like this one shown at a 2017 conference in Washington, D.C., have become increasingly popular and controversial for police departments.

Photo: Saul Loeb/AFP via Getty Images

Responding to the frustration and fury unleashed by a Minneapolis police officer's killing of George Floyd, congressional Democrats on Monday released an ambitious law enforcement reform bill that would, among other major changes, impose restrictions on the use of facial recognition technology by police officers.

For the most part, the Justice in Policing Act of 2020 repackages legislation that has been introduced before, and its most far-reaching proposals may hit insurmountable roadblocks in the Senate. But the bill's release reflects the urgency of the outrage over racial injustice, and it marks the most meaningful movement yet at the federal level to clamp down on facial recognition, a technology prized by many companies but criticized for its potential to invade privacy and exacerbate racial disparities in enforcement.

The proposals are limited in scope, applying only to facial recognition software in police body cameras, and they fall far short of the all-out ban or moratorium that many activists have called for at the local, state and federal levels over the past year.

The bill could put tech giants in a bind. Companies including Microsoft and Amazon, as well as powerful tech trade groups, have long railed against legislation that would keep facial recognition technology away from law enforcement. But those companies in recent days have thrown their support behind the Black Lives Matter movement and could risk accusations of hypocrisy if they lobby against what would amount to the biggest federal overhaul of U.S. law enforcement in decades.

IBM on Monday set itself apart by announcing it is getting out of the facial recognition business entirely, saying it does not condone technology used for "mass surveillance, racial profiling, violations of basic human rights and freedoms."

Facial recognition regulation is one small part of the 134-page bill released Monday, which aims to crack down on police brutality by banning chokeholds and no-knock warrants, creating a national registry for police misconduct, mandating racial-bias training, and restricting the transfer of military-grade equipment to local police departments, among other proposals.

NetChoice, a tech trade group that has called facial recognition a "boon" for law enforcement, declined to comment. "We're not engaging on that bill at all," a spokesperson told Protocol in an email.

Tech companies may not feel compelled to fight the bill at all. Rashida Richardson, director of policy research at the ethics-focused AI Now Institute at New York University, called the proposed provisions "toothless," saying the legislation would create significant loopholes for police to use facial recognition in cases of an "imminent threat" or a "serious crime," and does not bar the government from using facial recognition in many instances. "I don't see them putting resources into lobbying against this," Richardson said of tech companies.

The package introduced Monday includes two bills — the Federal Police Camera and Accountability Act and the Police CAMERA Act — that could pare back the use of facial recognition by law enforcement in some cases.

The Federal Police Camera and Accountability Act, originally introduced last year by Rep. Eleanor Holmes Norton, a Democrat representing Washington, D.C., would require all police officers to wear body cameras while they conduct searches and make arrests, an effort to improve oversight and accountability. But the legislation would prohibit officers from equipping those body cameras with facial recognition technology. "No camera or recording device authorized or required to be used under this part may employ facial recognition technology," the legislation reads.

Sections of the bill aim to ensure that police officers do not exploit body cameras to surveil citizens exercising First Amendment rights. "Body cameras shall not be used to gather intelligence information based on First Amendment protected speech, associations or religion, or to record activity that is unrelated to a response to a call for service or a law enforcement or investigative encounter between a law enforcement officer and a member of the public," one provision reads.

Many police departments already maintain policies against installing facial recognition software in body cameras. Axon, the country's leading maker of wearable cameras for cops, has said it will not yet install facial recognition technology in its cameras because of ethical concerns. However, police departments are increasingly deploying facial recognition in their routine work, and, if passed, the legislation would force some companies and law enforcement agencies to change course.

The bill would further prevent officers from subjecting video footage to facial recognition or other forms of automated analysis without a warrant.

The Police CAMERA Act would provide government grants for police departments to buy or lease body-worn cameras, as long as they adopt guidelines to protect the "constitutional rights" of individuals scanned by facial recognition software. The provision would allow the use of facial recognition when an officer has a warrant or in cases of "imminent threats or serious crimes."

The legislation reflects an attempt to balance greater oversight of police activity through technology with protection of citizens' privacy. It underscores the heated public debate around facial recognition, which has increasingly animated state legislatures across the country over the past year.

Cities including San Francisco, Oakland, Berkeley and Somerville, Massachusetts, have passed local ordinances banning facial recognition outright. California lawmakers last week blocked controversial legislation that would have provided some safeguards around facial recognition. In Washington state, Microsoft publicly backed a lighter-touch facial recognition bill that was signed into law in March amid protests from activists including the ACLU, who wanted to put a temporary moratorium on government uses of facial recognition.

The technology, which scans people's faces and looks for matches in databases, has swept across the country in recent years with little government regulation or oversight. There is no federal law establishing rules for face-scanning, despite efforts by Congress to develop one last year. Meanwhile, according to market research firm Grand View Research, the government "facial biometrics" market is expected to surge from $136.9 million in 2018 to $375 million by 2025.
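To make that face-scanning pipeline concrete, here is a minimal, hypothetical sketch in Python of the matching step: a probe face is reduced to a numeric embedding and compared against a database of enrolled embeddings. The embedding model itself is assumed to exist elsewhere; the names, vector size and similarity threshold are illustrative, and only the nearest-match search is shown.

```python
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    # Standard cosine similarity between two embedding vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def search_database(probe: np.ndarray, database: dict[str, np.ndarray],
                    threshold: float = 0.6) -> list[tuple[str, float]]:
    """Return enrolled identities whose embedding is similar enough to the
    probe, best match first. Below the threshold, no candidate is returned."""
    scored = [(name, cosine_similarity(probe, emb)) for name, emb in database.items()]
    return sorted((s for s in scored if s[1] >= threshold),
                  key=lambda s: s[1], reverse=True)


# Toy usage: random vectors stand in for real face embeddings.
rng = np.random.default_rng(0)
database = {name: rng.normal(size=128) for name in ("alice", "bob", "carol")}
probe = database["bob"] + rng.normal(scale=0.1, size=128)  # a noisy view of "bob"
print(search_database(probe, database))
```

The threshold is the key policy-relevant knob in a system like this: set it lower and the system returns more candidate matches, including more false ones.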

Critics of the government's use of the technology, including privacy and civil rights activists, warn that it can give authorities unprecedented access to people's movements, particularly in heavily policed minority communities. Studies have shown that some facial recognition technology is more likely to misidentify women and people of color. A government study in December 2019 found that Asian and African American people were up to 100 times more likely to be misidentified than white men.
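As a rough illustration of how audits like that government study quantify such disparities, the sketch below uses invented trial data to compute a false match rate separately for each demographic group; the gap between the groups is the kind of disparity those studies report.

```python
from collections import defaultdict

# Each record: (demographic_group, system_declared_match, actually_same_person).
# The data is made up purely to show the computation.
trials = [
    ("group_a", False, False), ("group_a", True, False), ("group_a", True, True),
    ("group_b", True, False), ("group_b", True, False), ("group_b", True, True),
]


def false_match_rate_by_group(records):
    """False match rate: the share of different-person comparisons the system
    nonetheless declared a match, computed per demographic group."""
    impostor_trials = defaultdict(int)
    false_matches = defaultdict(int)
    for group, declared_match, same_person in records:
        if not same_person:                # only different-person comparisons count
            impostor_trials[group] += 1
            if declared_match:
                false_matches[group] += 1
    return {g: false_matches[g] / impostor_trials[g] for g in impostor_trials}


print(false_match_rate_by_group(trials))  # {'group_a': 0.5, 'group_b': 1.0}
```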

Concerns around facial recognition technology and law enforcement were heightened early this year by revelations about Clearview AI, which built a database of billions of photos by scraping social media and sold its product to many police departments across the country.

Richardson, of the AI Now Institute, said the legislation does not acknowledge a longtime, central drawback of facial recognition technology: It often doesn't work. A report by the National Institute of Standards and Technology found that, despite improvements in accuracy among the leading facial recognition vendors, most algorithms are "not close" to achieving perfect identification.

"When so much of the pushback on the use of facial recognition is because of the fact that it doesn't work, and it disproportionately doesn't work for communities that are already marginalized in many ways by the criminal justice system, that seems like a glaring omission," she said.

Last year, the House Oversight Committee held a string of hearings on facial recognition technology, which featured lawmakers from both parties pledging that they would imminently introduce legislation to regulate the industry. But the effort lost steam when the committee's chairman and a major champion of the issue, Democratic Rep. Elijah Cummings, died in October, and it's since fallen far below other legislative priorities, including coronavirus relief.

The Democrats' police reform bill likely won't make it to the Senate as written, and it's unclear if the facial recognition provisions have any legislative pathway. So far, it seems most likely that cities and states will continue to lead the way in regulating the sensitive technology.
