Politics

How the Democrats' police reform bill would regulate facial recognition

The legislation marks the most meaningful movement yet at the federal level to clamp down on facial recognition, but it doesn't get close to the all-out ban or moratorium that many activists want.


Facial recognition systems, like this one shown at a 2017 conference in Washington, D.C., have become increasingly popular and controversial for police departments.

Photo: Saul Loeb/AFP via Getty Images

Responding to the frustration and fury unleashed by a Minneapolis police officer's killing of George Floyd, congressional Democrats on Monday released an ambitious law enforcement reform bill that would, among other major changes, impose restrictions on the use of facial recognition technology by police officers.

For the most part, the Justice in Policing Act of 2020 repackages legislation that has been introduced before, and its most far-reaching proposals may hit insurmountable roadblocks in the Senate. But the bill's release reflects the urgency of the outrage over racial injustice, and it marks the most meaningful movement yet at the federal level to clamp down on facial recognition, a technology prized by many companies but criticized for its potential to invade privacy and exacerbate racial disparities in enforcement.

The proposals are limited in scope, applying only to facial recognition software in police body cameras, and they fall well short of the all-out ban or moratorium that many activists have called for at the local, state and federal levels over the past year.

The bill could put tech giants in a bind. Companies including Microsoft and Amazon, as well as powerful tech trade groups, have long railed against legislation that would keep facial recognition technology away from law enforcement. But those companies in recent days have thrown their support behind the Black Lives Matter movement and could risk accusations of hypocrisy if they lobby against what would amount to the biggest federal overhaul of U.S. law enforcement in decades.

IBM on Monday set itself apart by announcing it is getting out of the facial recognition business entirely, saying it does not condone technology used for "mass surveillance, racial profiling, violations of basic human rights and freedoms."

Facial recognition regulation is one small part of the 134-page bill released Monday, which aims to crack down on police brutality by banning chokeholds and no-knock warrants, creating a national registry for police misconduct, mandating racial-bias training, and restricting the transfer of military-grade equipment to local police departments, among other proposals.

NetChoice, a tech trade group that has called facial recognition a "boon" for law enforcement, declined to weigh in. "We're not engaging on that bill at all," a spokesperson told Protocol in an email.

Tech companies may not feel compelled to lobby against the bill at all. Rashida Richardson, director of policy research at the ethics-focused AI Now Institute at New York University, called the proposed provisions "toothless," saying the legislation would create significant loopholes that let police use facial recognition in the case of an "imminent threat" or a "serious crime," and would not bar the government from using facial recognition in many other instances. "I don't see them putting resources into lobbying against this," Richardson said of tech companies.

The package introduced Monday includes two bills — the Federal Police Camera and Accountability Act and the Police CAMERA Act — that could pare back the use of facial recognition by law enforcement in some cases.

The Federal Police Camera and Accountability Act, originally introduced last year by Rep. Eleanor Holmes Norton, a Democrat representing Washington, D.C., would require all police officers to wear body cameras while they conduct searches and make arrests, an effort to improve oversight and accountability. But the legislation would prohibit officers from equipping those body cameras with facial recognition technology. "No camera or recording device authorized or required to be used under this part may employ facial recognition technology," the legislation reads.

Sections of the bill aim to ensure that police officers do not exploit body cameras to surveil citizens exercising First Amendment rights. "Body cameras shall not be used to gather intelligence information based on First Amendment protected speech, associations or religion, or to record activity that is unrelated to a response to a call for service or a law enforcement or investigative encounter between a law enforcement officer and a member of the public," one provision reads.

Many police departments already maintain policies against installing facial recognition software in body cameras. Axon, the country's leading maker of wearable cameras for cops, has said it will not install facial recognition in its cameras for now, citing ethical concerns. But police departments are increasingly deploying the technology in their routine work, and if the legislation passes, it would force some companies and law enforcement agencies to change course.

The bill would further prevent officers from subjecting video footage to facial recognition or other forms of automated analysis without a warrant.

The Police CAMERA Act would issue government grants that police departments could use to buy or lease body-worn cameras, as long as the departments adopt guidelines protecting the "constitutional rights" of individuals scanned by facial recognition software. The provision would allow the use of facial recognition only when an officer has a warrant or in cases of "imminent threats or serious crimes."

The legislation attempts to balance expanded oversight of police activity, achieved partly through technology like body cameras, with protections for citizens' privacy. It underscores the heated public debate around facial recognition, which has increasingly animated state legislatures across the country over the past year.

Cities including San Francisco, Oakland, Berkeley and Somerville, Massachusetts, have passed local ordinances banning government use of facial recognition outright. California lawmakers last week blocked controversial legislation that would have provided some safeguards around facial recognition. In Washington state, Microsoft publicly backed a lighter-touch facial recognition bill that was signed into law in March over protests from activists, including the ACLU, who wanted a temporary moratorium on government use of facial recognition instead.

The technology, which scans people's faces and looks for matches in databases, has swept across the country in recent years with little government regulation or oversight. There is no federal law establishing rules for face-scanning, despite efforts by Congress to develop one last year. Meanwhile, according to market research firm Grand View Research, the government "facial biometrics" market is expected to surge from $136.9 million in 2018 to $375 million by 2025.
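In practice, "looking for matches" typically means comparing numerical embeddings of faces rather than raw images: a model converts each face into a fixed-length vector, and the system searches a database for vectors close to the probe. The Python sketch below is a minimal, hypothetical illustration of that one-to-many identification step; the gallery names, the 128-dimension vector size and the 0.6 threshold are invented for the example, and random vectors stand in for embeddings a real model would produce.

# A minimal, hypothetical sketch of 1:N face identification, assuming a
# separate model has already turned each face into a fixed-length vector.
# The gallery names, vector size and threshold are invented for illustration.
import numpy as np

def normalize(v):
    # Scale to unit length so a dot product equals cosine similarity.
    return v / np.linalg.norm(v)

def identify(probe, gallery, threshold=0.6):
    # Return the gallery identity most similar to the probe embedding,
    # or None if no similarity clears the match threshold.
    probe = normalize(probe)
    best_name, best_score = None, threshold
    for name, embedding in gallery.items():
        score = float(np.dot(probe, normalize(embedding)))
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Random vectors stand in for real embeddings; they rarely clear the
# threshold, so this usually prints None.
rng = np.random.default_rng(0)
gallery = {"person_a": rng.normal(size=128), "person_b": rng.normal(size=128)}
print(identify(rng.normal(size=128), gallery))

The threshold is the crucial dial in systems like this: lower it and the software returns more false matches; raise it and it misses true ones. That trade-off sits behind the accuracy criticisms described below.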

Critics of the government's use of the technology, including privacy and civil rights activists, warn that it can give authorities unprecedented access to people's movements, particularly in heavily policed minority communities. Studies have shown that some facial recognition systems are more likely to misidentify women and people of color. A government study published in December 2019 found that Asian and African American people were up to 100 times more likely to be misidentified than white men.
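To make that differential concrete: misidentification of this kind is usually measured as a false match rate, the share of comparisons against other people's faces that the system wrongly accepts. The arithmetic below uses a hypothetical baseline rate, not a figure from the study, purely to show what a 100x gap means in practice.

# Hypothetical illustration of a 100x false-match-rate differential.
# The baseline rate is an assumed round number, not a result from the
# December 2019 study.
baseline_fmr = 1e-4                      # assumed: 1 false match per 10,000 comparisons
differential = 100                       # "up to 100 times more likely"
group_fmr = baseline_fmr * differential  # 0.01: 1 false match per 100 comparisons

print(f"baseline group: 1 in {round(1 / baseline_fmr):,} comparisons")
print(f"affected group: 1 in {round(1 / group_fmr):,} comparisons")

Because a database search compares the same probe photo against every entry, a hundredfold higher per-comparison error rate can translate into a hundredfold more innocent people flagged for review.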

Concerns around facial recognition technology and law enforcement were heightened early this year by revelations about Clearview AI, which built a database of billions of photos by scraping social media and sold its product to many police departments across the country.

Richardson, of the AI Now Institute, said the legislation does not acknowledge a longtime, central drawback of facial recognition technology: It often doesn't work. A report by the National Institute of Standards and Technology found that, despite improvements in accuracy among the leading facial recognition vendors, most algorithms are "not close" to achieving perfect identification.

"When so much of the pushback on the use of facial recognition is because of the fact that it doesn't work, and it disproportionately doesn't work for communities that are already marginalized in many ways by the criminal justice system, that seems like a glaring omission," she said.

Last year, the House Oversight Committee held a string of hearings on facial recognition technology, with lawmakers from both parties pledging to imminently introduce legislation regulating the industry. But the effort lost steam when the committee's chairman and the issue's biggest champion, Democratic Rep. Elijah Cummings, died in October 2019, and facial recognition has since fallen behind other legislative priorities, including coronavirus relief.

The Democrats' police reform bill likely won't make it through the Senate as written, and it's unclear whether the facial recognition provisions have any legislative path forward. For now, cities and states seem most likely to keep leading the way in regulating the sensitive technology.
