Politics

How the Democrats' police reform bill would regulate facial recognition

The legislation marks the most meaningful movement yet at the federal level to clamp down on facial recognition, but it doesn't come close to the all-out ban or moratorium that many activists want.

Facial recognition systems, like this one shown at a 2017 conference in Washington, D.C., have become increasingly popular and controversial for police departments.

Photo: Saul Loeb/AFP via Getty Images

Responding to the frustration and fury unleashed by a Minneapolis police officer's killing of George Floyd, congressional Democrats on Monday released an ambitious law enforcement reform bill that would, among other major changes, impose restrictions on the use of facial recognition technology by police officers.

For the most part, the Justice in Policing Act of 2020 repackages legislation that has been introduced before, and its most far-reaching proposals may hit insurmountable roadblocks in the Senate. But the bill's release reflects the urgency of the outrage over racial injustice, and it marks the most meaningful movement yet at the federal level to clamp down on facial recognition, a technology prized by many companies but criticized for its potential to invade privacy and exacerbate racial disparities in enforcement.

The proposals are limited in scope, applying only to facial recognition software in police body cameras, and do not come close to the all-out ban or moratorium that many activists have called for at the local, state and federal levels over the past year.

The bill could put tech giants in a bind. Companies including Microsoft and Amazon, as well as powerful tech trade groups, have long railed against legislation that would keep facial recognition technology away from law enforcement. But those companies in recent days have thrown their support behind the Black Lives Matter movement and could risk accusations of hypocrisy if they lobby against what would amount to the biggest federal overhaul of U.S. law enforcement in decades.

IBM on Monday set itself apart by announcing it is getting out of the facial recognition business entirely, saying it does not condone technology used for "mass surveillance, racial profiling, violations of basic human rights and freedoms."

Facial recognition regulation is one small part of the 134-page bill released Monday, which aims to crack down on police brutality by banning chokeholds and no-knock warrants, creating a national registry for police misconduct, mandating racial-bias training, and restricting the transfer of military-grade equipment to local police departments, among other proposals.

Netchoice, a tech trade group that has called facial recognition a "boon" for law enforcement, declined to comment. "We're not engaging on that bill at all," a spokesperson told Protocol in an email.

Tech companies may not feel compelled to fight the bill at all. Rashida Richardson, director of policy research at the ethics-focused AI Now Institute at New York University, called the proposed provisions "toothless," saying the legislation would create significant loopholes for police to use facial recognition in case of an "imminent threat" or a "serious crime," and does not bar the government from using facial recognition in many instances. "I don't see them putting resources into lobbying against this," Richardson said of tech companies.

The package introduced Monday includes two bills — the Federal Police Camera and Accountability Act and the Police CAMERA Act — that could pare back the use of facial recognition by law enforcement in some cases.

The Federal Police Camera and Accountability Act, originally introduced last year by Rep. Eleanor Holmes Norton, a Democrat representing Washington, D.C., would require all police officers to wear body cameras while they conduct searches and make arrests, an effort to improve oversight and accountability. But the legislation would prohibit officers from equipping those body cameras with facial recognition technology. "No camera or recording device authorized or required to be used under this part may employ facial recognition technology," the legislation reads.

Sections of the bill aim to ensure that police officers do not exploit body cameras to surveil citizens exercising First Amendment rights. "Body cameras shall not be used to gather intelligence information based on First Amendment protected speech, associations or religion, or to record activity that is unrelated to a response to a call for service or a law enforcement or investigative encounter between a law enforcement officer and a member of the public," one provision reads.

Many police departments already maintain policies against installing facial recognition software in body cameras. Axon, the country's leading maker of body-worn cameras for police, has said it will not install facial recognition in its cameras for now, citing ethical concerns. But police departments are increasingly deploying facial recognition in their routine work, and if passed, the legislation would force some companies and law enforcement agencies to change course.

The bill would further prevent officers from subjecting video footage to facial recognition or other forms of automated analysis without a warrant.

The Police CAMERA Act would provide government grants for police departments to buy or lease body-worn cameras, so long as the departments adopt guidelines protecting the "constitutional rights" of individuals scanned by facial recognition software. The provision would allow the use of facial recognition when an officer has a warrant or in cases of "imminent threats or serious crimes."

The legislation reflects a balance between providing more oversight of police activity through technology and protecting citizens' privacy. It underscores the heated public debate around facial recognition, which has increasingly animated state legislatures across the country over the past year.

Cities including San Francisco, Oakland, Berkeley and Somerville, Massachusetts, have passed local ordinances banning facial recognition outright. California lawmakers last week blocked controversial legislation that would have provided some safeguards around facial recognition. In Washington state, Microsoft publicly backed a lighter-touch facial recognition bill that was signed into law in March amid protests from activists, including the ACLU, who sought a temporary moratorium on government use of facial recognition.

The technology, which scans people's faces and looks for matches in databases, has swept across the country in recent years with little government regulation or oversight. There is no federal law establishing rules for face-scanning, despite efforts by Congress to develop one last year. Meanwhile, according to market research firm Grand View Research, the government "facial biometrics" market is expected to surge from $136.9 million in 2018 to $375 million by 2025.

Critics of the government's use of the technology, including privacy and civil rights activists, warn that it can give authorities unprecedented access to people's movements, particularly in heavily policed minority communities. Studies have shown that some facial recognition technology is more likely to misidentify women and people of color. A government study in December 2019 found that Asian and African American people were up to 100 times more likely to be misidentified than white men.

Concerns around facial recognition technology and law enforcement were heightened early this year by revelations about Clearview AI, which built a database of billions of photos by scraping social media and sold its product to many police departments across the country.

Richardson, of the AI Now Institute, said the legislation does not acknowledge a long-standing, central drawback of facial recognition technology: It often doesn't work. A report by the National Institute of Standards and Technology found that, despite improvements in accuracy among the leading facial recognition vendors, most algorithms are "not close" to achieving perfect identification.

"When so much of the pushback on the use of facial recognition is because of the fact that it doesn't work, and it disproportionately doesn't work for communities that are already marginalized in many ways by the criminal justice system, that seems like a glaring omission," she said.

Last year, the House Oversight Committee held a string of hearings on facial recognition technology, in which lawmakers from both parties pledged to imminently introduce legislation to regulate the industry. But the effort lost steam when the committee's chairman and a major champion of the issue, Democratic Rep. Elijah Cummings, died in October, and it has since fallen behind other legislative priorities, including coronavirus relief.

The Democrats' police reform bill likely won't make it to the Senate as written, and it's unclear if the facial recognition provisions have any legislative pathway. So far, it seems most likely that cities and states will continue to lead the way in regulating the sensitive technology.