People

IBM exits facial recognition, but that doesn’t mean it’s cutting ties with police

The tech giant still sells AI tools to police departments that experts say can entrench racial bias in law enforcement.

People at a Black Lives Matter protest

For all of IBM's recommendations — including investing in education and holding police accountable for misconduct — the company stopped short of cutting its many other ties with police departments.

Photo: Stanton Sharpe/SOPA Images/LightRocket via Getty Images

In a letter to Congress this week, IBM's CEO Arvind Krishna said the company is no longer offering facial recognition tools and called on Congress to regulate the use of the controversial technology by police, as part of a list of proposals to tackle racial injustice in America.

The news, which came amid widespread protests against police brutality, earned IBM praise and calls for other tech giants, including Amazon, to do the same. But for all of IBM's recommendations — including investing in education and holding police accountable for misconduct — the company stopped short of cutting its many other ties with police departments. That includes the sale of artificial intelligence tools that enable police to predict criminal activity, a practice researchers and advocates say can entrench and exacerbate racial bias in policing.

In his letter, Krishna wrote that the company opposes the use of "any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms, or any purpose which is not consistent with our values."

But IBM spokesperson Adam Pratt declined to comment on whether that will have any impact on the company's predictive policing work, saying that the letter was clear on what IBM supports and opposes. Pratt suggested, instead, that Protocol write about how IBM is "stepping up with some of the most detailed policy recommendations we've seen yet from the business community to advance racial justice for the black community and respond to this critical moment in our nation's history."

On its website, IBM touts its policing work, including case studies from Rochester, New York; Miami-Dade, Florida; Durham, North Carolina; and Manchester, New Hampshire. In Miami-Dade County, IBM said it developed technology to help the police "identify crime hot spots" and "model what kind of suspect typically commits a particular crime and then generate and filter a suspect list to help solve cases faster." In Rochester, IBM said its technology gives police making traffic stops insight into not just who owns a given car, but who might be riding in it, painting a "comprehensive picture of an individual, their associates and their activity."

Experts say the problem with making crime or threat predictions based on historical policing data is that the data itself is a reflection of the disproportionate presence of police in black and brown people's lives. In a 2019 paper, Rashida Richardson and her colleagues at New York University's AI Now Institute called this "dirty data" because it's based on "data produced during documented periods of flawed, racially biased, and sometimes unlawful practices and policies." That can include data on calls for service in a given area or even arrest and stop data that omits whether a person was ever convicted of a crime.

"In policing, the data is going to reflect its environment," Richardson said. "If you have a culture of policing that has racial biases evident in its practices and outcomes and a culture that seems to have a complete disregard for entire parts of the population, that's going to be reflected in the data."

Some police departments have taken it upon themselves to terminate predictive policing programs. Last year, the Los Angeles Police Department shut down a controversial program known as LASER, which used technology developed by Palantir to compile a watch list for police. An internal audit found that some 84% of people considered "active chronic offenders" under LASER were black or Latino, and that nearly half of the supposed chronic offenders had never been arrested for a violent crime or had been arrested only once.

Richardson called IBM's letter a "PR move," in part because it doesn't grapple with the other ways its AI tools are used in policing and in part because IBM wasn't the biggest player offering facial recognition technology to law enforcement to begin with. "It's easy to end selling a product that wasn't one of your major profit-drivers," Richardson said, "but it's also harder to take their move seriously when they didn't seem to comment on and don't seem to have an interest in divesting from predictive policing."

In his letter, Krishna wrote that both vendors and users of AI systems in policing need to test their technology for racial bias and have those tests audited. This, coupled with IBM's stance on facial recognition, is an important first step, said Albert Fox Cahn, founder and executive director of the Surveillance Technology Oversight Project, which opposes police surveillance. But, he said, it's only a first step.
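What such a test might look like in its simplest form: the sketch below is a generic illustration of one common audit check, comparing how often a risk tool flags people who were never convicted of anything, broken out by demographic group. It is not IBM's methodology; the record fields, the sample data, and the 1.25 disparity threshold are assumptions made for the example.

from collections import defaultdict

# Generic disparity check on a risk tool's output: among people who were
# never convicted, how often does each demographic group get flagged?
# Field names, sample records, and the threshold are invented for illustration.

def false_flag_rates(records):
    """Share of never-convicted people flagged, broken out by group."""
    flagged = defaultdict(int)
    never_convicted = defaultdict(int)
    for r in records:
        if not r["convicted"]:
            never_convicted[r["group"]] += 1
            if r["flagged"]:
                flagged[r["group"]] += 1
    return {g: flagged[g] / n for g, n in never_convicted.items()}

# Tiny made-up sample: four never-convicted people per group.
sample = (
    [{"group": "A", "flagged": f, "convicted": False} for f in (True, True, False, False)]
    + [{"group": "B", "flagged": f, "convicted": False} for f in (True, False, False, False)]
)

rates = false_flag_rates(sample)
print(rates)  # {'A': 0.5, 'B': 0.25}
worst, best = max(rates.values()), min(rates.values())
if best == 0 or worst / best > 1.25:  # audit threshold chosen only for illustration
    print("Flag-rate disparity exceeds threshold -- escalate for human review")

A real audit would draw on far richer data and more than one fairness metric, but the principle is the one Krishna's letter points to: the check has to be run, and the results have to be auditable.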

"I think the same moral calculus that required IBM to ditch biased and broken facial recognition software will lead them to eventually walk away from the other ways that artificial intelligence can exacerbate police violence and civil rights abuses," Cahn said. "I don't think this is the end of this debate, but I do think it's a lead other tech companies need to follow."

Fintech

Judge Zia Faruqui is trying to teach you crypto, one ‘SNL’ reference at a time

His decisions on major cryptocurrency cases have quoted "The Big Lebowski," "SNL," and "Dr. Strangelove." That’s because he wants you — yes, you — to read them.

The ways Zia Faruqui (right) has weighed in on cases that have come before him can give lawyers clues as to what legal frameworks will pass muster.

Photo: Carolyn Van Houten/The Washington Post via Getty Images

“Cryptocurrency and related software analytics tools are ‘The wave of the future, Dude. One hundred percent electronic.’”

That’s not a quote from "The Big Lebowski" — at least, not directly. It’s a quote from a Washington, D.C., district court memorandum opinion on the role cryptocurrency analytics tools can play in government investigations. The author is Magistrate Judge Zia Faruqui.

Veronica Irwin

Veronica Irwin (@vronirwin) is a San Francisco-based reporter at Protocol covering fintech. Previously she was at the San Francisco Examiner, covering tech from a hyper-local angle. Before that, her byline was featured in SF Weekly, The Nation, Techworker, Ms. Magazine and The Frisc.

Enterprise

AWS CEO: The cloud isn’t just about technology

As AWS preps for its annual re:Invent conference, Adam Selipsky talks product strategy, support for hybrid environments, and the value of the cloud in uncertain economic times.

Photo: Noah Berger/Getty Images for Amazon Web Services

AWS is gearing up for re:Invent, its annual cloud computing conference, where announcements this year are expected to focus on its end-to-end data strategy and on delivering new industry-specific services.

It will be the second re:Invent with CEO Adam Selipsky as leader of the industry’s largest cloud provider after his return last year to AWS from data visualization company Tableau Software.

Donna Goodison

Donna Goodison (@dgoodison) is Protocol's senior reporter focusing on enterprise infrastructure technology, from the 'Big 3' cloud computing providers to data centers. She previously covered the public cloud at CRN after 15 years as a business reporter for the Boston Herald. Based in Massachusetts, she also has worked as a Boston Globe freelancer, business reporter at the Boston Business Journal and real estate reporter at Banker & Tradesman after toiling at weekly newspapers.

Image: Protocol

We launched Protocol in February 2020 to cover the evolving power center of tech. It is with deep sadness that just under three years later, we are winding down the publication.

As of today, we will not publish any more stories. All of our newsletters, apart from our flagship, Source Code, will no longer be sent. Source Code will be published and sent for the next few weeks, but it will also close down in December.

Bennett Richardson

Bennett Richardson (@bennettrich) is the president of Protocol. Prior to joining Protocol in 2019, Bennett was executive director of global strategic partnerships at POLITICO, where he led strategic growth efforts including POLITICO's European expansion in Brussels and POLITICO's creative agency POLITICO Focus during his six years with the company. Prior to POLITICO, Bennett was co-founder and CMO of Hinge, the mobile dating company recently acquired by Match Group. Bennett began his career in digital and social brand marketing working with major brands across tech, energy, and health care at leading marketing and communications agencies including Edelman and GMMB. Bennett is originally from Portland, Maine, and received his bachelor's degree from Colgate University.

Enterprise

Why large enterprises struggle to find suitable platforms for MLops

As companies expand their use of AI beyond running just a few machine learning models, and as larger enterprises go from deploying hundreds of models to thousands and even millions of models, ML practitioners say that they have yet to find what they need from prepackaged MLops systems.

As companies expand their use of AI beyond running just a few machine learning models, ML practitioners say that they have yet to find what they need from prepackaged MLops systems.

Photo: artpartner-images via Getty Images

On any given day, Lily AI runs hundreds of machine learning models using computer vision and natural language processing that are customized for its retail and ecommerce clients to make website product recommendations, forecast demand, and plan merchandising. But this spring when the company was in the market for a machine learning operations platform to manage its expanding model roster, it wasn’t easy to find a suitable off-the-shelf system that could handle such a large number of models in deployment while also meeting other criteria.

Some MLops platforms are not well suited to maintaining more than even 10 machine learning models when it comes to keeping track of data, navigating their user interfaces, or reporting capabilities, Matthew Nokleby, machine learning manager for Lily AI's product intelligence team, told Protocol earlier this year. "The duct tape starts to show," he said.

Kate Kaye

Kate Kaye is an award-winning multimedia reporter digging deep and telling print, digital and audio stories. She covers AI and data for Protocol. Her reporting on AI and tech ethics issues has been published in OneZero, Fast Company, MIT Technology Review, CityLab, Ad Age and Digiday and heard on NPR. Kate is the creator of RedTailMedia.org and is the author of "Campaign '08: A Turning Point for Digital Media," a book about how the 2008 presidential campaigns used digital media and data.
