Source Code: Your daily look at what matters in tech.

By Issie Lapowsky
FTC issues stern warning: Biased AI may break the law

Acting FTC chair Rebecca Slaughter.

Photo: David Becker/Getty Images

In a blog post this week, the Federal Trade Commission signaled that it's taking a hard look at bias in AI, warning businesses that selling or using such systems could constitute a violation of federal law.
"The FTC Act prohibits unfair or deceptive practices," the post reads. "That would include the sale or use of – for example – racially biased algorithms."

The post also notes that biased AI can violate the Fair Credit Reporting Act and the Equal Credit Opportunity Act. "The FCRA comes into play in certain circumstances where an algorithm is used to deny people employment, housing, credit, insurance, or other benefits," it says. "The ECOA makes it illegal for a company to use a biased algorithm that results in credit discrimination on the basis of race, color, religion, national origin, sex, marital status, age, or because a person receives public assistance."

The post mirrors comments made by acting FTC chair Rebecca Slaughter, who recently told Protocol of her intention to ensure that FTC enforcement efforts "continue and sharpen in our long, arduous and very large national task of being anti-racist."

"For me, algorithmic bias is an economic justice issue. We see disparate outcomes coming out of algorithmic decision-making that disproportionately affect and harm Black and brown communities and affect their ability to participate equally in society," Slaughter said at the time. "That's something we need to address, and we need to think about whether it fits under our unfairness framework, whether we might have rule-making authority that could apply to it or whether we use statutes like the Equal Credit Opportunity Act or the Fair Credit Reporting Act."

In the blog post, the FTC urges businesses to test their algorithms for bias, to base their systems on datasets that aren't missing key demographics and to avoid exaggerating what those systems can do, among other things. On Twitter, University of Washington School of Law professor Ryan Calo called the post a "shot across the bow."
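The FTC's post does not prescribe a specific testing method, but one widely used check for the kind of disparate outcomes the agency describes is the "four-fifths rule," which compares selection rates between demographic groups. The sketch below is purely illustrative (the group data and threshold are hypothetical, not from the FTC's guidance):

```python
# Illustrative sketch of one common bias check: the disparate impact
# ratio, often evaluated against the "four-fifths" (0.8) threshold.
# All data here is hypothetical.

def selection_rate(outcomes):
    """Fraction of positive outcomes (e.g., approvals) in a group."""
    return sum(outcomes) / len(outcomes)

def disparate_impact_ratio(group_a, group_b):
    """Ratio of the lower group's selection rate to the higher one.
    Values below 0.8 are commonly treated as a red flag."""
    rate_a = selection_rate(group_a)
    rate_b = selection_rate(group_b)
    return min(rate_a, rate_b) / max(rate_a, rate_b)

# Hypothetical loan decisions (1 = approved, 0 = denied)
group_a = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]  # 80% approved
group_b = [1, 0, 1, 0, 0, 1, 0, 0, 1, 0]  # 40% approved

ratio = disparate_impact_ratio(group_a, group_b)
print(f"Disparate impact ratio: {ratio:.2f}")  # 0.40 / 0.80 = 0.50
```

A ratio this far below 0.8 would be exactly the sort of disparate outcome the agencies' credit statutes are concerned with, though a real audit would go well beyond a single summary statistic.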

"I see this blog post as signaling a shift in the way the FTC thinks about enforcing the FTC Act in the context of emerging technology," Calo, who co-directs the UW Tech Policy Lab, tweeted. "The concreteness of the examples coupled with repeated references to statutory authority is uncommon."

Indeed, the post concludes with a clear-eyed message to businesses. "Hold yourself accountable," it reads, "or be ready for the FTC to do it for you."
