This Senate bill would force companies to audit AI used for housing and loans

The Algorithmic Accountability Act gives the FTC more tech staff and follows an approach to AI accountability already promoted by key advisers inside the agency.


The bill would require companies deploying automated systems to assess them, mitigate negative impacts and submit annual reports about those assessments to the FTC.

Photo: Alex Wong/Getty Images

Legislation introduced last week would require companies to assess the impact of AI and automated systems they use to make decisions affecting people’s employment, finances, housing and more.

The Algorithmic Accountability Act of 2022, sponsored by Oregon Democratic Sen. Ron Wyden, would give the FTC more tech staff to oversee enforcement and let the agency publish information about the algorithmic tech that companies use. It follows an approach to AI accountability and transparency already promoted by key advisers inside the FTC.

Algorithms used by social media companies are often the ones in the regulatory spotlight. However, all sorts of businesses — from home loan providers and banks to job recruitment services — use algorithmic systems to make automated decisions. In an effort to enable more oversight and control of technologies that make discriminatory decisions or create safety risks or other harms, the bill would require companies deploying automated systems to assess them, mitigate negative impacts and submit annual reports about those assessments to the FTC.

“This is legislation that is focused on creating a baseline of transparency and accountability around where companies are using algorithms to legal effect,” said Ben Winters, a counsel at Electronic Privacy Information Center who leads the advocacy group’s AI and Human Rights Project.

The bill, an expanded version of the same legislation originally introduced in 2019, gives state attorneys general the right to sue on behalf of state residents for relief. Lead co-sponsors of the bill are Sen. Cory Booker of New Jersey and Rep. Yvette Clarke of New York, and it is also co-sponsored by Democratic Sens. Tammy Baldwin of Wisconsin, Bob Casey of Pennsylvania, Brian Schatz and Mazie Hirono of Hawaii and Martin Heinrich and Ben Ray Luján of New Mexico.

The onus is on AI users, not vendors

Companies using algorithmic technology to make “critical decisions” that have significant effects on people’s lives relating to education, employment, financial planning, essential utilities, housing or legal services would be required to conduct impact assessments. The evaluations would entail ongoing testing and analysis of decision-making processes, and require companies to supply documentation about the data used to develop, test or maintain algorithmic systems.

Not every company using this tech would be covered, however. Companies using automated systems would be on the hook if they make more than $50 million in annual revenue or are worth more than $250 million in equity, and have identifiable data on more than one million people, households or devices used in automated decisions.
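As a rough sketch, the coverage test described above can be expressed in a few lines of code. The thresholds come from the bill as described in this article; the function and parameter names are hypothetical, and the real statutory test involves definitions (such as how equity value is measured) that this sketch glosses over:

```python
def is_covered(annual_revenue_usd: float,
               equity_value_usd: float,
               identifiable_records: int) -> bool:
    """Rough sketch of the bill's coverage test.

    A company is covered if it clears either financial threshold
    AND holds identifiable data on more than one million people,
    households or devices used in automated decisions.
    """
    meets_financial_bar = (annual_revenue_usd > 50_000_000
                           or equity_value_usd > 250_000_000)
    meets_data_bar = identifiable_records > 1_000_000
    return meets_financial_bar and meets_data_bar

# Example: a lender with $60M in revenue and data on 2M households
print(is_covered(60e6, 0, 2_000_000))  # True
```

Note the conjunction: a company below both financial bars escapes coverage even with data on millions of people, and a large company with data on fewer than a million people, households or devices also falls outside the requirement.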

That would mean a large number of companies in the health care, recruitment and human resources, real estate and financial lending industries would be required to conduct assessments of AI they use.

Suppliers of algorithmic tools also would have to conduct assessments if they expect them to be used for a critical decision. However, Winters said it makes the most sense to focus on users rather than vendors.

“The bill focuses on the impact of the algorithmic systems, and the impact depends on the context in which it is used,” he said. Vendors selling algorithmic systems might only assess those tools according to perfect use cases rather than in relation to how they are used in more realistic circumstances, he said.

The new version of the bill is 50 pages long, with far more detail than its 15-page predecessor. One key distinction involves the language used to determine whether technologies are covered by the legislation. While the previous version would have required assessments of “high-risk” systems, the updated bill requires companies to evaluate the impact of algorithmic tech used in making “critical decisions.”

“Focusing on the decision’s effect is a lot more concrete and effective,” said Winters. Because it would have been difficult to determine what constitutes a high-risk system, he said the new approach is “less likely to be loopholed” or argued into futility by defense lawyers.

A statement about the bill from software industry group BSA The Software Alliance indicated the risk factor associated with algorithmic systems could be a sticking point. “We look forward to working with the sponsors and committees of jurisdiction to improve on the legislation to ensure that any new law clearly targets high risk systems and sensibly allocates responsibilities between organizations that develop and deploy them,” wrote Craig Albright, vice president of Legislative Strategy at the trade group.

FTC reports and public data on company AI use

Companies covered by the bill also would have to provide annual reports about those tech assessments. Then, based on those company reports, the FTC would produce annual reports showing trends, aggregated statistics, lessons from anonymized cases and updated guidance.

Plus, the FTC would publish a publicly available repository of information about the automated systems companies provide reports on. It is unclear, however, whether the contents of assessment reports provided to the FTC would be available via Freedom of Information Act requests.

The FTC would be in charge of making rules for algorithmic impact assessments if the bill passes, and the agency has already begun to gear up for the task. The commission brought on members of a new AI advisory team who have worked for the AI Now Institute, a group that has been critical of AI’s negative impacts on minority communities and has helped propel the use of algorithmic impact assessments as a framework for evaluating the effects of algorithmic systems. The co-founder of AI Now, Meredith Whittaker, is senior adviser on AI at the FTC.

“Given that the bill instructs the FTC to build out exactly what the impact assessments will require, it’s heartening that the current FTC AI team is led by individuals that are particularly well suited to draw the bounds of a meaningful algorithmic impact assessment,” said Winters.

The bill also provides resources to establish a Bureau of Technology to enforce the would-be law, including funding for 50 hires and a chief technologist for the new bureau, plus 25 additional staff positions at the agency’s Bureau of Consumer Protection.

Many federal bills aimed at policing algorithmic systems have focused on reforming Section 230 to address social media harms rather than addressing technologies used by government or other types of companies. For instance, the Algorithmic Justice and Online Platform Transparency Act of 2021 would require digital platforms to maintain records of how their algorithms work. The Protecting Americans from Dangerous Algorithms Act would hold tech companies liable if their algorithms amplify hateful or extremist content.

Though lawmakers have pushed for more oversight of algorithms, there’s no telling whether the Wyden bill will gain momentum in Congress. But if it passes, it would likely benefit a growing sector of AI auditing and monitoring services that could provide impact assessments.

There’s also action among states to create more algorithmic accountability. EPIC is backing a Washington State bill that would create guidelines for government use of automated decision systems. That bill currently is “getting a lot of government office and agency pushback,” said Winters, who testified in January in a committee hearing about the legislation.

This post was updated to include the lead co-sponsors of the bill, and corrected to clarify that suppliers of AI-based tech will be required to conduct algorithmic impact assessments.

