The CFPB has a message for banks and lenders: You've got to explain your algorithms. The agency stressed in a new circular that financial services companies must give a clear explanation for denying a credit application and cannot simply argue that the systems they use are “too complicated.”
“Companies are not absolved of their legal responsibilities when they let a black-box model make lending decisions,” CFPB Director Rohit Chopra said in a statement.
The Equal Credit Opportunity Act requires banks and lenders to offer “specific reasons for denying an application for credit or taking other adverse actions,” the CFPB said.
Technology, including AI, has enabled banks and fintech companies to make credit decisions based on more widely available financial data. “Data harvesting on Americans has become voluminous and ubiquitous, giving firms the ability to know highly detailed information about their customers before they ever interact with them,” the CFPB said.
But the “reasoning” behind these “black-box models” may be “unknown” to those who use them for credit decision-making.
The CFPB said banks and lenders cannot say that they can’t comply with the law because “the technology they use to evaluate credit applications is too complicated, too opaque in its decision-making, or too new.”
The move highlights the growing concern among federal regulators about the use of AI and other technologies, both in the financial system and more broadly. Congress is concerned as well: Sen. Ron Wyden reintroduced a bill, the Algorithmic Accountability Act, that would require companies using AI to examine the impact of those systems.
Last year, five agencies, including the CFPB, the Federal Reserve Board and the Office of the Comptroller of the Currency, began soliciting comments on the use of data technology in financial services, seeking to find out whether AI is being deployed “in a safe and sound manner.”