CFPB Acts to Protect the Public from Black-Box Credit Models Using Complex Algorithms

Today, the Consumer Financial Protection Bureau (CFPB) confirmed that federal anti-discrimination law requires companies to explain to applicants the specific reasons for denying an application for credit or taking other adverse actions, even if the creditor is relying on credit models using complex algorithms. The CFPB published a Consumer Financial Protection Circular to remind the public, including those responsible for enforcing federal consumer financial protection law, of creditors’ adverse action notice requirements under the Equal Credit Opportunity Act (ECOA).

“Companies are not absolved of their legal responsibilities when they let a black-box model make lending decisions,” said CFPB Director Rohit Chopra. “The law gives every applicant the right to a specific explanation if their application for credit was denied, and that right is not diminished simply because a company uses a complex algorithm that it doesn’t understand.”

Data harvesting on Americans has become voluminous and ubiquitous, giving firms the ability to know highly detailed information about their customers before they ever interact with them. Many firms across the economy rely on these detailed datasets to power their algorithmic decision-making, which is sometimes marketed as “artificial intelligence.” The information gleaned from data analytics has a broad range of commercial uses by financial firms, including for targeted advertising and in credit decision-making.
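To make the obligation concrete, here is a minimal sketch (not the CFPB's or any lender's actual method) of how a creditor using a scoring model might surface the specific, principal reasons behind a denial. Every name, weight, threshold, and applicant value below is a hypothetical illustration, and the "reason code" approach shown (ranking per-feature contributions against a reference profile) is only one common way such explanations are produced.

```python
# Hypothetical sketch: deriving adverse-action "reason codes" from a simple
# logistic-regression credit model. All features, weights, and thresholds are
# invented for illustration; real creditor models and methods differ.
import numpy as np

FEATURES = ["utilization", "late_payments", "inquiries", "account_age_years"]
WEIGHTS = np.array([-2.0, -1.5, -0.8, 0.6])   # hypothetical model weights
BIAS = 1.0
APPROVAL_THRESHOLD = 0.5

def score(x: np.ndarray) -> float:
    """Probability of approval under the hypothetical model."""
    return 1.0 / (1.0 + np.exp(-(WEIGHTS @ x + BIAS)))

def adverse_action_reasons(x: np.ndarray, reference: np.ndarray, top_n: int = 2) -> list[str]:
    """Rank features by how much they pulled the applicant's log-odds below a
    reference profile (e.g., a typical approved applicant)."""
    contributions = WEIGHTS * (x - reference)   # per-feature effect on the log-odds
    worst = np.argsort(contributions)[:top_n]   # most negative contributions first
    return [FEATURES[i] for i in worst if contributions[i] < 0]

applicant = np.array([0.9, 3.0, 5.0, 1.0])      # hypothetical denied applicant
reference = np.array([0.3, 0.0, 1.0, 8.0])      # hypothetical approved-profile baseline
if score(applicant) < APPROVAL_THRESHOLD:
    print("Denied. Principal reasons:", adverse_action_reasons(applicant, reference))
```

The point of the sketch is that even a model-driven decision can be traced back to the specific factors that drove it; the Circular addresses creditors whose models are too complex or opaque for them to provide such reasons.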
