Federal anti-discrimination laws in banking are supposed to protect consumers from creditors denying an application for credit or taking other adverse actions without just cause or a clear explanation of why the action was taken.
Companies have long used advanced computational methods as part of their credit decision-making process, but as technology evolves, so do these models. Some creditors now make decisions using so-called “black box” models, which produce outputs whose underlying logic may be unknown to the model’s users, including its own creators.
As a result, it may not be possible for creditors relying on these models to issue adverse action notices that meet Equal Credit Opportunity Act (ECOA) requirements.
To remind the public of these requirements, the CFPB published a Consumer Financial Protection Circular advising the public, including those responsible for enforcing federal consumer financial protection law, of creditors’ adverse action notice requirements under ECOA.
“Companies are not absolved of their legal responsibilities when they let a black-box model make lending decisions,” said Rohit Chopra, Director of the CFPB. “The law gives every applicant the right to a specific explanation if their application for credit was denied, and that right is not diminished simply because a company uses a complex algorithm that it doesn’t understand.”
According to the CFPB, data harvesting on Americans has become voluminous and ubiquitous, giving firms the ability to know highly detailed information about their customers before they ever interact with them. Many firms across the economy rely on these detailed datasets to power their algorithmic decision-making, which is sometimes marketed as “artificial intelligence.” The information gleaned from data analytics has a broad range of commercial uses by financial firms, including for targeted advertising and in credit decision-making.
The circular highlights two major points:
- Federal consumer financial protection laws and adverse action requirements should be enforced regardless of the technology used by creditors. For example, ECOA does not permit creditors to use technology that prevents them from providing specific and accurate reasons for adverse actions. Creditors’ use of complex algorithms should not limit enforcement of ECOA or other federal consumer financial protection laws.
- Creditors cannot justify noncompliance with ECOA based on the mere fact that the technology they use to evaluate credit applications is too complicated, too opaque in its decision-making, or too new. Creditors who use complex algorithms—including artificial intelligence or machine learning technologies—to engage in credit decisions must still provide a notice that discloses the specific, principal reasons for taking adverse actions. There is no exception for violating the law because a creditor is using technology that has not been adequately designed, tested, or understood.
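The circular does not prescribe any particular technique for deriving those reasons. As a rough illustration only, the sketch below shows what “specific, principal reasons” can look like when a credit model is simple enough to decompose: a hypothetical logistic-regression scorer whose per-feature contributions to a denial can be ranked. All feature names, training data, and thresholds here are invented for the example and are not drawn from the circular.

```python
# Illustrative sketch only: an interpretable scoring model whose per-feature
# contributions can back a specific adverse action reason. All feature names,
# data, and labels are hypothetical, not from the CFPB circular.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
features = ["debt_to_income", "utilization", "years_of_history", "recent_inquiries"]

# Hypothetical training data: 500 past applicants, label 1 = approved.
X = rng.normal(size=(500, len(features)))
y = (X[:, 0] - 0.5 * X[:, 2] + rng.normal(scale=0.5, size=500) < 0).astype(int)

model = LogisticRegression().fit(X, y)

def principal_reasons(applicant: np.ndarray, top_n: int = 2) -> list[str]:
    """Rank features by how strongly they pushed the score toward denial."""
    # Each feature's contribution to the log-odds of approval.
    contributions = model.coef_[0] * applicant
    # The most negative contributions pushed hardest toward denial.
    order = np.argsort(contributions)
    return [features[i] for i in order[:top_n]]

applicant = np.array([2.1, 1.4, -0.8, 0.3])  # hypothetical denied applicant
if model.predict(applicant.reshape(1, -1))[0] == 0:
    print("Principal reasons for denial:", principal_reasons(applicant))
```

A model too opaque to support this kind of decomposition does not excuse the creditor from producing those reasons; that is precisely the circular’s point.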
“Whistleblowers play a central role in uncovering information about companies using technologies, like black-box models, in ways that violate ECOA and other federal consumer financial protection laws,” the CFPB concluded. “Having clear, actionable information is critical for the CFPB and other consumer protection enforcers. The CFPB encourages tech workers to provide the agency with information, and they can visit the CFPB’s Whistleblower Program webpage to learn more.”