Urging the Biden Administration to Address Technology’s Role in Financial Services Discrimination
Harlan Yu, Aaron Rieke, and Natasha Duarte
Upturn, ACLU, The Leadership Conference on Civil and Human Rights, and a coalition of other organizations urge the White House Office of Science & Technology Policy to fully incorporate the Biden administration’s commitment to racial equity into its AI and technology priorities.
This is our memo on technology's role in financial services discrimination. We sent this memo to:
Acting Director Dave Uejio, Consumer Financial Protection Bureau (CFPB)
Chair Jerome H. Powell, Board of Governors of the Federal Reserve System
Chair Jelena McWilliams, Federal Deposit Insurance Corporation
Chair Todd M. Harper, Board of Directors, National Credit Union Administration
Acting Comptroller of the Currency Michael J. Hsu, Office of the Comptroller of the Currency, U.S. Department of the Treasury
Acting Director Sandra L. Thompson, Federal Housing Finance Agency
Secretary Marcia L. Fudge, U.S. Department of Housing and Urban Development
Chair Lina Khan, Federal Trade Commission
Assistant Attorney General Kristen Clarke, Civil Rights Division, U.S. Department of Justice
Upturn was joined by the following signatories and contributors to this letter:
American Civil Liberties Union
Center for Democracy & Technology
Center on Privacy & Technology at Georgetown Law
Lawyers’ Committee for Civil Rights Under Law
National Consumer Law Center (on behalf of its low-income clients)
National Fair Housing Alliance
This is one of three memos on technology and discrimination that we sent to the Biden administration; see our letter to the White House Office of Science & Technology Policy and our other two memos on housing and hiring discrimination.
RE: Addressing Technology’s Role in Financial Services Discrimination
Too often, technology amplifies and exacerbates racial, gender, disability, economic, and intersectional inequity in our society. Governments and corporations, at the national, state, and local level, are using computer software, statistical models, assessment instruments, and other tools to make important decisions in areas such as employment, health, credit, housing, immigration, and the criminal legal system. In light of these developments, policymakers must take steps to ensure non-discriminatory and equitable outcomes for all who participate in the financial services market.
We offer the following proposals for the Biden-Harris administration and federal financial regulators for addressing the ways that technology and data can lead to discrimination in consumer credit. We urge all agencies to engage with a diverse range of stakeholders, including civil rights organizations, consumer advocates, and impacted communities, in order to receive ongoing input and feedback on these important issues. We also encourage agencies to prioritize transparency, by sharing their data, models, decisions, and proposed solutions so that all of the stakeholders can stay apprised of and comment on the potential impact of proposed actions, and by requiring financial institutions to share with the public as much information as possible regarding their systems and assessments of those systems. We would be pleased to discuss the ideas in this memo in more detail in the weeks and months ahead.
1. The CFPB and other agencies should ensure robust measurement and remediation of discrimination.
Existing civil rights laws allow agencies to analyze fair lending risk and to engage in supervisory or enforcement actions concerning the use of new technologies. For example, the Equal Credit Opportunity Act (ECOA), as implemented by Regulation B, prohibits creditor practices that have discriminatory effects unless they meet a “legitimate business need that cannot reasonably be achieved as well by means that are less disparate in their impact.” However, agencies must set clearer and more robust expectations concerning fair lending risk assessments as they pertain to technologies, and conduct in-depth reviews of financial institutions’ use of these technologies in order to more effectively supervise institutions and enforce civil rights laws.
More specifically, agencies should develop policies that:
Set updated standards for fair lending assessments, including discrimination testing and evaluation in the conception, design, implementation, and use of models; and for what information must be detailed in documentation of fair lending risk assessments, including what testing has been conducted, in-depth information regarding training data, and documentation of adverse action notices;
Clarify that financial institutions’ fair lending assessments should be conducted by independent actors within the institution or a third party;
State that the agencies will conduct their own fair lending risk assessments, including a review of disparate impact, business justifications, and less discriminatory alternatives;
Define “model risk” to include the risk of discriminatory or inequitable outcomes for consumers, rather than just the risk of financial loss to a financial institution; and
Establish documentation and archiving requirements sufficient to ensure that financial institutions maintain the data, code, and information necessary for agencies to review their systems.
It is critical that all credit processes undergo scrutiny that includes analysis of actual outcome data, not just a model’s inputs, training, or validation data. This is particularly important as lenders turn to more complex models that exhibit “black box” qualities — i.e., where the relationship between the model’s inputs and outputs is opaque or not easily understood. This requires access to demographic data about protected groups, whether inferred indirectly or collected directly by the creditor. The Bureau should consider creating new datasets or methodologies for measuring disparate impact. This might include, for example, improving on the Bayesian Improved Surname Geocoding (BISG) methodology. The CFPB should consider new supervisory guidance for models, revisions to exam manuals, and commentary to Regulation B to achieve these goals.
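For illustration only, the sketch below shows the basic arithmetic behind a BISG-style proxy: the probability of each race or ethnicity given an applicant’s surname is combined, via Bayes’ rule, with the demographic composition of the applicant’s geography. The probability tables here are hypothetical, and the code is a minimal sketch rather than the CFPB’s actual implementation.

```python
# Minimal sketch of a BISG-style proxy calculation (hypothetical inputs).
# BISG combines the probability of each race/ethnicity given a surname with
# the demographic composition of the person's geography, via Bayes' rule:
#   P(race | surname, geography) ∝ P(race | surname) * P(geography | race)
# In practice, P(geography | race) is derived from Census block-group counts.

GROUPS = ["white", "black", "hispanic", "asian"]

# Hypothetical P(race | surname) from Census surname tables.
p_race_given_surname = {"white": 0.60, "black": 0.25, "hispanic": 0.10, "asian": 0.05}

# Hypothetical share of each group's national population living in this
# particular block group, i.e. P(geography | race).
p_geo_given_race = {"white": 0.0001, "black": 0.0009, "hispanic": 0.0003, "asian": 0.0002}

def bisg_proxy(p_surname, p_geo):
    """Return normalized posterior probabilities for each group."""
    unnormalized = {g: p_surname[g] * p_geo[g] for g in GROUPS}
    total = sum(unnormalized.values())
    return {g: value / total for g, value in unnormalized.items()}

if __name__ == "__main__":
    posterior = bisg_proxy(p_race_given_surname, p_geo_given_race)
    for group, prob in sorted(posterior.items(), key=lambda kv: -kv[1]):
        print(f"{group}: {prob:.2f}")
```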
Agencies should also help develop industry practices for identifying and adopting underwriting processes with minimal adverse impact. This is an area of great potential, but with few established standards or practices. For example, using new modeling techniques, creditors can sometimes discover more equitable models without significant loss of overall model quality. Creditors should proactively explore these tradeoffs, and adopt alternative models to reduce adverse impact where feasible. The CFPB can lead the way on this issue by offering new policy guidance, leading workshops, and encouraging the development of methodologies and techniques.
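One way such a search for less discriminatory alternatives could be structured is sketched below: train a family of candidate models, then compare each candidate on both predictive performance and a disparity metric such as the adverse impact ratio (the ratio of approval rates between a protected group and a reference group). The data are synthetic, the candidate set and approval threshold are hypothetical, and scikit-learn is assumed only for illustration; this is not a prescribed methodology.

```python
# Simplified sketch: evaluate a family of candidate credit models on both
# (a) predictive performance (AUC) and (b) adverse impact ratio, i.e. the
# ratio of approval rates between a protected group and a reference group.
# Data are synthetic; a real fair lending analysis would use actual outcomes
# and appropriate demographic data or proxies.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
group = rng.integers(0, 2, size=n)             # 0 = reference, 1 = protected group
income = rng.normal(50 - 5 * group, 10, n)     # synthetic, group-correlated feature
utilization = rng.normal(0.4 + 0.05 * group, 0.1, n)
repaid = (rng.random(n) < 1 / (1 + np.exp(-(0.05 * income - 2 * utilization)))).astype(int)

X = np.column_stack([income, utilization])
X_tr, X_te, y_tr, y_te, g_tr, g_te = train_test_split(X, repaid, group, random_state=0)

def adverse_impact_ratio(approved, groups):
    """Approval rate of the protected group divided by that of the reference group."""
    return approved[groups == 1].mean() / approved[groups == 0].mean()

# Candidate models: here, just different regularization strengths; in practice
# the search space could include feature sets, hyperparameters, or constraints.
for C in [0.01, 0.1, 1.0, 10.0]:
    model = LogisticRegression(C=C).fit(X_tr, y_tr)
    scores = model.predict_proba(X_te)[:, 1]
    approved = (scores >= np.quantile(scores, 0.5)).astype(int)  # approve top half
    auc = roc_auc_score(y_te, scores)
    air = adverse_impact_ratio(approved, g_te)
    print(f"C={C:>5}: AUC={auc:.3f}  adverse impact ratio={air:.2f}")
```

A creditor conducting this kind of comparison would then document why a candidate with materially lower disparity was or was not adopted.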
2. Agencies should encourage the use of alternative data for underwriting that is voluntarily provided by consumers and has a clear relationship to their ability to repay a loan.
All agencies should encourage the use of alternative data for credit underwriting where such data are voluntarily provided by consumers and have a clear relationship with those consumers’ ability to repay. This might involve new research about the suitability of different kinds of alternative data, regulatory guidance, policy statements, or rulemakings.
Traditional credit history scores reflect immense racial disparities due to extensive historical and ongoing discrimination. Black and Latinx consumers are less likely to have credit scores in the first place, limiting their access to financial services. There is an obvious need for better, fairer, and more inclusive measures of creditworthiness.
New data sources can help. But caution is in order: Not all kinds of data will lead to more equitable outcomes, and some can even introduce their own new harms. Fringe alternative data such as online searches, social media history, and colleges attended can easily become proxies for protected characteristics, may be prone to inaccuracies that are difficult or impossible for impacted people to fix, and may reflect longstanding inequities. On the other hand, recent research indicates that more traditional alternative data such as cash flow data holds promise for helping borrowers who might otherwise face constraints on their ability to access credit. For example, a recent Interagency Statement observed that “[c]ash flow data are specific to the borrower and generally derived from reliable sources, such as bank account records, which may help ensure the data’s accuracy. Consumers can expressly permit access to their cash flow data, which enhances transparency and consumers’ control over the data.”
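As a concrete, hypothetical illustration of what cash flow data can look like in practice, the sketch below derives simple underwriting features (average monthly net inflow and the number of months with negative net cash flow) from permissioned bank account transactions. The feature names and values are illustrative assumptions, not drawn from the Interagency Statement or any regulator’s guidance.

```python
# Hypothetical illustration of cash flow features a lender might derive from
# consumer-permissioned bank account records. All values are made up.
from statistics import mean
from collections import defaultdict

# (month, amount) pairs: positive = inflow, negative = outflow
transactions = [
    ("2021-01", 2400.0), ("2021-01", -2100.0),
    ("2021-02", 2400.0), ("2021-02", -2350.0),
    ("2021-03", 2500.0), ("2021-03", -2000.0),
]

monthly_net = defaultdict(float)
for month, amount in transactions:
    monthly_net[month] += amount

features = {
    "avg_monthly_net_inflow": mean(monthly_net.values()),
    "months_with_negative_net": sum(1 for v in monthly_net.values() if v < 0),
}
print(features)
```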
3. Agencies should clarify standards for Special Purpose Credit Programs, which could help address legacies of discrimination and encourage creation of less discriminatory credit models.
Congress has provided for a range of credit programs that are “specifically designed to prefer members of economically disadvantaged classes” and “to increase access to the credit market by persons previously foreclosed from it.” These “Special Purpose Credit Programs” (SPCPs) allow the consideration of a prohibited basis such as race, national origin, or sex under particular circumstances, without violating ECOA’s general anti-discrimination mandates. SPCPs have the potential to help address legacies of discrimination, and can aid in the development of fairer financial technologies that explicitly consider protected class status.
We urge relevant federal agencies and departments to encourage use of SPCPs by clarifying standards and reducing risk of liability for lenders. The CFPB should take the lead in ensuring that its SPCP guidance under ECOA is consistent with the approach that other federal regulators, the Department of Justice, and the state attorneys general are taking in enforcing other fair lending laws.
4. The CFPB should issue new, modernized guidance for financial services advertising.
For years, creditors have known that new digital advertising technologies, including a vast array of targeting techniques, might result in illegal discrimination. Moreover, recent empirical research has shown that advertising platforms themselves can introduce significant skews on the basis of race, gender, or other protected group status through the algorithms they use to determine delivery of advertisements — even when advertisers target their advertisements broadly. The Department of Housing and Urban Development has alleged such practices violate the Fair Housing Act.
The CFPB should issue new guidance on advertising and discrimination for creditors under Regulation B. Even as the Bureau clarifies what kinds of modern marketing practices might violate the ECOA, it should also expand on ways that creditors can affirmatively reach out to underserved populations.
5. The CFPB should revise and reincorporate the underwriting provisions of its 2017 payday lending rule.
Predatory lenders thrive online, targeting poor and vulnerable consumers — and especially people of color — wherever they live. In July of 2020, the CFPB rescinded the mandatory underwriting provisions of its payday lending rule, removing critical requirements that payday lenders verify borrowers’ ability to repay. The CFPB should work quickly to reverse these misguided policy changes.
Thank you for your attention to these matters. For any questions or further discussion, please contact Aaron Rieke, Managing Director, Upturn, at 202-677-2359 or aaron@upturn.org.
1. 12 C.F.R. pt. 1002, Supp. I, § 1002.6(a)-2.
2. Some of the undersigned have submitted a detailed response to the Request for Information and Comment on Financial Institutions’ Use of Artificial Intelligence, including Machine Learning issued by the financial regulatory institutions, which provides more detailed recommendations.
3. See, e.g., Nicholas Schmidt and Bryce Stephens, An Introduction to Artificial Intelligence and Solutions to the Problems of Algorithmic Discrimination, November 2019, available at https://arxiv.org/ftp/arxiv/papers/1911/1911.05755.pdf.
4. See Miranda Bogen, Aaron Rieke, and Shazeda Ahmed, Awareness in Practice: Tensions in Access to Sensitive Attribute Data for Antidiscrimination, December 2019, https://arxiv.org/abs/1912.06171.
5. See, e.g., CFPB, Using publicly available information to proxy for unidentified race and ethnicity, 2014, https://www.consumerfinance.gov/data-research/research-reports/using-publicly-available-information-to-proxy-for-unidentified-race-and-ethnicity/.
6. Id.
7. 12 C.F.R. pt. 1002.
8. See, e.g., National Community Reinvestment Coalition, Response to Request for Information on the Equal Credit Opportunity Act; Docket No. CFPB-2020-0026, December 2020, available at https://ncrc.org/download/85646/.
9. See, e.g., National Consumer Law Center, Past Imperfect: How Credit Scores and Other Analytics “Bake In” and Perpetuate Past Discrimination, May 2016, https://www.nclc.org/images/pdf/credit_discrimination/Past_Imperfect050616.pdf.
10. CFPB Office of Research, Data Point: Credit Invisibles, May 2015, https://files.consumerfinance.gov/f/201505_cfpb_data-point-credit-invisibles.pdf.
11. Chi Chi Wu, Reparations, Race, and Reputation in Credit: Rethinking the Relationship Between Credit Scores and Reports with Black Communities, August 7, 2020, https://medium.com/@cwu_84767/reparations-race-and-reputation-in-credit-rethinking-the-relationship-between-credit-scores-and-852f70149877.
12. See Testimony of Aaron Rieke Before the Task Force on Financial Technology, United States House Committee on Financial Services, July 25, 2019, available at https://www.congress.gov/116/meeting/house/109867/witnesses/HHRG-116-BA00-Wstate-RiekeA-20190725.pdf.
13. See, e.g., FinRegLab, The Use of Cash-Flow Data in Underwriting Credit: Empirical Research Findings, July 2019, https://finreglab.org/cash-flow-data-in-underwriting-credit-empirical-research-findings.
14. Interagency Statement on the Use of Alternative Data in Credit Underwriting, https://files.consumerfinance.gov/f/documents/cfpb_interagency-statement_alternative-data.pdf.
15. S. Rept. 94-589, 94th Cong., 2nd Sess., at 7, reprinted in 1976 U.S.C.C.A.N. 403, 409.
16. CFPB, Advisory Opinion on Special Purpose Credit Programs, December 12, 2020, https://www.consumerfinance.gov/rules-policy/final-rules/advisory-opinion-on-special-purpose-credit-programs/.
17. See NCLC, Doing Special Purpose Credit Programs Right: Why Programs to Assist Black Communities Should Avoid Conventional Use of Traditional Credit Scores, February 2021, https://www.nclc.org/images/pdf/credit_reports/IB_SPCP_Credit_Scores.pdf; Stephen Hayes, Special Purpose Credit Programs, February 2021, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3749610.
18. See, e.g., Penny Crosman, ‘Black box’ problem hampers banks’ online marketing, American Banker, January 21, 2018, https://www.americanbanker.com/payments/payments/news/the-black-box-problem-should-financial-institutions-steer-clear-of-tools-like-facebooks-lookalike-audiences.
19. See id.; Ali et al., Discrimination through Optimization: How Facebook’s Ad Delivery Can Lead to Biased Outcomes, November 2019, https://dl.acm.org/doi/10.1145/3359301.
20. Department of Housing and Urban Development, Charge of Discrimination, March 2019, https://www.hud.gov/sites/dfiles/Main/documents/HUD_v_Facebook.pdf.
21. See, e.g., Upturn, Led Astray: Online Lead Generation and Payday Loans, October 2015, available at https://www.upturn.org/reports/2015/led-astray/.
Related Work
Labor and Employment: We sent a memo on technology’s role in hiring discrimination to agency leaders within the Biden administration.
Housing: We sent a memo to agency leaders in the Biden administration on technology’s role in housing discrimination.
Across the Field: We sent a letter urging the White House Office of Science & Technology Policy to fully incorporate the Biden administration’s commitment to racial equity into its AI and technology priorities.
Credit and Finance: We explain how online lead generation works, describe the risks and legal complexities specific to lead generation for online payday loans, document the widespread use of search ads by payday lead generators, and recommend interventions.