We submitted the following comments to the Consumer Financial Protection Bureau in response to their Request for Information Regarding Data Brokers and Other Business Practices Involving the Collection and Sale of Consumer Information (published on March 21, 2023). We submitted these comments via regulations.gov on July 14, 2023.
RE: CFPB Request for Information Regarding Data Brokers (Docket No. CFPB-2023-0020)
We write to provide comments in response to the Consumer Financial Protection Bureau (CFPB or Bureau)’s Request for Information (RFI) Regarding Data Brokers and Other Business Practices Involving the Collection and Sale of Consumer Information, published on March 21, 2023.
Upturn is a non-profit organization that advances equity and justice in the design, governance, and use of technology. Through research and advocacy, we drive policy change by investigating specific ways that technology and automation shape people’s opportunities, particularly in historically disadvantaged communities.
Our comments draw upon our recent research and advocacy in two issue areas: labor and employment, and housing. Specifically, we aim to provide the Bureau with some concrete examples of how the data broker industry impacts access to jobs and housing in the United States. Our comments, therefore, are geared toward the Market-Level Inquiries set out in the RFI.
Background: The Trouble with Data Brokers
The size and scope of the data broker industry has exploded over the past decade. This year, hundreds of companies are expected to generate billions of dollars in revenue. The data broker industry has also changed significantly as the Internet, online services, and other technological advancements have allowed companies to aggregate data about people much more efficiently.
As the Bureau notes in its RFI, “[d]ata brokers collect or share a vast range of information, often building profiles of individuals by delving into the details of consumers’ everyday interactions, including credit card purchases and web browsing activity.” They “also collect other types of sensitive and intimate personal information such as genetic and health information, religious affiliation, financial records, and geolocation data.” The FTC has explained that “[w]hile each data broker source may provide only a few data elements about a consumer’s activities, data brokers can put all of these data elements together to form a more detailed composite of the consumer’s life.”
One of the core products data brokers offer is background checks — composite reports, compiled from a variety of sources, which are then used to evaluate people for jobs, housing, and other important life opportunities. The background check industry, like other aspects of the data broker industry, involves “multiple layers of data brokers providing data to each other.” Data brokers that sell background check reports to end users (like employers or landlords) obtain their records from other data brokers, such as those that maintain (and sell access to) court records databases. Those data brokers, in turn, obtain their records from networks of smaller data brokers, such as contractors who specialize in collecting records from local court systems. Some background check providers are subsidiaries of larger data brokers, such as LexisNexis and TransUnion.
The data broker industry has aggressively marketed background screening products to housing providers and employers as indispensable tools for ensuring safety and legal compliance. Their business model has been supported by permissive legal structures, such as draconian war-on-drugs-era laws requiring certain criminal background screening for housing. As a result, background checks have become pervasive barriers to economic opportunity, locking people out of stable housing and jobs, compounding economic insecurity, and exposing people to further criminalization. Yet, despite these significant impacts, the background check industry is extremely opaque and overwhelming for consumers, who often have no idea what records data brokers have compiled about them (much less how those records have been used against them).
The ubiquity and underregulation of the background check industry leave consumers vulnerable to a variety of harms and abuses. Poor people, immigrants, people with disabilities, and BIPOC experience these harms disproportionately.
The Records Used for Background Checks Are Inaccurate and Misleading
Given that inaccuracies in background checks can wrongfully deprive people of important life opportunities, such as jobs and housing, ensuring the accuracy of such reports is a foundational goal of the Fair Credit Reporting Act (FCRA). See, e.g., Guimond v. Trans Union Credit Info. Co., 45 F.3d 1329, 1333 (9th Cir. 1995) (“[The FCRA] was crafted to protect consumers from the transmission of inaccurate information about them and to establish credit reporting practices that utilize accurate, relevant, and current information in a confidential and responsible manner.”). FCRA thus requires companies that prepare background checks to “follow reasonable procedures to assure maximum possible accuracy of the information” within reports on consumers. 15 U.S.C. § 1681e(b).
Today, thousands of background screening companies operate a multi-billion-dollar industry with insufficient government oversight, regulation, or enforcement. The companies actively assemble private and proprietary databases, often with the assistance of third-party vendors. The data contained in these databases is generally purchased in bulk, often through intermediaries, from a variety of sources: law enforcement agencies, state courts, corrections offices, criminal record repositories, or public websites (via web scraping technology). Data brokers apply “matching” criteria to the data in their possession when running background checks; such criteria, however, are often insufficient to prevent falsely associating individuals with records that do not, in fact, belong to them. Indeed, data brokers often favor an overinclusive approach that results in errors. Additionally, the data amassed in these databases may be infrequently or sporadically acquired and updated, leading to stale, outdated information appearing on background reports.
Common examples of recurring errors and inaccuracies include: matching (and ultimately reporting) the wrong person’s criminal record or eviction record (a “mismatch” or false positive); misclassifying the severity of an offense (e.g., reporting misdemeanors as felonies, and infractions as misdemeanors); reporting incomplete information (e.g., omitting court action subsequent to arrest or conviction, or after an eviction action has been filed); reporting sealed, expunged, or obsolete records; and reporting a single incident with multiple criminal charges as separate incidents.
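To make the matching problem concrete, the following sketch (using invented names and records, purely for illustration) shows how matching on only two identifiers, such as name and date of birth, can sweep a stranger's record into an applicant's report, while requiring a third identifier to agree avoids the false positive:

```python
# Illustrative sketch with hypothetical data: why loose "matching" criteria
# can attach the wrong person's court record to a background check report.
from dataclasses import dataclass

@dataclass(frozen=True)
class CourtRecord:
    name: str
    dob: str        # date of birth, YYYY-MM-DD
    ssn_last4: str  # last four digits of SSN, when the source includes it

# Two records for people who share a common name and a birth date.
RECORDS = [
    CourtRecord("Maria Garcia", "1985-03-12", "1234"),  # belongs to a stranger
    CourtRecord("Maria Garcia", "1985-03-12", "9876"),  # belongs to the applicant
]

def match_two_identifiers(name, dob):
    """Over-inclusive matching: name and date of birth only."""
    return [r for r in RECORDS if r.name == name and r.dob == dob]

def match_three_identifiers(name, dob, ssn_last4):
    """Stricter matching: a third identifier must also agree."""
    return [r for r in RECORDS
            if r.name == name and r.dob == dob and r.ssn_last4 == ssn_last4]

# With a common name, two identifiers pull in a stranger's record:
loose = match_two_identifiers("Maria Garcia", "1985-03-12")
strict = match_three_identifiers("Maria Garcia", "1985-03-12", "9876")
print(len(loose))   # 2 -- includes a false positive
print(len(strict))  # 1 -- only the applicant's own record
```

The toy example understates the real problem: production matching typically runs over millions of records, name "clustering" makes common-name collisions routine, and many underlying sources omit the third identifier entirely.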
Background checks today commonly include interpretations or conclusions about the risk of accepting an applicant. Tenant screening reports, for example, often present a three-digit score or a recommendation about whether a rental applicant is qualified. These recommendations are based on the same flawed data described above, and are themselves misleading and inaccurate.
These inaccuracies have a disparate and devastating impact on Black and Latine people and other communities of color. As a result of over-policing, Black and Latine people are disproportionately likely to have an arrest or conviction record and consequently to face exclusion from jobs and housing based on those records. And inaccuracies stemming from name mismatches disparately impact racialized communities, who face a higher likelihood of such mismatches due to “clustering” of common surnames.
Even when records are accurate, they are poor predictors of employment or tenancy outcomes. As discussed in greater detail below, the records commonly included in background checks — criminal, credit, and civil court/eviction records — contain little to no valid information about someone’s likelihood of being a good employee or tenant.
Background Checks Limit Access to Jobs
Background checks can be a significant barrier to employment for those who need it most. They often include credit reports and criminal records, which reflect racial discrimination in the financial and criminal legal systems. Even a request for consent to conduct a background check can dissuade job applicants with poor credit histories or criminal records from submitting an application. As explained, background check reports are also notoriously plagued with errors, such as records matched to the wrong person and inaccurate case dispositions. And researchers have found that criminal histories or low credit scores do not predict job performance.
In 2021, Upturn published a report called Essential Work: Analyzing the Hiring Technologies of Large Hourly Employers. We conducted empirical research about the technologies that applicants for low-wage hourly jobs encounter each day, submitting online applications to fifteen employers and scrutinizing each process. We found that many such employers used an Applicant Tracking System (ATS) to administer a range of selection procedures, including screening questions and psychometric tests. Most employers we examined used an ATS capable of integrating with a range of background screening vendors, including those providing social media screens, criminal background checks, credit checks, drug and health screenings, and I-9 and E-Verify checks. As applicants, however, we had no way of knowing which, if any, background check systems were used to evaluate our applications. Employers provided no meaningful feedback or explanation when an offer of work was not extended. Thus, a job candidate subjected to a background check may have no opportunity to contest the data or conclusions derived therefrom.
Background check companies “are quick to cite individuals’ ‘consent’ in these kinds of situations . . . but a market and regulatory environment in which workers are essentially powerless to the sharing and monetization of their own information is not one in which that consent is fully informed and freely given.”
Some regulators have recognized and acted on these pressing concerns. Several states have enacted legislation to prohibit or limit pre-employment criminal background and credit checks. But far more can and should be done to protect job seekers from the vagaries of the background check industry.
Background Checks Limit Access to Housing
Almost every renter must go through a background check process (often called “tenant screening”) to access rental housing. The costs of these background checks are passed on to rental applicants as non-refundable application fees, and renters sometimes pay hundreds of dollars in fees for a single housing search. Tenant screening companies and landlords rely heavily on eviction, credit, and criminal records to make housing decisions — all three of which reflect structural discrimination, are frequently inaccurate, and lack reliable information related to tenancy outcomes. The costs of tenant screening — in terms of application fees and in terms of access to stable housing — disproportionately impact renters across all protected classes, and especially Black renters and renters who receive subsidized housing assistance.
Tenant screening reports drive housing insecurity and discrimination by entrenching criminal, credit, and eviction histories as universal barriers to housing. Tenant screening companies further exacerbate this effect by using automated systems to generate eligibility determinations, such as three-digit scores, risk assessments, predictions about tenancy outcomes, and recommendations to landlords about whether a rental applicant is qualified. Housing background checks undermine government policies and funding efforts to improve fair and equitable access to affordable housing. For example, many housing vouchers go unused because voucher holders are consistently screened out of housing opportunities due to records in their backgrounds, even though their voucher directly demonstrates their ability to pay rent.
The CFPB must take immediate action to protect consumers from the documented harms of the data broker industry. In addition to updating existing regulations and promulgating new ones that better reflect the current state of the market, the Bureau should issue advisory opinions on pressing questions related to consumer privacy and consumer rights vis-à-vis data brokers.
The CFPB must continue to emphasize that data broker companies are, in fact, bound by the FCRA, including the Act’s privacy protections. The Bureau should undertake enforcement actions through which data brokers are required to delete all wrongfully obtained data, make monetary restitution to those harmed, and otherwise ensure consumers are “made whole.” Likewise, the CFPB must emphasize that the FCRA’s privacy protections do not have any law enforcement “exception,” thus preventing data brokers from selling data to law enforcement without adhering to the Act’s protective requirements.
The CFPB can also clarify that “credit header data” (which can include an individual’s address, date of birth, social security number, and phone number(s)) may not be sold to third parties.
Finally, two common data broker practices raise substantial accuracy concerns: (1) matching and reporting court records, including criminal and eviction records, and (2) relying on automated processes to retrieve and match records. Given that the evidence on reporting errors overwhelmingly suggests that background check companies simply cannot meet reasonable accuracy requirements when reporting court records, the CFPB should clarify that these record matching practices do not meet reasonable accuracy standards under the FCRA. The Bureau and the “Big Three” nationwide consumer reporting agencies have asserted that at least three identifiers must be used to match people with records. No data broker should be permitted to adhere to lower standards (e.g., to rely on only two identifiers, such as name and address). Likewise, the CFPB should clarify that relying on automated processes to match people with records and include those records on consumer reports violates the FCRA. Fully automated matching, without manual accuracy verification, allows errors to proliferate unchecked. The Bureau can and must protect consumers from these harms.