January 15, 2022
Across the Field

Comments on OSTP's Biometrics Technologies RFI

Emily Paul and Harlan Yu

Comments

We submitted comments in response to the Office of Science and Technology Policy’s request for information on public and private sector uses of biometric technologies. The data-driven technologies (including biometric tech) used by powerful institutions to shape key decisions about people's lives often mirror and exacerbate historical racial and economic disparities in housing, employment, public benefits, the criminal legal system, and other areas of opportunity and wellbeing.



Office of Science and Technology Policy

Submitted via email to BiometricRFI@ostp.eop.gov

RE: Request for Information on Public and Private Sector Uses of Biometric Technologies (FR Doc. 2021-21975)

Thank you for the opportunity to respond to this Request for Information on Public and Private Sector Uses of Biometric Technologies. Upturn is a research and advocacy group that works to advance equity and justice in the design, governance, and use of technology.

We write in support of the Office of Science and Technology Policy’s efforts to protect people’s fundamental rights and opportunities as powerful institutions continue to use data-driven technologies to shape key decisions about people’s lives. These technologies, which include biometric technologies, often mirror and exacerbate historical racial and economic disparities in housing, employment, public benefits, the criminal legal system, and other areas of opportunity and wellbeing.

Across these areas, technologies are often used to make political decisions that can substantially affect people’s material conditions, especially in the absence of careful attention and government regulation. Over the past few decades, these technologies have undermined existing legal protections, including longstanding civil rights protections that have not kept pace with technology.

1. Biometrics are just one type of technology that is shaping people’s rights and opportunities and deepening existing racial, economic, and other social disparities.

Biometric technologies are among the latest in a long line of technologies that purport to measure people’s attributes and predict future behavior, often with serious consequences. For decades, both governments and the private sector have used digital technologies to help determine people’s access to social resources, such as housing and government benefits; economic opportunities, including jobs, credit, and education; and basic autonomy and wellbeing, including healthcare and public safety.

Today’s terminology places all of these technologies in the frame of “AI,” which confers more complexity and novelty than the issues often deserve. The most consequential technologies that are affecting people’s rights and shaping their opportunities today are often not new — and the problems that these technologies exacerbate, such as racial, gender, disability, and other forms of discrimination and inequities, are longstanding. For instance, statistical risk assessment tools that states are adopting today for pretrial release decisions date back to at least the 1990s. Consumer credit scoring algorithms, like FICO, emerged in the 1980s.

The same concerns that animate today’s call for a new Bill of Rights for an “AI-powered world” were raised during the Obama administration, under the frame of “big data,” which was the fashionable term at the time. While biometric technologies, particularly recent applications of face recognition, may be an attractive starting point, this administration must consider the impact of a broader scope of technologies and data practices, most of which are not biometrics or AI.

Consider a job applicant who is applying online for an hourly job. Many large employers in the U.S. now use multipurpose “applicant tracking systems” to manage their hiring processes, which often include background checks and a variety of online skills and personality screening tests. Some personality tests used in this context purport to assess people’s trustworthiness and other traits, but in ways that reflect racist and ableist assumptions and anti-union motivations. While these aren’t complex technologies, they are among the ones that regulators like the Equal Employment Opportunity Commission should center in any examination of hiring discrimination and technology. To be sure, some vendors, like HireVue, have sought to introduce face or voice analysis technologies into employers’ interviewing processes, but the practical impact of these applications today remains quite limited.

Similarly, in other areas, many well-entrenched technology and data practices continue to have adverse impacts on Americans’ everyday lives: the use of eviction and criminal records in tenant screening tools, increased digital tracking of families in the child welfare system and of workers in home care, law enforcement searches of people’s cellphones, and so on. Practices and systems like these have harmed people for decades — scrutiny cannot be limited only to emerging tools like biometrics or AI.

2. It’s inadequate to address the harms of technology by examining technology in isolation. It’s vital to consider the broader social, political, and historical context in which technology is used.

Technology tends to amplify structural power — and technology’s impact depends not only on its design, but also on the broader social, political, and historical context in which it is used. While work to assess the statistical validity of a technology may provide important technical guideposts, additional perspectives are needed to more fully evaluate the potential effects of technology in various social contexts.

As a case in point, researchers and government agencies have worked to assess racial and gender disparities in popular face recognition programs. These studies have been indispensable to understanding these programs’ flaws. But even a technically “perfect” face recognition system would still perpetuate many social harms, including the harms of increased surveillance. This is why, last year, over 40 civil society organizations called for an end to law enforcement’s use of face recognition. Due to the long history of racial discrimination and abuse by law enforcement in the United States, which continues to this day, the organizations concluded that “in the context of policing, face recognition is always dangerous—no matter its accuracy.”

In other cases, the use of a technology may benefit some people while at the same time harming others. For example, the use of face recognition to verify the identities of people applying for unemployment benefits may speed up the process for those who have easy access to smartphones and for whom the software works, while creating barriers for others. Technology can also shift and widen power imbalances, such as when landlords install face recognition to control access to their buildings. The problems here are not only about the technology’s accuracy or validity, but are largely tied to existing social inequities and harms that technology further amplifies.

For these reasons, policy debates about the merits of certain technologies need to be rooted in particular social contexts, not in a vacuum. To that end, Upturn, ACLU, the Leadership Conference on Civil and Human Rights, and a coalition of other organizations recently urged relevant federal agencies to step up their regulatory and enforcement activities to specifically address technology’s role in discrimination in housing, hiring, and financial services.

3. Transparency about whether and how technologies are used is a necessary baseline for accountability.

Too often, it’s difficult or impossible for researchers, advocates, investigative journalists, and communities to interrogate and challenge the use of technologies. While transparency alone will not mitigate the harms, it is an important baseline upon which people can begin to ask questions about how technologies are used and the potential ways they create or exacerbate inequities.

One way that technology shifts power is through opacity. While opacity is often attributed to the complex nature of new technologies, such as machine learning models, it is often also created or furthered through legal and policy choices that put corporate interests above people’s fundamental rights.

For instance, claims of trade secrecy have prevented criminal defendants from scrutinizing evidence created by potentially flawed probabilistic DNA analysis software used by law enforcement. At least two courts have ordered disclosure of the software’s source code to uphold the constitutional rights of criminal defendants to confront the evidence against them. Such trade secrecy claims have been made not only by private vendors like TrueAllele, but also by government agencies seeking to shield their decision-making tools from independent scrutiny. In a similar vein, private vendors and government agencies have used non-disclosure agreements to hide the mere fact that certain technologies are in use.

At the state level, one step forward has been Illinois’s Biometric Information Privacy Act (BIPA), which requires companies to provide disclosure and obtain individual consent before collecting and using biometric information, and prohibits companies from selling or further sharing biometric data without consent. While notice-and-consent can place undue burdens on individuals and may be insufficient to address systemic harms, BIPA gave rise to a number of high-profile class action lawsuits and settlements seeking to control how biometrics are used.

4. Regulators and enforcement agencies must actively measure, audit, and address systemic discrimination where technologies are used, and consider non-technological alternatives.

Inferential and other predictive technologies make probabilistic guesses, and they inevitably make mistakes. They also often fail for more prosaic reasons, such as inequities in access to or familiarity with smartphones and other required technologies. When these technologies are used to mediate high-stakes decisions, such as determining access to crucial government services and benefits, these failures are not only frustrating and time-consuming but, in some cases, life-threatening. Even when these technologies work, they can introduce friction and rigidity into processes, ultimately hindering people’s access to vital resources and opportunities. These barriers disproportionately harm people of color, poor people, disabled people, and others.

During the pandemic, as millions of workers sought unemployment benefits, many states began to adopt face recognition tools to verify people’s identities. But this created significant burdens for many who either did not have access to smartphones, or for whom the software failed to match their identity. Many were then required to wait on hold for hours to resolve issues and some — without alternate options or timely redress — ended up abandoning the process altogether in frustration, giving up on the benefits that they deserved to receive. Importantly, because of existing disparities across race, class, and geography in access to smartphones and broadband internet, these burdens too often fell on those who were most vulnerable and most in need of benefits.

In another context, the growing use of e-proctoring software — from K-12 classrooms to bar examinations — has introduced systems that often fail to verify the identities of Black students and other students of color, arbitrarily and unfairly flag some students for cheating, and impose rigid behavioral rules that punish students for getting up to use the bathroom or looking around the room. While these are problems for any student, such software can impose much worse effects on disabled students, effects “which can also exacerbate underlying anxiety and trauma.” Black students and other students of color, who are already more likely to face punishment in school, are especially vulnerable to the long-lasting negative effects of increased monitoring.

Rarely are there alternatives that allow students or unemployed people to opt out of these mainstream processes and avoid the coercive effects of technology. These are systemic harms that require systemic interventions, but it’s often difficult for individuals who encounter harms to show broader discriminatory patterns. Regulators and enforcement agencies must play a stronger and more active role in assessing whether technologies are exacerbating existing inequities in key areas of justice and opportunity. One way to do this is by using demographic data to measure and audit systems for disparate impact. These are longstanding civil rights enforcement measures that can also be used to assess the impact of new technologies.

Conclusion

These are urgent issues that the Biden administration must address. In July 2021, Upturn wrote a letter to the Office of Science and Technology Policy (OSTP), together with 26 other groups, urging OSTP to work across the federal government to “identify how technology can drive racial inequities, and help agencies devise new policies, regulations, enforcement activities, and guidance that address these barriers.” Attached to the letter were three memos sent to federal agencies outlining concrete recommendations to address technology’s role in housing, hiring, and financial services discrimination. While some progress has been made at the agency level, much more remains to be done. OSTP must work to support the administration in developing a proactive and coordinated policy agenda to tackle these challenges.

Thank you for considering these comments. We welcome further conversations on these important issues. If you have any questions, please contact Emily Paul (Project Director, emily@upturn.org) and Harlan Yu (Executive Director, harlan@upturn.org).



1

An investigation of 2019 mortgage lending data found, for example, that Black people applying for loans were 80% more likely to be denied than white applicants. See Martinez, Emmanuel and Lauren Kirchner. “The Secret Bias Hidden in Mortgage-Approval Algorithms.” The Markup. (Aug. 25, 2021). www.themarkup.org/denied/2021/08/25/the-secret-bias-hidden-in-mortgage-approval-algorithms.

2

In 2017, Amazon ended development of a machine learning tool for scoring job applicants after realizing that it penalized resumes that included the word “women’s” or that listed all-women’s colleges. See Dastin, Jeffrey. “Amazon scraps secret AI recruiting tool that showed bias against women.” Reuters. (Oct. 10, 2018). https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G. An audit of another resume screening tool revealed that two of the factors identified by the model as predictive of good job performance were having the name Jared and playing high school lacrosse. See Gershgorn, Dave. “Companies are on the hook if their hiring algorithms are biased.” Quartz. (Oct. 22, 2018). https://qz.com/1427621/companies-are-on-the-hook-if-their-hiring-algorithms-are-biased/.

3

McCormick, Erin. “What happened when a ‘wildly irrational’ algorithm made crucial healthcare decisions.” The Guardian. (July 2, 2021). https://amp.theguardian.com/us-news/2021/jul/02/algorithm-crucial-healthcare-decisions.

4

Robinson, David and Logan Koepke. “Stuck in a Pattern.” Upturn. (Aug. 31, 2016). https://www.upturn.org/work/stuck-in-a-pattern/.

5

For example, in 2017, Immigration and Customs Enforcement (ICE) quietly changed its risk assessment tool so that it no longer made any recommendations to release people awaiting deportation hearings. Jose L. Velesaca v. Chad Wolf et al. United States District Court for the Southern District of New York. 1:20-cv-01803. Feb 28, 2020. https://www.nyclu.org/en/cases/jose-l-velesaca-v-chad-wolf-et-al.

6

See, e.g., Solon Barocas and Andrew Selbst. “Big Data’s Disparate Impact.” 104 Calif. L. Rev. 671. (2016). https://www.californialawreview.org/print/2-big-data.

7

For example, COMPAS, a hotly contested tool that many states have adopted to inform pretrial release decisions, was first developed in 1998. VPRAI, another widely-used pretrial risk assessment tool, was first developed in 2003. See https://pretrialrisk.com/the-basics/common-prai/.

8

Hill, Adriene. “A brief history of the credit score.” Marketplace. (Apr. 22, 2014). https://www.marketplace.org/2014/04/22/brief-history-credit-score/.

9

See “Big Data: Seizing Opportunities, Preserving Values.” Executive Office of the President. (May 2014). https://obamawhitehouse.archives.gov/sites/default/files/docs/big_data_privacy_report_may_1_2014.pdf; “Big Data: A Report on Algorithmic Systems, Opportunity, and Civil Rights.” Executive Office of the President. (May 2016). https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/2016_0504_data_discrimination.pdf.

10

Rieke, Aaron; Urmila Janardan; Mingwei Hsu; and Natasha Duarte. “Essential Work.” Upturn. (July 6, 2021). https://www.upturn.org/work/essential-work/.

11

Knight, Will. “Job Screening Service Halts Facial Analysis of Applicants.” Wired. (Jan. 12, 2021). https://www.wired.com/story/job-screening-service-halts-facial-analysis-applicants/.

12

Public Hearing on B23-149, Fair Tenant Screening Act of 2019, B23-498, Intersectional Discrimination Protection Amendment Act of 2019, B23-195, Michael A. Stoops Anti-Discrimination Amendment Act of 2019: Council of the District of Columbia Committee on Government Operations. October 27, 2020. (Testimony of Natasha Duarte and Tinuola Dada). https://www.upturn.org/static/files/2020-10-27-testimony-DC-fair-tenant-screening-act.pdf.

13

Roberts, Dorothy. “Child protection as surveillance of African American families.” Journal of Social Welfare and Family Law. Vol 36. (2014). https://www.tandfonline.com/doi/abs/10.1080/09649069.2014.967991.

14

Mateescu, Alexandra. “Electronic Visit Verification: The Weight of Surveillance and the Fracturing of Care.” Data & Society. (Nov. 16, 2021). https://datasociety.net/library/electronic-visit-verification-the-weight-of-surveillance-and-the-fracturing-of-care.

15

Law enforcement uses mobile device forensic tools (MDFTs) to extract and search data on people’s phones. The software includes face recognition capabilities for searching photos stored on the phone. Koepke, Logan; Emma Weil; Urmila Janardan; Tinuola Dada; and Harlan Yu. “Mass Extraction.” Upturn. (Oct. 20, 2020). https://www.upturn.org/work/mass-extraction/.

16

Buolamwini, Joy and Timnit Gebru. “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification.” (2018). http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf; Grother, Patrick; Mei Ngan; and Kayee Hanaoka. “Face Recognition Vendor Test (FRVT) Part 3: Demographic Effects.” NISTIR 8280, National Inst. of Standards and Technology. (December 2019). https://nvlpubs.nist.gov/nistpubs/ir/2019/NIST.IR.8280.pdf.

17

Garvie, Clare; Alvaro Bedoya; and Jonathan Frankle. “The Perpetual Line-Up: Unregulated Police Face Recognition in America.” Georgetown Law Center on Privacy & Technology. (Oct. 18, 2016). https://www.perpetuallineup.org/.

18

New America’s Open Technology Institute, The Leadership Conference on Civil and Human Rights, and Upturn, et al. “Civil Rights Concerns Regarding Law Enforcement Use of Face Recognition Technology.” (June 3, 2021). https://www.newamerica.org/oti/press-releases/civil-society-coalition-releases-statement-of-concerns-regarding-law-enforcement-use-of-face-recognition-technology/.

19

Id.

20

Kenney, Andrew. “’I’m shocked that they need to have a smartphone’: System for unemployment benefits exposes digital divide.” USA Today. (May 2, 2021). https://www.usatoday.com/story/tech/news/2021/05/02/unemployment-benefits-system-leaving-people-behind/4915248001/.

21

Landlords are increasingly using technology to manage their interactions with current and potential tenants, e.g. https://antievictionmappingproject.github.io/landlordtech/. One high-profile example of tenant organizing to resist the use of biometrics happened in New York City in 2019, when the owner of a large rent-stabilized building attempted to install a face recognition system to control access to the building. See Durkin, Erin. “New York tenants fight as landlords embrace facial recognition cameras.” The Guardian. (May 30, 2019) https://www.theguardian.com/cities/2019/may/29/new-york-facial-recognition-cameras-apartment-complex.

22

“Addressing Technology’s Role in Housing Discrimination.” (July 13, 2021). https://www.upturn.org/work/proposals-for-the-biden-administration-to-address-technology-housing/.

23

“Addressing Technology’s Role in Hiring Discrimination.” (July 13, 2021). https://www.upturn.org/work/proposals-for-the-biden-administration-to-address-technologys-role-in-hiring.

24

“Addressing Technology’s Role in Financial Services Discrimination.” (July 13, 2021). https://www.upturn.org/work/proposals-for-the-biden-administration-to-address-technologys-financial.

25

Wexler, Rebecca. “Life, Liberty, and Trade Secrets: Intellectual Property in the Criminal Justice System.” 70 Stanford Law Review 1343. (2018), 1368. https://www.stanfordlawreview.org/print/article/life-liberty-and-trade-secrets/.

26

In February 2021, a New Jersey appeals court ruled that trade secrets can’t be used to limit defense access to source code and other documentation for the DNA software used to analyze evidence in State v. Pickett (https://www.njcourts.gov/attorneys/assets/opinions/appellate/published/a4207-19.pdf?c=0qT). This was an important win in the ongoing fight to stop companies’ intellectual property rights from infringing on defendants’ constitutional rights. Upturn and Harvard’s Cyberlaw Clinic submitted an amicus brief in the case, arguing for the need for independent and adversarial review of the software (https://www.upturn.org/work/amicus-brief-in-new-jersey-v-pickett/). That same month, the U.S. District Court for the Western District of Pennsylvania ordered disclosure of source code for the same software in U.S. v. Ellis (https://storage.courtlistener.com/recap/gov.uscourts.pawd.262237/gov.uscourts.pawd.262237.138.0_1.pdf).

27

New York City’s Office of the Chief Medical Examiner refused to share source code with defendants claiming that the software was “proprietary and copyrighted.” A judge later ordered OCME to disclose the source code and an expert reviewer identified issues in the source code that could affect the software’s assessment of the likelihood that a given person’s DNA is in the mixture. Kirchner, Lauren. “Thousands of Criminal Cases in New York Relied on Disputed DNA Testing Techniques.” (Sept. 4, 2017). ProPublica and New York Times. https://www.propublica.org/article/thousands-of-criminal-cases-in-new-york-relied-on-disputed-dna-testing-techniques.

28

Wessler, Nathan Freed. “Documents in ACLU Case Reveal More Detail on FBI Attempt to Cover Up Stingray Technology.” American Civil Liberties Union. (Sept. 24, 2014). https://www.aclu.org/blog/documents-aclu-case-reveal-more-detail-fbi-attempt-cover-stingray-technology.

29

Biometric Information Privacy Act, 740 ILCS 14 (2008). https://www.ilga.gov/legislation/ilcs/ilcs3.asp?ActID=3004&ChapterID=57

30

One study estimated the time it would take people to read the privacy policies of all the sites they visit at 201 hours per year. Cranor, Lorrie Faith and Aleecia M. McDonald. “The Cost of Reading Privacy Policies.” I/S: A Journal of Law and Policy for the Information Society. 2008 Privacy Year in Review issue. http://www.is-journal.org/ (Accessed at https://lorrie.cranor.org/pubs/readingPolicyCost-authorDraft.pdf). On inadequacies of notice-and-consent see, e.g. Nehf, James P., “The Failure of ‘Notice and Consent’ as Effective Consumer Policy.” (August 21, 2019). https://ssrn.com/abstract=3440816.

31

See ACLU v. Clearview AI. https://www.aclu.org/legal-document/aclu-v-clearview-ai-complaint; and Patel v. Facebook. https://law.justia.com/cases/federal/appellate-courts/ca9/18-15982/18-15982-2019-08-08.html. Facebook settled Patel for $650 million in February 2021. https://www.courthousenews.com/wp-content/uploads/2021/02/facebook-settle-approval-2.26.21.pdf.

32

These mistakes can arise from false matches or non-matches when biometrics are used for identity verification. Other mistakes stem from false assumptions, or a fundamental lack of scientific grounding, when technology attempts to infer demographic traits, behavior, emotional state, or intent.

33

Lyons, Kim. “Facial recognition software used to verify unemployment recipients reportedly doesn’t work well.” The Verge. (June 19, 2021). https://www.theverge.com/2021/6/19/22541427/facial-recognition-software-verify-unemployment-benefits-id-me.

34

One woman in Colorado tried and failed 60 times to take a suitable picture on her older smartphone to verify her identity. See Kenney, Andrew. “’I’m shocked that they need to have a smartphone’: System for unemployment benefits exposes digital divide.” USA Today. (May 2, 2021). https://www.usatoday.com/story/tech/news/2021/05/02/unemployment-benefits-system-leaving-people-behind/4915248001/.

35

One person applying for unemployment benefits in California spent months submitting paperwork and calling a hotline before being asked to use face recognition to verify their identity. After multiple attempts, the system couldn’t match their face and they eventually stopped trying to access unemployment benefits. See Sato, Mia. “The pandemic is testing the limits of face recognition.” MIT Technology Review. (Sept. 28, 2021). https://www.technologyreview.com/2021/09/28/1036279/pandemic-unemployment-government-face-recognition/.

36

White Americans are more likely to have a computer and broadband internet than Black or Hispanic Americans. See Atske, Sara and Andrew Perrin. “Home broadband adoption, computer ownership vary by race, ethnicity in the U.S.” Pew Research Center. (July 16, 2021). https://www.pewresearch.org/fact-tank/2021/07/16/home-broadband-adoption-computer-ownership-vary-by-race-ethnicity-in-the-u-s/.

37

Kelley, Jason. “Bar Applicants Deserve Better than a Remotely Proctored ‘Barpocalypse.’” Electronic Frontier Foundation. (Oct. 9, 2020). https://www.eff.org/deeplinks/2020/10/bar-applicants-deserve-better-proctored-barpocalypse

38

Johnson, Khari. “ExamSoft’s remote bar exam sparks privacy and facial recognition concerns.” VentureBeat. (Sept. 29, 2020). https://venturebeat.com/2020/09/29/examsofts-remote-bar-exam-sparks-privacy-and-facial-recognition-concerns/.

39

Brown, Lydia X. Z. “How Automated Test Proctoring Software Discriminates Against Disabled Students.” Center for Democracy & Technology. (November 16, 2020) https://cdt.org/insights/how-automated-test-proctoring-software-discriminates-against-disabled-students/.

40

Id.

41

See, e.g. Del Toro, Juan and Ming-Te Wang. “The Roles of Suspensions for Minor Infractions and School Climate in Predicting Academic Performance Among Adolescents.” American Psychologist. (Oct 2021). https://www.apa.org/news/press/releases/2021/10/black-students-harsh-discipline.

42

“Centering Civil Rights in Artificial Intelligence and Technology Policy.” (July 13, 2021). https://www.upturn.org/work/proposals-for-the-biden-administration-to-address-technologys-role-in.

43

“Addressing Technology’s Role in Housing Discrimination.” (July 13, 2021). https://www.upturn.org/work/proposals-for-the-biden-administration-to-address-technology-housing/.

44

“Addressing Technology’s Role in Hiring Discrimination.” (July 13, 2021). https://www.upturn.org/work/proposals-for-the-biden-administration-to-address-technologys-role-in-hiring.

45

“Addressing Technology’s Role in Financial Services Discrimination.” (July 13, 2021). https://www.upturn.org/work/proposals-for-the-biden-administration-to-address-technologys-financial.