Across the Field
Policymakers and advocates need stronger methods to challenge automated decisions.
We seek to lay a strong foundation for challenging automated decisions across our issue areas. We look for structural opportunities to improve how governments and companies measure racial disparities, and we support advocates by expanding both legal protections and access to the information necessary to produce independent analyses. This work also includes detailed investigations into specific platforms, such as Facebook’s role in driving discrimination through targeted online advertising.
Harlan Yu, Aaron Rieke, and Natasha Duarte
We sent a letter urging the White House Office of Science & Technology Policy to fully incorporate the Biden administration’s commitment to racial equity into its AI and technology priorities.
Latest work in this issue area
We submitted comments in response to the Office of Science and Technology Policy’s request for information on public and private sector uses of biometric technologies.
This amicus brief urges the Fourth Circuit to preserve critical and longstanding obligations under the Fair Credit Reporting Act (FCRA) that require consumer reporting agencies to ensure the accuracy of records used to make decisions about people’s access to housing, employment, credit, and other basic needs.
We filed a legal brief arguing that Section 230 should not fully immunize Facebook’s Ad Platform from liability under a California antidiscrimination law. We describe how Facebook itself, independently of its advertisers, participates in the targeting and delivery of insurance ads based on gender and age.
This brief argues that the Computer Fraud and Abuse Act should not criminalize violations of computer use policies, like terms of service.
Selected press and events
WIRED covers Upturn’s research on how Facebook’s ad delivery system may perpetuate bias.
Coverage from The Verge on Upturn’s research into how Facebook’s ad system can skew delivery outcomes.
The Economist covers Upturn’s research on Facebook’s seemingly discriminatory ad system.
“Facebook’s algorithms, which match marketing messages with viewers, [lean] on stereotypes when it comes to housing and jobs, according to [Upturn’s empirical work].”