Across the Field
Policymakers and advocates need stronger methods to challenge automated decisions.
We seek to lay a strong foundation for challenging automated decisions across our issues. We look for structural opportunities to improve how governments and companies measure racial disparities, and we support advocates by expanding both legal protections and access to the information necessary to produce independent analyses. This work also includes detailed investigations into specific platforms, such as Facebook’s role in enabling discrimination through targeted online advertising.
Emily Black, Logan Koepke, Pauline Kim, Solon Barocas, and Mingwei Hsu
Our paper argues that entities that use algorithmic systems in traditional civil rights domains like housing, employment, and credit should have a duty to search for and implement less discriminatory algorithms (LDAs).
Latest work in this issue area
We wrote comments in response to the Office of Management and Budget’s draft memorandum, Advancing Governance, Innovation, and Risk Management for Agency Use of Artificial Intelligence (AI).
As the Biden-Harris administration considers the contents of an Executive Order on artificial intelligence, the undersigned civil rights, technology, policy, and research organizations call on the administration to continue centering civil rights protections.
We provided comments to the CFPB urging it to protect consumers from the harms of data brokers.
Alongside 40 other civil rights and technology advocacy organizations, Upturn called on the Federal Trade Commission to develop specific, concrete civil rights protections in the Commission’s ongoing Commercial Surveillance and Data Security Rulemaking.
Selected press and events
WIRED covers Upturn’s research on how Facebook’s ad delivery system may perpetuate bias.
Coverage from The Verge on Upturn’s research into how Facebook’s ad system can skew delivery outcomes.
The Economist covers Upturn’s research on Facebook’s seemingly discriminatory ad system.
“Facebook’s algorithms, which match marketing messages with viewers, [lean] on stereotypes when it comes to housing and jobs, according to [Upturn’s empirical work].”