We drive policy outcomes and spark debate through reports, scholarly articles, regulatory comments, direct advocacy alongside coalition allies, op-eds, and participation in events including public panels, conferences, and workshops. Here's a selection of our recent work.
In the Harvard Business Review, Miranda explains what we mean when we talk about “hiring algorithms” and why predictive hiring technology is far more likely to erode equity than it is to promote it.
In The Atlantic, we argue that digital platforms, which deliver vastly more ads than their newsprint predecessors, are making core civil-rights laws increasingly challenging to enforce.
Our empirical research, conducted with colleagues at Northeastern University and the University of Southern California, showed that Facebook itself can skew the delivery of job and housing ads along race and gender lines, even when advertisers target broad audiences.
Miranda joined the Washington Post Live for “Transformers: Artificial Intelligence” along with the NAACP-LDF’s Sherrilyn Ifill and Kelly Trindel of Pymetrics to discuss challenges presented by AI in hiring. VIDEO
We filed comments with the Judicial Council of California on two of its proposed new court rules. We argued that the proposed rules on how courts use pretrial risk assessment tools need significant modifications in order to be constitutionally defensible and to protect civil rights.
Without active measures to mitigate them, bias will arise in predictive hiring tools by default. This report describes popular tools that many employers currently use, explores how these tools affect equity throughout the entire hiring process, and offers reflections and recommendations on where we go from here.
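A common first check for this kind of inequity is the EEOC's "four-fifths rule": if one group's selection rate is less than 80% of the highest group's rate, the tool may have adverse impact. A minimal sketch of that check, using entirely hypothetical selection numbers:

```python
# Adverse-impact check using the EEOC "four-fifths rule".
# All counts below are hypothetical, for illustration only.

def selection_rate(selected, applicants):
    return selected / applicants

def four_fifths_check(rates):
    """Return True if every group's selection rate is at least
    80% of the highest group's rate."""
    best = max(rates.values())
    return all(rate >= 0.8 * best for rate in rates.values())

rates = {
    "group_a": selection_rate(50, 100),  # 0.50
    "group_b": selection_rate(30, 100),  # 0.30
}

# 0.30 is below 0.8 * 0.50, so this screen flags possible adverse impact.
print(four_fifths_check(rates))  # False
```

This threshold is a screening heuristic, not a finding of discrimination; the report's point is that such checks must be applied deliberately, because biased outcomes arise by default.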
We filed a legal brief arguing that Section 230 should not fully immunize Facebook’s Ad Platform from the Fair Housing Act. We describe how Facebook itself, independently of its advertisers, participates in the targeting and delivery of housing advertisements based on protected status.
We offered comments to the Federal Trade Commission on the implications of algorithmic decision tools used in consumer advertising and marketing campaigns.
In the first rigorous, independent evaluation of Facebook’s new ad transparency plans, we urge the company to improve its ad transparency tools to enable meaningful public scrutiny.
We wrote a letter to Axon’s AI Ethics Board to express serious concerns about the direction of Axon’s product development, including the possible integration of real-time face recognition with body-worn camera systems. We were joined on this letter by 41 other civil rights, racial justice, and community organizations.
We looked at police shootings from 2017 to see whether video was released to the public, after how long, and under what circumstances — and found that too often, these videos stay hidden.
Automated decisions are increasingly part of everyday life, but how can the public scrutinize, understand, and govern them? This Upturn and Omidyar Network report maps out the landscape, providing practical examples and a framework to think about what has worked.
Bail reform is rapidly underway. But at the same moment that reform policies are reducing the true risks of pretrial release, jurisdictions across the country are adopting statistical tools that blindly predict that those risks remain as high as ever. This forthcoming article charts how jurisdictions can avoid costly errors in their adoption of pretrial risk assessment tools.
Miranda moderates a discussion with panelists from Facebook and the Federal Election Commission about voting technology, online election advertising, and internet freedom. VIDEO
This framing paper, prepared for the NetGain Partnership, explores how automated decisions are shaping the lives of vulnerable people and groups, and offers suggestions and direction for interested funders and the broader social sector.
The encryption debate is generally framed as a struggle between civil liberties and national security. We partnered with Consumer Reports to shed light on why encryption is critical for consumers’ safety and well-being.
Harlan and Malkia Cyril of the Center for Media Justice take a hard look at the impact of police body-worn cameras. “The sad reality is that these cameras mirror the power and the interests of the police, not those of the communities they are sworn to serve,” they write.
Together with the Leadership Conference, Upturn releases the latest version of our scorecard that evaluates the police body-worn camera policies in 75 major U.S. cities. It continues to show a nationwide failure to protect the civil rights and privacy of surveilled communities.
Miranda joins DC Legal Hackers and the Lab@DC to discuss trends in body worn camera technology and policy, pros and cons to various implementation models, and privacy risks and costs.
Today, most major police departments that use body-worn cameras allow officers unrestricted footage review. This report explains why police departments must carefully limit officers’ review of body-worn camera footage, and calls for “clean reporting” to be adopted by all police departments.
Upturn — together with the DC Metropolitan Police and The Lab @ DC — hosts two community conversations to discuss the results of a recent DC body-worn camera study, which showed that the District’s camera program had no statistically significant effect on officer behavior.
This article draws lessons primarily from the domain of criminal justice, to illustrate three structural challenges that can arise whenever law or public policy contemplates adopting predictive analytics as a tool. It then offers some ideas for solutions.
After St. Louis chose to accept a year of free body-worn cameras from Axon, we argue that the city’s police department needed to significantly strengthen civil rights protections in its BWC policies — particularly around when officers can review footage.
At an event hosted by Dialogue on Diversity, Aaron talks about the nexus of commercial data collection and civil rights, including racially-targeted and predatory internet advertising.
Gerrymandering isn’t just a math problem — it’s a policy fight, legal quagmire, mapping challenge, and statistical puzzle, all wrapped into one. This article explores the math and computer science concepts behind new efforts to ensure fair redistricting processes, and a Supreme Court case that could change how we measure bias in voting districts.
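One measure at issue in that debate is the "efficiency gap," which compares the two parties' "wasted" votes: every vote cast for a losing candidate, plus a winner's votes beyond the bare majority needed to carry the district. A toy calculation with made-up district totals:

```python
# Efficiency gap on hypothetical two-party district results.
# Vote counts are invented purely to illustrate the arithmetic.

def wasted(votes_a, votes_b):
    """Wasted votes for each party in one district: all losing votes,
    plus the winner's votes above the bare-majority threshold."""
    total = votes_a + votes_b
    threshold = total // 2 + 1
    if votes_a > votes_b:
        return votes_a - threshold, votes_b
    return votes_a, votes_b - threshold

def efficiency_gap(districts):
    wasted_a = wasted_b = total = 0
    for a, b in districts:
        wa, wb = wasted(a, b)
        wasted_a += wa
        wasted_b += wb
        total += a + b
    # Positive values mean the plan wastes more of party A's votes.
    return (wasted_a - wasted_b) / total

districts = [(55, 45), (60, 40), (30, 70)]  # hypothetical
print(round(efficiency_gap(districts), 3))
```

A large gap in either direction is one signal, among several proposed, that a map systematically disadvantages one party's voters.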
Newly released data shows that almost 400,000 people are on Chicago’s “heat list.” Of that group, almost 290,000 have scores that the CPD says will lead to more scrutiny. Our analysis also shows that the most important factor in a person’s score was their age.
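A simple way to see how strongly a single input drives a score is to compute the correlation between that input and the score across the dataset. A sketch of that kind of check, on entirely made-up records (not the CPD data):

```python
# Pearson correlation between one attribute (age) and a risk score,
# computed on invented records to illustrate the analysis, not to
# reproduce Chicago's actual data.
import math

def pearson(xs, ys):
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

ages = [18, 22, 25, 31, 40, 55]            # hypothetical
scores = [480, 450, 430, 320, 210, 95]     # hypothetical risk scores

# A strongly negative value here would mean younger people get higher scores.
print(round(pearson(ages, scores), 2))
```

A strong correlation with age alone is notable because age is not conduct: it suggests the score scrutinizes people for who they are rather than what they have done.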
With The Leadership Conference on Civil and Human Rights and Americans for Financial Reform, we explore the risks and benefits of new types of credit data for historically disadvantaged groups. The comments spotlight data that is most predictive of likelihood and ability to repay, and least likely to raise fair lending concerns.
Ever since we launched, we’ve worked to make sure that technology serves the dignity and well-being of everyone it touches. That’s why Upturn is excited to be joining the Partnership on AI to Benefit People and Society, alongside leading companies and social sector organizations.
The Charles Koch Institute, The Constitution Project, and Upturn co-host an evening panel to discuss pressing policy issues related to police body-worn cameras, including public access to footage and the potential use of face recognition.
At the NYU School of Law, Aaron talks about how and when credit scores can be explained, and the reasons for startling disparities in credit scores among different protected groups.
Upturn files an objection to the NYPD’s proposed body-worn camera policy, together with the Leadership Conference, the Center for Media Justice, Color Of Change, and other groups.
At the Hometown Summit, a conference for city officials, community organizers, and artists working to rethink democracy and empower citizens to shape the future of their cities, Miranda discusses how advocacy around body-worn camera policies might provide a model for local citizen engagement.
In the New York Daily News, Miranda and Harlan criticize the NYPD for ignoring public opinion while developing their body-worn camera policy. The NYPD’s proposed policies “risk turning these cameras from tools of accountability into something else entirely.”
Over at Motherboard, Harlan explains why Axon’s offer of free body-worn cameras for every cop in America is dangerous. It creates a perverse incentive for departments to rush to adopt camera systems without thinking through the hard policy challenges.
The Harvard Human Rights Journal’s annual symposium explores the disproportionate impact of law enforcement surveillance on minority communities both at home and abroad.
Could a pretrial algorithm simultaneously reduce the number of people incarcerated pretrial, the rate of failures to appear, and the proportion of minorities detained pretrial? A new study suggests that, with the right controls and policies, jurisdictions might be able to do all three.
The 2017 Princeton–Fung Global Forum in Berlin asks the question: “Can Liberty Survive the Digital Age?” Harlan joins a distinguished line-up of speakers including Vint Cerf, one of the “fathers of the Internet.” VIDEO
Harlan testifies at a hearing held by the Philadelphia City Council Committee on Public Safety on the Philadelphia Police Department’s current body-worn camera policy, which is not meeting national best practices.
Drawing on computer science expertise, we propose a new governance strategy, using cryptography to prove that a decision is rule-bound and correct, even when the decision comes from a “black box” that is secret or is too complex for direct human inspection.
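One building block behind this idea is a cryptographic commitment: the decision-maker publishes a hash of its decision rule in advance, and can later prove the rule was not quietly changed after the fact. A minimal sketch of a hash commitment (the full proposal relies on zero-knowledge proofs, which this deliberately does not show; the example policy string is invented):

```python
# Commit-and-reveal with a salted SHA-256 hash: a toy version of
# committing to a secret decision rule before decisions are made.
import hashlib
import secrets

policy = b"deny if score < 700"      # hypothetical secret decision rule
nonce = secrets.token_bytes(16)      # random salt keeps the rule hidden
commitment = hashlib.sha256(nonce + policy).hexdigest()
# The commitment is published; decisions are then made under the rule.

def verify(commitment, nonce, policy):
    """Anyone can check that the revealed rule matches the commitment."""
    return hashlib.sha256(nonce + policy).hexdigest() == commitment

print(verify(commitment, nonce, policy))                     # True
print(verify(commitment, nonce, b"deny if score < 650"))     # False
```

The commitment binds the decision-maker to one rule without revealing it; zero-knowledge techniques then let an auditor check that each individual decision actually followed the committed rule.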
In Charleston, WV, Aaron explores the risks and benefits of new types of credit data for historically disadvantaged groups.
At the Roosevelt Institute in New York, Aaron presents to a small group about ways that Upturn has helped drive change in technology companies' practices.
The Fourth Amendment Advisory Committee holds a briefing for Congress on “the profound impacts of the government’s immense surveillance practices and the different ways it affects different people.” VIDEO
In a piece published in Slate, Logan argues that popular analysis of predictive policing systems too often focuses on their predictions about the future, and less about the historical data upon which they rely.
Facebook can and should do more to protect its users from discrimination — especially in civil rights areas like housing, credit, and employment.
In a report for the Open Society Foundations, we review the different types of brokerage and profiling products sold by data brokers, survey the relevant legal landscape, and recommend an impact-driven, bottom-up approach to further investigation of data-driven profiling by data brokers.
David describes Upturn’s work — and the lessons learned from it — for a campus audience at Vanderbilt.
David interviews data scientist and activist Cathy O’Neil about her new book.
Miranda moderates a discussion on Capitol Hill with panelists from Google and the Open Technology Institute about government surveillance, corporate transparency reporting, and other ways technology companies have adapted to respond to government demands for user data in the post-Snowden world. VIDEO
What’s at stake for civil society groups is not only their operational efficiency, but ultimately their effectiveness. In order to succeed in reshaping society, civil society groups must be able to responsibly use the most powerful tools at their disposal—and must also understand how public and private institutions use those powerful tools.
This project maps the ways that data at scale may pose risks to philanthropic priorities and beneficiaries, identifies key questions that funders and grantees should consider before undertaking data-intensive work, and offers recommendations for funders to address emergent data ethics issues.
“Algorithmic accountability” is critical, but it presumes we know what values we’re trying to protect. This article argues that we need to critically think about and publicly debate the social values we’re imbuing into machines, and consider what to do when people fundamentally disagree about what those values ought to be.
Presenting to a group of data scientists in Bend, OR, Aaron discusses the ways that machine learning systems can reinforce bias across the criminal justice system.
As guest faculty at an American Bar Association event, Aaron presents on the impact of technology on consumer financial services law, and discusses online platforms’ recent ban on payday loan ads.
We find that at least 20 of the nation’s 50 largest police forces have used a predictive policing system, with at least an additional 11 actively exploring options to do so. Vendors shield the technology in secrecy, and informed public debate is rare. Early research findings suggest that these systems may not actually make people safer.
Law enforcement’s blind faith in a tool that doesn’t always work — a tool that can easily finger the wrong person, with terrible results — provides the central tension for that blockbuster film, and a vital lesson for our present.
Over in Newsweek, Miranda looks at the history of Google’s policy decisions about border designations in Google maps.
Much of the public concern and reporting about “social media scores” has come untethered from reality.
Together with the Leadership Conference, Upturn releases a scorecard that evaluates the police body-worn camera policies in 50 major U.S. cities. It shows a nationwide failure to protect the civil rights and privacy of surveilled communities.
A visit to Oakland sparks David to reflect on Airbnb’s wider impact.
The FBI is proposing to exempt its entire NGI database from the basic regulations and requirements of the Privacy Act of 1974 — the very law designed to govern how the FBI handles this type of sensitive information. In this piece, Logan explains why that move is troubling for civil rights.
The list aims to predict who will be involved in future shootings. Its growing role in Chicago policing is a taste of what’s ahead.
Harlan joins Julie Brill, former Commissioner of the Federal Trade Commission, and Alvaro Bedoya, executive director of the Center on Privacy & Technology at Georgetown Law, in a discussion about civil rights and data science. VIDEO
After Google’s announcement that it will ban ads for payday loans, Aaron explains why this was a good call.
David takes part in a panel as part of Philly Tech Week.
The inaugural Color of Surveillance conference at Georgetown Law examines how the U.S. government has monitored African American communities from the colonial era to the present day. In his talk, Harlan discusses how police departments nationwide are using body-worn cameras to surveil communities of color. VIDEO
A popular new DNA analysis tool, TrueAllele, claims to help law enforcement solve crimes by analyzing DNA mixtures or degraded samples. But its creator argues that the program’s source code is a trade secret. In this piece, Logan argues that sometimes, defendants should be able to see the source code of software that helps to convict them.
Harlan joins panelists from the National Security Agency, the ACLU, the Brennan Center for Justice, and Cardozo Law, to discuss the implications of the NSA’s overseas surveillance activities under EO 12333. VIDEO
The panel, hosted by New America's Open Technology Institute, explores the responsibilities that broadband Internet providers ought to have to protect their customers' privacy, as well as the FCC's regulatory role. VIDEO
“What can your ISP see when you go online anyway?” Aaron offers some concise answers.
A technical assessment of the present and potential future monitoring capabilities available to internet service providers.
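A key point in that assessment: even when traffic is encrypted with HTTPS, an ISP can typically still observe the destination hostname (through plaintext DNS lookups and the TLS SNI field), while the page path and query string stay hidden. A rough illustration of that split, assuming plaintext DNS/SNI and no VPN:

```python
# Roughly split a URL into what an ISP can typically observe vs.
# what HTTPS encrypts. Simplified: assumes plaintext DNS and TLS SNI,
# and no VPN or encrypted DNS in use.
from urllib.parse import urlsplit

def isp_view(url):
    parts = urlsplit(url)
    visible = parts.hostname  # seen via DNS queries and the SNI field
    hidden = parts.path + ("?" + parts.query if parts.query else "")
    return visible, hidden

visible, hidden = isp_view("https://example.org/search?q=medical+condition")
print(visible)  # example.org
print(hidden)   # /search?q=medical+condition
```

Even this metadata-only view (which sites, when, how often) can reveal sensitive patterns, which is why the report treats hostname visibility as meaningful in its own right.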
A panel discussion about new privacy rules that the FCC sought to impose on broadband providers. VIDEO
The Center for Media Justice and the Million Hoodies Movement for Justice host their monthly online salon, featuring Hamid Khan from the Stop LAPD Spying Coalition and Harlan Yu from Upturn. VIDEO
On-demand labor is weakening our sense of fairness. Tom Slee wants to wake us up.
Aaron discusses the promise and pitfalls of new online lending offerings at the Financial Services Conference hosted by the Consumer Federation of America.
New America’s Open Technology Institute hosts a panel discussion on the current state of body-worn camera deployments, including their potential impact at our nation’s borders. VIDEO
Upturn, Data & Society and the Leadership Conference host a major conference that explores the intersection of technology and criminal justice for law enforcement officers, government agencies, technology companies, civil rights leaders, technologists, and researchers.
Before the FTC, Aaron describes harm arising from online payday lead generation practices, and explains how many lead generators evade state laws.
We explain how online lead generation works, describe the risks and legal complexities specific to lead generation for online payday loans, document the widespread use of search ads by payday lead generators, and recommend interventions.
Upturn coordinated the development of a shared set of civil rights principles for body-worn cameras. The principles were endorsed by a major coalition of 34 local and national organizations, including the NAACP, National Council of La Raza, National Urban League, Center for Media Justice, ACLU, and others.
At Georgetown Law, Aaron talks about recent innovations in credit scoring, and how to evaluate risks and benefits of new data usage for vulnerable communities.
A panel discussion at the first-ever interdisciplinary conference on “Fairness, Accountability, and Transparency in Machine Learning” (FATML).
A “missing manual” for policy professionals seeking to better understand technology’s impact on financial underwriting and marketing.
How and where, exactly, does big data become a civil rights issue? This report begins to answer that question, highlighting key instances where big data and civil rights intersect.
Users in China can’t freely explore the Internet because of the regime’s “Great Firewall.” But special software tools—when they work—can help users around those barriers. We proposed a new approach to developing circumvention tools, a strategy called “collateral freedom.”