We drive policy outcomes and spark debate through reports, scholarly articles, regulatory comments, op-eds, direct advocacy alongside coalition allies, and participation in events including public panels, conferences, and workshops. Here's a selection of our recent work.
In the Harvard Business Review, Miranda explains what we mean when we talk about “hiring algorithms” and why predictive hiring technology is far more likely to erode equity than it is to promote it.
In The Atlantic, we argue that digital platforms, which deliver vastly more ads than their newsprint predecessors, are making core civil rights laws increasingly difficult to enforce.
We looked at police shootings from 2017 to see whether video was released to the public, after how long, and under what circumstances — and found that too often, these videos stay hidden.
Harlan and Malkia Cyril of the Center for Media Justice take a hard look at the impact of police body-worn cameras. “The sad reality is that these cameras mirror the power and the interests of the police, not those of the communities they are sworn to serve,” they write.
After St. Louis chose to accept a year of free body-worn cameras from Axon, we argued that the city's police department needs to significantly strengthen the civil rights protections in its BWC policies, particularly around when officers can review footage.
Gerrymandering isn’t just a math problem — it’s a policy fight, legal quagmire, mapping challenge, and statistical puzzle, all wrapped into one. This article explores the math and computer science concepts behind new efforts to ensure fair redistricting processes, and a Supreme Court case that could change how we measure bias in voting districts.
Newly released data shows that almost 400,000 people are on Chicago’s “heat list.” Of that group, almost 290,000 have scores that the CPD says will lead to more scrutiny. Our analysis also shows that the most important factor in a person’s score was their age.
Ever since we launched, we’ve worked to make sure that technology serves the dignity and well-being of everyone it touches. That’s why Upturn is excited to be joining the Partnership on AI to Benefit People and Society, alongside leading companies and social sector organizations.
In the New York Daily News, Miranda and Harlan criticize the NYPD for ignoring public opinion while developing their body-worn camera policy. The NYPD’s proposed policies “risk turning these cameras from tools of accountability into something else entirely.”
Over at Motherboard, Harlan explains why Axon’s offer of free body-worn cameras for every cop in America is dangerous. It creates a perverse incentive for departments to rush to adopt camera systems without thinking through the hard policy challenges.
Could a pretrial algorithm simultaneously reduce the number of people detained pretrial, the rate of failures to appear, and the proportion of minorities detained? A new study suggests that, with the right controls and policies, jurisdictions might be able to do all three.
In a piece published in Slate, Logan argues that popular analysis of predictive policing systems focuses too often on their predictions about the future, and too little on the historical data on which they rely.
Facebook can and should do more to protect its users from discrimination — especially in civil rights areas like housing, credit, and employment.
What’s at stake for civil society groups is not only their operational efficiency, but ultimately their effectiveness. In order to succeed in reshaping society, civil society groups must be able to responsibly use the most powerful tools at their disposal—and must also understand how public and private institutions use those powerful tools.
“Algorithmic accountability” is critical, but it presumes we know what values we’re trying to protect. This article argues that we need to think critically about, and publicly debate, the social values we’re imbuing into machines, and consider what to do when people fundamentally disagree about what those values ought to be.
Law enforcement’s blind faith in a tool that doesn’t always work — a tool that can easily finger the wrong person, with terrible results — provides the central tension for that blockbuster film, and a vital lesson for our present.
In Newsweek, Miranda looks at the history of Google’s policy decisions about border designations in Google Maps.
Much of the public concern and reporting about “social media scores” has come untethered from reality.
A visit to Oakland sparks David to reflect on Airbnb’s wider impact.
The FBI is proposing to exempt its entire NGI database from the basic regulations and requirements of the Privacy Act of 1974 — the very law designed to govern how the FBI handles this type of sensitive information. In this piece, Logan explains why that move is troubling for civil rights.
The list aims to predict who will be involved in future shootings. Its growing role in Chicago policing is a taste of what’s ahead.
After Google’s announcement that it will ban ads for payday loans, Aaron explains why this was a good call.
A popular new DNA analysis tool, TrueAllele, claims to help law enforcement solve crimes by analyzing DNA mixtures or degraded samples. But its creator argues that the program’s source code is a trade secret. In this piece, Logan argues that sometimes, defendants should be able to see the source code of software that helps to convict them.
“What can your ISP see when you go online anyway?” Aaron offers some concise answers.