On July 1, San Francisco District Attorney George Gascón will launch a new artificial intelligence tool meant to eradicate potential racial bias in prosecutors’ charging decisions via a “race-blind charging system.”
The first-of-its-kind algorithmic tool, created by the Stanford Computational Policy Lab, will also be offered free to any other prosecutor's office that wishes to take part.
“Lady justice is depicted wearing a blindfold to signify impartiality of the law, but it is blindingly clear that the criminal justice system remains biased when it comes to race,” said District Attorney George Gascón. “This technology will reduce the threat that implicit bias poses to the purity of decisions which have serious ramifications for the accused, and that will help make our system of justice more fair and just.”
In recent years, implicit, or unconscious, bias has become more widely acknowledged as an issue that tangibly impacts policing and all other critical stages of the U.S. criminal justice system.
Increasingly, police officials, prosecutors, and public defenders are implementing implicit bias training within their offices, with the hope of reducing racial inequity within the justice system.
San Francisco’s plan takes it a step further by bringing in technology that, with luck, will make it far more difficult for prosecutors to make decisions based on those subconscious biases.
“DA Gascón asked a simple question to kick off the project—if his attorneys didn’t know the race of involved individuals when deciding whether to charge a case, would that lead to more equitable decisions?” the Stanford Computational Policy Lab said. “We’re aiming to answer this question with a new, data-driven blind-charging strategy. First, we built a lightweight web platform to replace the DA’s existing paper-based case review process.”
Stanford’s platform will automatically identify racial information and redact it from crime reports and even “freely written crime narratives.” That includes individuals’ names, hair and eye colors, and home addresses whenever prosecutors could use such details to infer a person’s race.
Using the redacted information, SF prosecutors will make a preliminary charging decision. After that decision is recorded, prosecutors will gain access to the full, unaltered crime information, body camera footage, and other “non-race blind information.” If a prosecutor then drops or adds charges, they will have to document what additional evidence led to the change. The DA’s office will collect and analyze the resulting data “to identify the volume and types of cases where charging decisions changed from phase 1 to phase 2 in order to refine the tool and to take steps to further remove the potential for implicit bias to enter our charging decisions.”
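The two-phase workflow described above amounts to a simple audit record: a blind decision, a full-information decision, and a mandatory rationale whenever the two differ. The class and field names below are hypothetical, since the DA office's actual record-keeping has not been published:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the two-phase charging record described in the
# article; the real system's data model is not public.
@dataclass
class ChargingReview:
    case_id: str
    phase1_charges: list                                # decided from the redacted report
    phase2_charges: list = field(default_factory=list)  # after full, unredacted review
    change_rationale: str = ""                          # required when decisions differ

    def record_phase2(self, charges, rationale=""):
        """Record the full-information decision; return True if it changed."""
        changed = charges != self.phase1_charges
        if changed and not rationale:
            # Mirrors the policy: a changed decision must document the
            # additional evidence that led to the change.
            raise ValueError("changed charging decision requires a documented rationale")
        self.phase2_charges = charges
        self.change_rationale = rationale
        return changed

review = ChargingReview("SF-2019-0001", phase1_charges=["burglary"])
changed = review.record_phase2(["burglary", "trespass"], rationale="body camera footage")
print(changed)  # → True: the change is logged for the DA office's later analysis
```

Collecting these records is what lets the office measure how often, and in what kinds of cases, decisions shift between phase 1 and phase 2.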
Image by Stanford Computational Policy Lab – a hypothetical example of how the blind-charging platform will redact identifying details from crime reports.