Crosscurrents

New Software Takes Race Out Of The Picture. Does It Make Prosecution More Fair?

Josh Edelson / AP Photos
San Francisco District Attorney George Gascón

African Americans are disproportionately arrested and prosecuted in San Francisco. District Attorney George Gascón is turning to artificial intelligence to keep bias from influencing prosecutors. But how much of a difference can this new tool actually make? 

You’ve probably heard of implicit bias. Unconscious attitudes impact all of us, whether we’re aware of it or not. And if you’re curious about your own bias, there’s an online test you can take. 

It's called the Implicit Association Test. It was created back in the nineties by a group of researchers. I made my colleague, Holly J. McDede, take it.

White and black faces flashed across the screen, and she had to categorize each one (which stressed her out a little): the “I” key for white faces, the “E” key for black faces.

The test asks you to classify words as "good" or "bad.” Words like “cherish” or “beautiful” are “good,” words like “disgusting” or “ugly” are “bad.” 

When a “good” word popped up, she had to press the same key she used for white faces. “Bad” words shared a key with black faces. Then, the rules changed. Black faces: good. White faces: bad.

Then we got the results. And the data showed Holly has a preference for dark-skinned people over light-skinned people. But Holly was skeptical. “But I just feel like it's, it's this is a game,” she said. “It’s leading me somewhere.”

Even though Holly doesn’t buy that this online test can detect racial bias, she admitted that taking the test made her think more about unconscious stereotypes and how bias can impact decisions every day. 

And that’s really important for people like prosecutors, who have to make split-second choices all the time. They take on a lot of cases, and have a really small window to decide whether to charge someone with a crime.

San Francisco District Attorney George Gascón spends a lot of time thinking about ways to deal with implicit bias. He says, “The process of self-awareness is really important and frankly as human beings we’re incapable of removing our biases.”

A New Tool

Before prosecutors decide whether to charge suspects, they look at police reports. And Gascón wondered what would happen if the suspect’s race were removed from these documents. But Gascón says, “to do that by hand is very difficult, very time consuming, so it became obvious to me the more that I struggled with this, there is technology to do this stuff, it's just a matter of figuring out who can be a partner.”

So he called up researchers at Stanford’s Computational Policy Lab and pitched the idea. The product had to be economically feasible, “which for government agencies, in this case, meant trying to find something that isn’t going to cost us any money,” quips Gascón.

And the researchers thought the idea was great. So, under a tight deadline, they designed software that does several things. It redacts the names, races, and hair and eye color of both victims and suspects.

Instead of saying “Asian male,” for example, the report says “Person 1.” And it’s highlighted in purple or another rainbow color, so you can spot the same purple person throughout the report.

Courtesy of the District Attorney's Office
Examples of reports before and after redaction.

The software also takes out the neighborhood where the crime took place. And because it’s open source, prosecutors outside San Francisco can pick it up too. “The technology is not necessarily complex,” says Gascón. “It’s just being able to get the data and do it.”

Plus, it summarizes all the data on police reports into one tidy digital document. Before this, the docs weren’t even digitized.
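To give a rough sense of what this kind of race-blind redaction looks like, here is a minimal sketch in Python. It is purely illustrative, assuming a simple name-and-keyword substitution; the descriptor list, function name, and sample report text are hypothetical, not the Computational Policy Lab's actual code.

```python
import re

# Illustrative sketch only; not the actual software built by Stanford's
# Computational Policy Lab. The descriptor list and sample report below
# are hypothetical stand-ins.

RACE_DESCRIPTORS = ["asian", "black", "white", "hispanic", "latino",
                    "african[- ]american"]

def redact_report(text, people):
    """Replace names and race descriptors with neutral, numbered tokens.

    `people` maps each named individual to a number so the same person
    gets the same label (e.g. "Person 1") throughout the report.
    """
    redacted = text
    # Replace each known name with its numbered placeholder.
    for name, number in people.items():
        redacted = re.sub(re.escape(name), f"Person {number}",
                          redacted, flags=re.IGNORECASE)
    # Drop race descriptors and any trailing space, e.g. "asian male" -> "male".
    for term in RACE_DESCRIPTORS:
        redacted = re.sub(rf"\b{term}\b\s*", "", redacted, flags=re.IGNORECASE)
    return redacted

report = "Suspect described as asian male, identified as John Doe."
print(redact_report(report, {"John Doe": 1}))
# -> "Suspect described as male, identified as Person 1."
```

The consistent numbering is what lets a prosecutor follow the same “Person 1” through an entire report narrative.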

Prosecutors working on felony cases in the city are all using this new tool. 

And Gascón is really excited about it. “The goal is to create a purity about our work that takes race out of the picture,” he says.

The Software Doesn't Challenge Our Bias

Rebecca Young, the co-chair of the San Francisco Public Defender’s Racial Justice Committee, has another perspective. She asks, “Are we outsourcing the very difficult and necessarily human task of unlearning racial bias to an algorithm?”

She says relying on computers to address racial bias makes us lazy.

“It just seems like an experiment or a game, almost like some tech company was like, ‘Hey, try our artificial intelligence algorithm on rooting out implicit bias, there’s no cost to you.’ Sounds like fun, right?”

Young says she sees racial bias every day in jails and in the courtroom. She’s witnessed prosecutors fight to remove black men from jury pools. And in San Francisco, African Americans are less likely to have their cases dropped than white suspects. She says this technology fails to address those disparities head on. “You still haven't taken care of the problem of the people in power making these decisions. You're not training. You're not requiring a growth in awareness and consciousness.”

Gascón admits the software is headline-catching, but he’s also a prosecutor. And when you need to decide whether to charge a case right now, you need a tool to help you make the fairest decision as quickly as possible.

And this technology does give prosecutors a chance for some introspection. There is a second review process where attorneys see the original police report. This is when they review material that does show race, like police body-camera footage. If prosecutors change their minds after seeing all the info, they have to explain why. Gascón says that creates pressure to stick to the original, color-blind decision. 

And that can make all the difference. "We're, we're making decisions that can impact people's life for many years or forever,” says Gascón. “I mean, you know, we can send somebody to prison for probably for their, you know, their entire natural life, right?"

In an ideal world, bias would be easy to unlearn. But when systemic racism exists, Gascón says ignoring your own bias doesn’t work. “The problem is when you negate, or you try to deny that you have them and then you continue to act in a way that is contrary to what the best course of action should be for the community or for society, especially our work.”

When Holly took the Implicit Association Test online, she put a lot of energy into thinking about her personal bias. “I really wanted to come across as a decent person!” she insists.

But it’s easy to stop and think about implicit bias when you’re in front of a computer screen. Everyday decisions with real stakes are more complicated than online tests, or computer software that redacts race for you. 

But the technology still helps. It makes imperfect decisions a little bit more fair. 
