Mike Papantonio: A computer system is being used against individuals, one that actually manipulates decisions at every stage of criminal justice processing, from assigning bond amounts to deciding whether the defendant goes free. So far, states like Arizona, Oklahoma, Virginia and Wisconsin are providing these computer results to judges during criminal sentencing to affect the judge's decision.
Joining me to talk about this is RT Correspondent, Brigida Santos. Brigida, they call this new application a risk assessment, where a computer’s making basically a judge’s decision. Give me your take on this. This is moving into weird territory if you’re a criminal defendant these days.
Brigida Santos: Absolutely, this is so bizarre. It reminds me of Minority Report. What these risk assessments are, they're algorithms that provide an assessment of whether a defendant is likely to commit a crime in the future. So they have not committed that crime yet, and yet these algorithms are being used by prosecutors across the nation to determine a defendant's sentencing, whether it'll be harsh or whether it'll be a long one. And they're really, really bad at predicting the outcome, and they're incredibly biased. In fact, they were wrong 80% of the time when they falsely said that people were likely to commit a violent crime in the future, which means they only got it right 20% of the time. Now, when it comes to determining whether someone is likely to commit a lesser crime, it was a little bit closer in accuracy, about 60%, so slightly over a coin toss, certainly not good enough to justify sending someone to jail for a long time or giving them a harsher sentence, Mike.
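[Editor's note: The accuracy figures cited here can be illustrated with a quick back-of-the-envelope calculation. The counts below are hypothetical, chosen only to reproduce the rates quoted in the interview; they are not Northpointe's or ProPublica's actual data.]

```python
# Sketch of how a "right only 20% of the time" figure is computed:
# of everyone the tool flagged as high risk, what fraction actually re-offended?

def precision(true_positives, false_positives):
    """Fraction of flagged defendants whose prediction came true."""
    return true_positives / (true_positives + false_positives)

# Hypothetical: 100 people flagged as likely to commit a violent crime,
# only 20 of whom went on to do so.
print(precision(true_positives=20, false_positives=80))   # 0.2 -> right 20% of the time

# Hypothetical counts mirroring the ~60% figure for lesser crimes.
print(precision(true_positives=61, false_positives=39))   # 0.61 -> barely above a coin toss
```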
Mike Papantonio: Wow, we have computers making projections about a person's future, that's really where this lands. The US Sentencing Commission was asked to investigate this, but reports say that they never did. They said, we want to take a look at this before you unfold it, before you put it out there, we want to understand all the intricacies. We want to understand those numbers, for example, that you just talked about. Is there any reason for this, what was the breakdown here, why was there no investigation?
Brigida Santos: So it's unclear why there was no investigation. Now, back in 2014, then-Attorney General Eric Holder is the one who asked for this report. Since then, nothing has been done. It's likely that maybe it fell by the wayside with Attorney General Jeff Sessions coming into play here, but we really don't know why there has been no investigation, which is why ProPublica, an independent media watchdog, conducted this investigation on their own, Mike.
Mike Papantonio: So the way sentencing takes place is you have the defendant make their case in front of the judge, they tell about their past history, they tell the intricacies of what actually happened, there's no computer to assess that, the judge has to make that assessment. The threat here is that judges fall into this realm where this is what they go by, and that way when they're asked, well gee whiz, why did you give this sentence or that sentence, and they happen to be running for office again, they blame it on the computer. Look, Northpointe is the company that created this artificial intelligence risk assessment. They dispute that there's anything biased about their computer's analysis. What methods do they use to determine high risk for a judge when it comes to judging a repeat offender, for example? What's your take on that?
Brigida Santos: So Northpointe, of course, is a for-profit company, so they are gonna criticize any report that could threaten their bottom line, as this ProPublica one has. So they criticize it, they do not agree with the methodology, but they refuse to disclose exactly how it is that they conduct these assessments. They will not disclose how they calculate it. What we do know, based on this 137-question survey, is that things like education level, employment status and a defendant's parents' criminal history are factored in. We also know that certain questions are asked to determine whether a defendant says they would be likely to steal in the future because they are poor, but it does not specifically, outright ask about race, Mike.
Mike Papantonio: Is there anything out there that explains the point of having a computer determine whether a person's gonna have a criminal future, is there anything out there that says this is a bad idea? It almost seems like this goes against the idea of innocent until proven guilty. This AI method looks like it's saying, you know, you're gonna be guilty and you can try to convince me that you're not gonna be guilty down the road, that you're not gonna be in front of me again, but the computer says otherwise. What's your take?
Brigida Santos: Again, this is just like Minority Report. Artificial intelligence is a new and emerging technology, and we really don't know what the long-term consequences of implementing it in the criminal justice system are going to be. In fact, many new studies have found artificial intelligence is racist, sexist and even classist. Now, this is not something that is done intentionally by the people who design it, but again, 75% of all scientists and engineers in the United States are white men, so they're likely unintentionally injecting their own unconscious bias into these programs that they're creating, and therefore you're getting this odd outcome. Elon Musk has also warned that AI poses the greatest existential threat to humanity because AI learns from the world around it, which means it's also learning all of the flawed things that humans have learned as well, Mike. So this is very troubling. It's really sad to see that this is determining people's futures.
Mike Papantonio: That’s scary in and of itself, Brigida, that last statement.
Brigida Santos: Absolutely.
Mike Papantonio: But statistically, how off were these computer assessments when determining repeat offenders? If you were to give me just an overall number, would this ever be something where we're gonna say we're safe building this into the process, it's right this number of times? You started off by giving us those numbers, circle back on that and tell us one more time what the failure rate is.
Brigida Santos: All right, so whites are labeled low risk far more often than blacks, and in fact, black defendants are falsely labeled as likely to re-offend at double the rate, so twice as much. So that's very bad. And in plea bargaining outcomes, it's even more troubling, because the greatest disparities between races actually show up when defendants have no prior criminal history, which means that even though somebody has not committed a crime in the past, there's no conviction, this is labeling them as if they've already committed one.
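[Editor's note: The "double the rate" disparity refers to false positive rates computed separately per racial group: among defendants who did not go on to re-offend, what share were still labeled high risk? A minimal sketch, using illustrative counts in the ballpark of ProPublica's published figures (roughly 45% vs. 23%), not the actual dataset:]

```python
# Per-group false positive rate: of the people in a group who did NOT
# re-offend, what fraction did the tool wrongly label high risk?

def false_positive_rate(labeled_high_risk, did_not_reoffend):
    """Share of non-re-offenders wrongly labeled high risk."""
    return labeled_high_risk / did_not_reoffend

# Illustrative counts only (hypothetical cohorts of 100 non-re-offenders each).
black_fpr = false_positive_rate(labeled_high_risk=45, did_not_reoffend=100)
white_fpr = false_positive_rate(labeled_high_risk=23, did_not_reoffend=100)

print(black_fpr / white_fpr)  # roughly 2, the "double the rate" in the interview
```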
Mike Papantonio: Wow.
Brigida Santos: So that’s what’s going on right now in criminal justice.
Mike Papantonio: Well, Brigida, thanks for joining me, okay?
Brigida Santos: Thank you.