The controversial but growing practice of using computers and algorithms to offer advice to judges, parole officers and other officials as they make decisions about individuals’ fates in the criminal justice system was the subject of a lively discussion on Feb. 15 during the American Bar Association Midyear Meeting in Austin, Texas.
Led by a panel of judges from three states and other experts, “The Good, the Bad and the Ugly: The Validity of Risk Assessment Tools in Setting Bail and Drafting Sentencing in the Criminal Justice System” also drew plenty of feedback from an audience of judges, prosecutors and defense attorneys about whether these new tools help, or hurt, the cause of justice.
Artificial intelligence systems crunch large amounts of data – such as a defendant’s past convictions, employment history, age and re-arrests that resulted in a conviction – to estimate the risk that a particular defendant will commit another crime or skip out on bail.
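For illustration only, here is a sketch of the general approach such tools take: fit a statistical model to historical outcomes, then score a new defendant. The features, data, labels and model choice below are invented for this example and do not describe any deployed system.

    # Hypothetical sketch only: a simple statistical risk model of the kind
    # described above. Feature names, data and labels are all invented and
    # do not reflect any real defendants or any deployed tool.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Each row: [prior_convictions, employed (1/0), age, prior_re_arrests]
    X = np.array([
        [0, 1, 35, 0],
        [3, 0, 22, 2],
        [1, 1, 41, 0],
        [5, 0, 19, 3],
        [0, 1, 50, 1],
        [2, 0, 28, 2],
    ])
    y = np.array([0, 1, 0, 1, 0, 1])  # 1 = re-offended in follow-up (made up)

    model = LogisticRegression().fit(X, y)

    # The tool's "advice": an estimated probability, not a verdict.
    new_defendant = np.array([[2, 1, 30, 1]])
    risk = model.predict_proba(new_defendant)[0, 1]
    print(f"Estimated re-offense risk: {risk:.0%}")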
But while supporters say the systems give judges something more than educated guesses to rely on in deciding what happens to individual defendants, critics say they are ineffective and inaccurate, and that they perpetuate the criminal justice system’s well-documented bias against low-income people and people of color.
In a recent National Judicial College poll of 369 judges, a clear majority – 65% – agreed that artificial intelligence could be a useful tool for combatting bias in bail and sentencing decisions but should never completely replace a judge’s discretion.
But the new systems also drew expressions of concern in the poll. Some judges worried that the AI systems themselves have biases baked in because they rely on data in which low-income defendants and people of color may be overrepresented as a result of past arrest patterns.
“A lot of times, this machine becomes your witness,” said panelist W. Milton Nuzum III, director of judicial and education services at the Supreme Court of Ohio. “And how do you cross-examine a machine if you’re a prosecutor or a defense attorney? I don’t know.”
At the panel discussion, moderated by Stephanie Domitrovich, a Pennsylvania state trial judge, Mark Bergstrom, executive director of the Pennsylvania Commission on Sentencing, presented the state’s new Sentence Risk Assessment Instrument, which he said is more transparent than proprietary systems developed by private companies using undisclosed algorithms. The Pennsylvania tool is scheduled to be implemented in July 2020.
While the system rates defendants as high or low risk for re-offending, that rating is not given directly to judges, he said. Instead, in high-risk cases, judges will receive a note on their regular pre-sentencing guidance reports reading “additional information recommended.”
Christina Klineman, an Indiana state trial judge, said her state has just begun using evidence-based risk assessments to help judges make bail decisions. The pretrial Indiana Risk Assessment System weighs seven main factors, such as whether the arrestee was employed at the time of arrest, whether there have been three or more prior jail incarcerations and whether there is a “severe” illegal drug use problem. Research, she said, shows these factors are the best predictors of risk.
Tools like these “are intended to aid judges,” she said. “I think where the concern is when they replace judges’ discretion.”
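As a rough illustration of how a point-style checklist like the one Klineman described might be tallied – with weights and risk bands invented here, not the actual Indiana Risk Assessment System values – consider this sketch:

    # Hypothetical point-based pretrial score. The weights and risk bands
    # are invented for illustration and are NOT the actual Indiana Risk
    # Assessment System values.
    def pretrial_risk_band(employed_at_arrest: bool,
                           prior_jail_incarcerations: int,
                           severe_drug_problem: bool) -> str:
        points = 0
        if not employed_at_arrest:
            points += 1      # unemployment at arrest adds a point
        if prior_jail_incarcerations >= 3:
            points += 2      # three or more prior jail stays weigh heavily
        if severe_drug_problem:
            points += 2      # a "severe" illegal drug use problem adds risk

        # Map the point total to a coarse band a judge could review.
        if points >= 4:
            return "high"
        if points >= 2:
            return "moderate"
        return "low"

    print(pretrial_risk_band(employed_at_arrest=False,
                             prior_jail_incarcerations=4,
                             severe_drug_problem=True))  # -> "high"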
In Ohio, the Department of Rehabilitation and Correction collaborated with the University of Cincinnati to develop an offender risk assessment tool in 2006. It is used not only in Ohio but also in other jurisdictions around the country, said Guy Reece II, an Ohio state trial judge. Employees using the system must be trained and certified.
In Ohio’s Franklin County, Reece said, officials used the system to institute intensive training for probation staff so they could appropriately assess and supervise those on probation.
As a result, probation officers’ caseloads were reduced by more than half, and the length of time they spent with those on probation rose significantly.
“It’s costly, but hopefully in the end it will result in folks being properly supervised – those needing or requiring assistance so that they don’t reoffend will receive the assistance they need,” said Reece, “and that’s just one way of looking at evidence-based practices, so you can really assess folks and give them the assistance they need.”
But a discussion of algorithms predicting individual human behavior drew words of caution from another panelist, Alicia L. Carriquiry, director of the Center for Statistics and Applications in Forensic Evidence at Iowa State University.
While Carriquiry said she is a believer in well-constructed assessment tools, noting they are used in the medical field, she said: “I think that it is safe to say that the criminal justice system has more faith in algorithms than we statisticians that construct them have. Predicting the behavior of a group is easy, and we can do that … [but] when you try to predict an individual’s behavior, the error becomes gigantic to the point of being useless.”
She added: “There are no easy answers.”
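Her point about group versus individual prediction can be made concrete with a quick simulation using invented numbers: the rate for a large group is estimated almost exactly, while the best available guess about any one person is still wrong much of the time.

    # Group-level vs. individual-level prediction, with invented numbers.
    import numpy as np

    rng = np.random.default_rng(0)
    true_rate = 0.30                            # assumed re-offense base rate
    outcomes = rng.random(10_000) < true_rate   # simulated yes/no outcomes

    # Group prediction: the sample rate lands within a fraction of a point
    # of the true rate.
    print(f"group-rate error: {abs(outcomes.mean() - true_rate):.3f}")

    # Individual prediction: the best single guess ("will not re-offend")
    # is still wrong for roughly 30% of individuals.
    print(f"individual error rate: {outcomes.mean():.3f}")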