Artificial Intelligence Stepping into Our Courts: Scientific Reliability Gatekeeping of Risk Assessments
By Judge Stephanie Domitrovich
February 3, 2020

Artificial intelligence–based instruments, in the form of risk assessments, are entering our federal and state courtrooms to assist in sentencing and bail decisions. Risk assessments play a central role in sentencing and ongoing bail reforms and in helping judges make more objective decisions.1 They rely on statistical correlations, derived from empirical research, between a group trait and that group's criminal offending rate. For instance, low-risk offenders can be identified and diverted to alternative community-based consequences rather than expensive, lengthy prison terms. Legislators and executive branch leaders are advocating for wider use of these tools, and the courts are responding in various ways.
At the federal level, the Formerly Incarcerated Reenter Society Transformed Safely Transitioning Every Person (First Step) Act is "perhaps the most far-reaching federal sentencing reform in a generation"; it "mentions risk no less than 100 times and relies on risk assessments to allocate prison programming and prisoner release."2 State legislatures are also mandating the use of risk assessments. For example, the Pennsylvania legislature mandated in 2010 that the Pennsylvania Commission on Sentencing adopt sentence risk assessment instruments for use by its courts. After almost a decade of development, the Commission recently presented its instrument for judges to evaluate the relative risk that offenders will reoffend.
The outcome measures used are derived from statistical models based on factors such as age, gender, prior convictions, prior conviction offense types, current conviction offense types, multiple current convictions, and prior juvenile adjudications. When predicting female recidivism, gender was included to avoid incorrectly classifying women as high-risk offenders; the rate for females, at 20 percent, is substantially lower than the rate for males, at 80 percent. If gender were not considered, fewer females would be classified as low-risk offenders.
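Many actuarial instruments of this kind take the form of an additive point scale: each factor contributes points, and the total maps to a risk category. The sketch below illustrates that general structure in Python; the factors mirror those listed above, but the weights and cut points are invented for illustration only and do not reproduce the Pennsylvania Commission's actual instrument.

```python
# Hypothetical additive "points" risk scale built from the kinds of factors the
# article lists. Weights and cut points are invented for illustration only.

def risk_points(age, gender, prior_convictions, prior_juvenile_adjudications,
                current_offense_type, multiple_current_convictions):
    points = 0
    points += 2 if age < 21 else (1 if age < 30 else 0)   # younger offenders score higher
    points += 0 if gender == "female" else 1               # adjusts for lower female base rate
    points += min(prior_convictions, 5)                    # capped count of prior convictions
    points += 1 if prior_juvenile_adjudications > 0 else 0
    points += 2 if current_offense_type == "violent" else 0
    points += 1 if multiple_current_convictions else 0
    return points

def risk_category(points):
    # Illustrative cut points separating low-, typical-, and high-risk groups.
    if points <= 2:
        return "low"
    if points <= 6:
        return "typical"
    return "high"

print(risk_category(risk_points(age=24, gender="male", prior_convictions=1,
                                prior_juvenile_adjudications=0,
                                current_offense_type="nonviolent",
                                multiple_current_convictions=False)))
```

The simplicity of such a scale is what makes validation tractable: every factor, weight, and cut point can be examined for its contribution to disparate outcomes.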
Other states are using artificial intelligence in their bail reform efforts. In Indiana, cash bonds have hindered indigent individuals, who are overrepresented in the criminal process, from being released pretrial. The Indiana Office of Court Services has gathered data through its pretrial tools to shift judicial resources away from money bail and toward practices that correlate with community safety, appearance at future hearings, and reduced recidivism.3 New York State, by contrast, took the opposite approach in its cash bail reform: risk assessment tools are not used to determine release, and judges are not given "discretion to assess the dangerousness of the accused and whether releasing them would be a threat to the community."4 Law enforcement officials have expressed concern over the new law limiting cash bail and are advocating for its repeal because "dangerous people are being let out as they await the adjudication of their cases."5
Although risk assessments are touted as valuable assets for reducing recidivism, we as lawyers and judges must scrutinize their scientific underpinnings. Critics raise concerns about the scientific reliability of building racially biased data and algorithms into our criminal justice system. For instance, Pennsylvania senators recently introduced a bill to repeal the Pennsylvania Commission's risk assessment tool on allegations that its "algorithm likely has a disparate impact on race due to the utilization of arrest records amassed during a history of over-policing in communities of color."6 Proponents counter that, despite the race or class bias inherent in the data, these impersonal, data-driven tools increase fairness in discretionary sentencing and improve on our current system, in which judges and prosecutors harbor unknown biases based on anecdotal opinions or worse.7
We as judges and lawyers have the responsibility of keeping unreliable science out of our courtrooms. We must be at the forefront of ensuring that risk assessment instruments are scientifically reliable and valid before they are used in court. Validation studies should be conducted on data gathered from the court's own use of these instruments, and experts can advise us on how the instruments are performing in our courts as well as suggest ways to remediate any scientific reliability issues.8
Several best practices for validating risk assessments are recommended:
- Hire an independent party to conduct these validation studies by applying multiple statistical tests;
- Examine race and gender effects for each item to isolate the factors leading to bias;
- Locate specific causes of any racial or gender bias, using multipoint inspections and quality-protocol checklists, whether the cause is human error in administering the tool or the weight given to factors that contribute to unintended bias;
- Perform inter-rater reliability exercises to highlight scoring inconsistencies among assessors (a brief illustrative sketch follows this list);
- Develop focus groups to review how protocols are actually scored in practice versus what written policies require;
- Prepare plans, based on testing results, to remediate racial and gender bias in the tool;
- Review assessor performance by creating protocols to train and monitor assessors through firsthand observation or recordings; and
- Provide staff with updated "booster" training in current, state-of-the-art protocols to prevent inconsistencies in scoring.9
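As a concrete illustration of two of these practices, the sketch below computes a simple disparity check (the share of each group classified high-risk) and Cohen's kappa, a common inter-rater reliability statistic. The data, field names, and categories are hypothetical; a real validation study would apply multiple statistical tests to the court's own scoring records.

```python
# Illustrative checks on hypothetical scoring data: (1) high-risk classification
# rates by group as a first-pass disparity check, and (2) Cohen's kappa as a
# simple measure of agreement between two assessors.
from collections import Counter

def high_risk_rate_by_group(records, group_field):
    """Share of each group classified high-risk."""
    totals, highs = Counter(), Counter()
    for r in records:
        totals[r[group_field]] += 1
        if r["category"] == "high":
            highs[r[group_field]] += 1
    return {g: highs[g] / totals[g] for g in totals}

def cohens_kappa(rater_a, rater_b):
    """Agreement between two assessors, corrected for chance agreement."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - expected) / (1 - expected)

records = [
    {"race": "white", "category": "high"}, {"race": "white", "category": "low"},
    {"race": "black", "category": "high"}, {"race": "black", "category": "high"},
]
print(high_risk_rate_by_group(records, "race"))
print(cohens_kappa(["low", "high", "typical", "high"],
                   ["low", "high", "low", "high"]))
```

Large gaps in group classification rates or low kappa values would not by themselves prove bias or unreliability, but they are the kind of red flags that should prompt the closer inspection the checklist above describes.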
Because many of us did not go to law school on the strength of our science, math, or technology skills, judges and lawyers must be offered educational seminars and training so that we can knowledgeably open or close the door to scientific evidence. To assist us in this gatekeeping role, the Judicial Division Programming Committee for the 2020 ABA Midyear Meeting in Austin, Texas, has chosen the National Conference of State Trial Judges' program entitled "The Validity of Risk Assessment Tools in Setting Bail and Drafting Sentencings in the Criminal Justice System: The Good, the Bad and the Ugly for Judges and Lawyers." Please join us on Saturday, February 15, 2020, at the J. W. Marriott to learn more about this topic, which is important to both lawyers and judges.
Endnotes
1. Jordan M. Hyatt, Mark H. Bergstrom & Steven L. Chanenson, Follow the Evidence: Integrate Risk Assessment into Sentencing, 23 Fed. Sentencing Rep. 266 (Apr. 2011), https://fsr.ucpress.edu/content/23/4/266.
2. Brandon Garrett & John Monahan, Assessing Risk: The Use of Risk Assessment in Sentencing, 103 Judicature, no. 2, Summer 2019, https://judicature.duke.edu/articles/assessing-risk-the-use-of-risk-assessment-in-sentencing.
3. Mary Kay Hudson, Update on Evidence-Based Pretrial Practices, Ind. Ct. Times (Jan. 15, 2019), http://indianacourts.us/times/2019/01/pretrial-part2.
4. Joseph Spector, New York’s Limit on Cash Bail Stirring Controversy Across Communities, Observer-Dispatch (Utica, N.Y.) (Jan. 12, 2020), available at EfficientGov, https://efficientgov.com/blog/2020/01/12/new-yorks-limit-on-cash-bail-stirring-controversy-across-communities; H. Rose Schneider, How NY’s Bail Reform Laws Stack Against Other States, Observer-Dispatch (Utica, N.Y.) (Dec. 16, 2019), https://www.uticaod.com/news/20191215/how-nyrsquos-bail-reform-laws-stack-against-other-states.
5. See Spector, supra note 4.
6. Memorandum for Senate Co-sponsorship from Sen. Sharif Street (Pa. Senate, Sess. of 2019–2020 Reg. Sess.), Repealing the Mandate to Develop a Pre-Sentencing Risk Assessment Tool (Mar. 4, 2019), https://www.legis.state.pa.us/cfdocs/Legis/CSM/showMemoPublic.cfm?chamber=S&SPick=20190&cosponId=28630; see also News Release, Rep. Joanna E. McClinton (Pa. House Democrats), Legislators Vow to Fight Pa. Commission on Sentencing’s Risk-Assessment Tool (Sept. 6, 2019), https://www.pahouse.com/InTheNews/NewsRelease/?id=110318.
7. Charlotte Hopkinson, Using Daubert to Evaluate Evidence-Based Sentencing, 103 Cornell L. Rev. 723, 725–26 (Mar. 2018).
8. CSG Just. Ctr. Staff, Three Things You Can Do to Prevent Bias in Risk, Council of State Gov’ts: Justice Ctr. (July 20, 2016), https://csgjusticecenter.org/jr/posts/three-things-you-can-do-to-prevent-bias-in-risk-assessment.
9. Id.