Foreword
I am pleased to present Judge Noel L. Hillman as our guest technology columnist for this issue of The Judges’ Journal, in which he writes that judicial use of artificial intelligence (AI) at sentencing to predict a criminal defendant’s risk of recidivism is a concerning development that should be met by sentencing courts with skepticism and close scrutiny. Although the subject of AI has been addressed in past technology columns, Judge Hillman’s thoughtful discussion of reliance on AI at sentencing addresses issues of due process, reliability, and the human nature of the sentencing process, and it may prove prescient.
You will not be alone if, after digesting Judge Hillman’s thoughts on the use of AI at sentencing, your mind conjures up memories of Minority Report, a 2002 film about “Precogs” in the year 2054 who visualize crimes before they occur. The precognition demonstrated in that film classic allows law enforcement personnel to arrest and imprison would-be murderers before they commit their dastardly deeds. Yes, there is an eerie similarity between the storyline of Minority Report and judicial reliance on AI at sentencing.
Thank you, Judge Hillman, for enhancing the technology awareness of our readers.
—Judge Herbert B. Dixon Jr., Technology Columnist, The Judges’ Journal, [email protected], Twitter @Jhbdixon
Editor’s note: This is the first guest technology column the magazine has featured since Judge Dixon commenced this series in 2007 as our regular technology columnist. We invite our technology-savvy readers to consider writing a future guest technology column. If you are interested, contact the editor or Judge Dixon.
Recently, numerous states have begun experimenting with the use of artificial intelligence (AI) as a tool to predict the risk of recidivism for criminal defendants and to consider that assessment at sentencing.1 At least one defendant has unsuccessfully challenged on appeal, on due process grounds, the use of this new technology for such purposes.2 These experiments may presage the growth of such use in critical decision-making at sentencing and reflect an increasing comfort level with AI among judges. A trend toward the use of AI predictive technology at sentencing is a concerning development that should be met by sentencing courts (federal and state alike) with skepticism and close scrutiny.
This article identifies at least three reasons why: (1) the use of AI at sentencing may violate basic tenets of due process, (2) current AI technology presents unacceptable risks of error and implicit bias, and (3) reliance on AI to predict recidivism improperly cedes the discretionary sentencing power to nonjudicial entities. In short, to date, the use of AI at sentencing is potentially unfair, unwise, and an imprudent abdication of the judicial function.
Due Process
Judges typically have wide latitude at sentencing to craft a sentence that reflects the policy goals in the federal sentencing statute.3 In the federal system, after the U.S. Supreme Court in United States v. Booker4 made the sentencing guidelines advisory rather than mandatory, courts have largely been free to depart, or more often vary, from the guidelines so long as they articulate the factual and legal foundation for the exercise of their sentencing discretion. For many federal criminal statutes, the potential swing from the sentencing floor to the maximum sentence is measured in decades behind bars.5 The applicable ranges in state courts may be even broader.
Before a court may apply the legal standards set forth in the federal sentencing statute, it must first establish the relevant facts, or, as I say in this context and others: “Facts first; law second.” However, the establishment of relevant facts for sentencing purposes is an imperfect science. While many facts are expressed or implied by a jury verdict or, more commonly, are admitted pursuant to a plea agreement and colloquy with the defendant in open court,6 most facts important to sentencing discretion are established through the Presentence Investigation and Report (PSR) process, with many states having a counterpart for state court convictions. The PSR is prepared by a probation officer and, while the process is largely transparent with input from the defendant and counsel, the officer obtains the information through ex parte interviews of third parties and a review of documents often containing multiple layers of hearsay. Matters stated as fact in the PSR are often important and even decisive in determining relevant conduct, whether a specific offense characteristic applies, or whether a departure or variance from the guidelines is warranted.
Unlike facts found by a jury beyond a reasonable doubt, sentencing facts are adjudged by the court under a preponderance standard. The use of hearsay and the lower standard of proof counsel courts to proceed with caution when facts in the PSR are contested or are capable of multiple and often nuanced meanings. For these reasons and others, a significant body of case law directs federal sentencing courts to disregard, and expressly state on the record their intention to disregard, contested material facts that cannot be resolved without unduly delaying the sentencing process.7
The use of predictive AI technology heightens these concerns. An algorithm-generated risk assessment score presents itself to the court as a presumptive factual determination. In essence, predictive technology becomes another witness against the defendant without a concomitant opportunity to test the data, assumptions, and even prejudices that underlie the conclusion. A predictive recidivism score may emerge oracle-like from an often-proprietary black box.8 Many, if not most, defendants, particularly those represented by public defenders and counsel appointed under the Criminal Justice Act9 because of indigency, will lack the resources, time, and technical knowledge to understand, probe, and challenge the AI process. A lawyer can review a police report that underpins a PSR or send an investigator to question a witness who provided information to the probation officer. By contrast, issuing a subpoena to a software developer to be questioned at a sentencing hearing (and then having the knowledge and expertise to effectively question that witness at sentencing) seems an unlikely and unwieldy solution to a real problem. While there is no Sixth Amendment right of confrontation at sentencing, basic tenets of fairness and due process should still govern a process that profoundly impacts liberty through incarceration.
Reliability
Separate and apart from concerns over procedural due process is the issue of whether recidivism models are based on the proper characteristics and factors and are objectively reliable. There is good reason to question whether they are. While machine learning continues to improve and the very nature of AI hints at structured, rigorous, and evolving methodologies, one is reminded of the adage “garbage in, garbage out.” We are, of course, well past IBM punch cards and, hopefully, largely past human input error. However, recidivism risk modeling still involves human choices about what characteristics and factors should be assessed, what hierarchy governs their application, and what relative weight should be ascribed to each.
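To make that point concrete, consider a deliberately simplified sketch, in Python, of a weighted checklist of the kind described above. Every factor name, weight, and cutoff below is invented for illustration; none of it describes COMPAS or any actual risk assessment tool. The sketch shows only that each element of such a score is a human design choice, not a discovered truth.

# Hypothetical sketch only: a toy recidivism "score" built from a weighted
# checklist. All factor names, weights, and cutoffs are invented for
# illustration and do not describe COMPAS or any real tool.

WEIGHTS = {                      # a developer decides what to assess...
    "prior_convictions": 2.0,    # ...and how heavily each factor counts
    "age_under_25": 3.0,
    "unemployed": 1.5,
    "unstable_housing": 1.5,
}

def risk_label(answers: dict) -> str:
    """Sum the weights of the factors present, then bin into a label."""
    total = sum(w for factor, w in WEIGHTS.items() if answers.get(factor))
    if total >= 4.0:             # the cutoffs, too, are human choices
        return "high"
    return "medium" if total >= 2.5 else "low"

print(risk_label({"age_under_25": True, "unemployed": True}))  # "high"
print(risk_label({"prior_convictions": True}))                 # "low"

Re-weight “age_under_25” from 3.0 to 1.0 and the first defendant’s total drops from 4.5 to 2.5, moving the label from “high” to “medium”: the same person receives a different score based on nothing more than a design decision.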
One assumes that the very nature of AI is to learn and to adjust its output as the data set expands. But that prospect highlights rather than alleviates the concern: each recidivism score is merely the best the technology can offer at a given moment. It would be cold comfort to a defendant facing a high recidivism score to know that future defendants will be scored by an improved and, hence, more accurate program or, worse, that an earlier version of the software had been shown to incorporate bias.10
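A second contrived sketch, again with invented weights, shows how the “same” defendant can receive materially different scores under successive versions of a hypothetical tool retrained on a larger data set:

# Contrived illustration: one defendant, scored under two versions of a
# hypothetical tool whose weights shifted after retraining. All numbers
# are invented for this sketch.
defendant = {"prior_convictions": 3, "age": 22}

def score_v1(d):  # the weights as originally shipped
    return 1.25 * d["prior_convictions"] + 0.25 * max(0, 30 - d["age"])

def score_v2(d):  # the weights after retraining on more data
    return 0.75 * d["prior_convictions"] + 0.125 * max(0, 30 - d["age"])

print(score_v1(defendant))  # 5.75 -- the score used at sentencing
print(score_v2(defendant))  # 3.25 -- the same defendant, one version later

Neither score is self-evidently the “right” one; the defendant simply receives whichever version is current on the day of sentencing.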
Human Aspects of the Sentencing Process
When done correctly, the sentencing process is more art than science. Sentencing requires the application of soft skills and intuitive insights that are not easily defined or even described. Sentencing judges are informed by experience and the adversarial process. Judges also are commanded to adjust sentences to avoid unwarranted sentencing disparity on a micro or case-specific basis that may differ from national trends.11
The final sentencing hearing itself is an opportunity to round out and shed light on the factual and legal determinations of the PSR. Sentencings often are attended by family members and other supporters of the defendant. In addition to letters written on the defendant’s behalf, supporters may give testimony and proffer facts relevant to assessing the risk of recidivism, such as housing stability, nurturing and protective family relationships, and offers of future employment. Prosecutors, too—who may have conducted proffers with the defendant, interviewed witnesses and co-defendants who knew the defendant well, and most likely oversaw the process of identifying, investigating, and prosecuting the case as a whole—often have valuable insights, both positive and negative, into recidivism risk. Victims also have due process rights and may testify on matters equally relevant to recidivism risk.
Significantly, the defendant has a right of allocution at sentencing.12 This right is exercised routinely, and the defendant may be asked questions directly by the sentencing judge. Just as judges instruct juries that they should assess a witness’s credibility as they would anyone else they encounter in everyday life, judges make credibility determinations about statements of contrition and remorse—character traits directly relevant to recidivism risk. Importantly, these assessments come at the time of sentencing, after the gathering of data and application of AI-based risk assessment tools, and are therefore not reflected in these scores. These critical components of the sentencing process, which cannot be reduced to a precise formula, mean that sentencing should not be merely the function of an algorithm or electrons coursing through integrated circuits. It is a uniquely human and dynamic endeavor.13
Conclusion
I am not a Luddite. I appreciate what AI allows and will allow in the future. AI will continue to find meaningful and helpful applications both outside and inside the law.14 The general concept of Moore’s Law as applied to AI means that AI will improve exponentially, and when AI learns to improve itself (as opposed to merely improving on the results of a designated task), there may be few limits to its use.15 But until that day comes, sentencing should continue to be counseled by hope, caution,16 wisdom, critical inquiry, experience, and an intuitive and subjective assessment of an individual defendant’s industry, credibility, contrition, relationships, challenges, and promise.
Liberty is a precious right, and its preservation a basic goal of our Constitution. For a defendant facing sentencing in the federal system, this means a sentence no longer than necessary to deter future criminal activity.17 For the public at large whose rights to an effective criminal justice system are equally important, liberty means a sentence sufficient18 to preclude the defendant from denying other more law-abiding citizens their right to safety and security in a free society.19 For now, striking that difficult balance is best left to man and not machine.
Endnotes
1. I would like to thank Prof. Francis McGovern of Duke Law School; the Hon. Lee Rosenthal, Chief Judge of the U.S. District Court for the Southern District of Texas; and Jeff Ward, director of the Duke Center on Law & Technology, for inspiring the thoughts expressed in this article, which along with any errors or omissions are the author’s alone. For a thoughtful and more comprehensive discussion of these issues, see Danielle Kehl, Priscilla Guo & Samuel Kessler, Algorithms in the Criminal Justice System: Assessing the Use of Risk Assessments in Sentencing 15–16 (Responsive Cmty. Initiative, Berkman Klein Ctr. for Internet & Soc’y, Harv. L. Sch. July 2017), available at http://nrs.harvard.edu/urn-3:HUL.InstRepos:33746041. Some states require the use of evidence-based tools at sentencing by statute, while others merely permit their consideration. Id. (collecting state statutes). See also Francis X. Shen, Neuroscience Evidence as Instant Replay, J.L. & Biosciences at 5 (Minn. Legal Studies Research Paper No. 16-23, July 12, 2016) (predicting the increased use of neuroscience at the sentencing phase of criminal proceedings).
2. State v. Loomis, 881 N.W.2d 749 (Wis. 2016).
3. Gall v. United States, 552 U.S. 38, 46 (2007) (“Our explanation of ‘reasonableness’ review in [United States v. Booker] made it pellucidly clear that the familiar abuse-of-discretion standard of review now applies to appellate review of sentencing decisions.”). Proper sentencing goals are just punishment, general and specific deterrence, and, where appropriate, rehabilitative care. 18 U.S.C. § 3553(a).
4. 543 U.S. 220 (2005).
5. A good example is the federal drug laws. Depending on the type and amount of drugs involved, a defendant may face a statutory sentencing range of 0 to 20 years, 5 to 40 years, or 10 years to life. 21 U.S.C. § 841(b)(1)(C), (B) & (A), respectively. Within those ranges, federal sentencing guidelines allow for the consideration of uncharged conduct that may be deemed “relevant” for sentencing purposes. U.S.S.G. § 1B1.3(a)(2) (allowing consideration in certain instances “all acts and omissions . . . part of the same course of conduct or common scheme or plan as the offense of conviction”). Sentencing facts also are used to determine the application of specific offense characteristics that may have a significant impact on the advisory sentencing guideline range. See, e.g., id. § 2D1.1(b)(1) (increasing offense level in drug case “if a dangerous weapon (including a firearm) was possessed”).
6. Fed. R. Crim. P. 11(b)(3) (“[b]efore entering judgment on a guilty plea, the court must determine that there is a factual basis for the plea”).
7. See United States v. Webster, 788 F.3d 891, 892 (8th Cir. 2015) (“A PSR is not evidence and not a legally sufficient basis for findings on contested issues of material fact. If the PSR’s factual allegations are objected to, the government may prove relied-on and contested facts. Then, the court must either make findings by a preponderance of the evidence or disregard those facts.”).
8. See Kehl, Guo & Kessler, supra note 1, at 8 (noting incentives for private companies to keep their algorithms secret). See also State v. Loomis, 881 N.W.2d 749, 769–70 (Wis. 2016) (COMPAS status as a proprietary tool prohibits disclosure of method of calculation). COMPAS, the risk assessment tool at issue in Loomis, is used in Wisconsin, Florida, and Michigan. See Kehl, Guo & Kessler, supra note 1, at 11.
9. 18 U.S.C. § 3006A. The Criminal Justice Act provides federal funds for private attorneys, experts, and services necessary for the adequate representation of indigent defendants where the federal public defender’s office has a conflict of interest.
10. See Kehl, Guo & Kessler, supra note 1, at 28 (citing study raising serious questions about COMPAS and its use of potentially biased data sets). See also Loomis, 881 N.W.2d at 769–70 (suggesting COMPAS may be biased against minority defendants).
11. 18 U.S.C. § 3553(a)(6) (sentence imposed should “avoid unwarranted sentence disparities among defendants with similar records who have been found guilty of similar conduct”). Consistency is a double-edged sword. A sentence must be proportional but also based on individual characteristics. See also Kehl, Guo & Kessler, supra note 1, at 18 (noting the defendant in Loomis argued the use of AI favored national statistics over his individual circumstances).
12. Fed. R. Crim. P. 32(i)(4)(A)(ii).
13. See 18 U.S.C. § 3661 (“In determining the sentence to impose within the guideline range, or whether a departure from the guidelines is warranted, the court may consider, without limitation, any information concerning the background, character and conduct of the defendant, unless otherwise prohibited by law.”).
14. Evidence-based risk assessment tools have been widely used and are particularly well-suited for designing effective rehabilitation programs. Kehl, Guo & Kessler, supra note 1, at 9–10. Although the concerns for such use may mirror those for sentencing as both involve a potential deprivation of liberty, such tools are increasingly used to assess pre-trial risk for flight and danger to the community. Id. at 10.
15. Your author does not refer to the technical application of Moore’s Law to AI. Moore’s Law, which famously predicted the doubling of the computing power of integrated circuits every 18 months, may be slowing in application as chips approach physical limits and actually may be functioning as a check on the growth of AI, which requires ever-increasing computational power and storage capacity. See Moore’s Law, Wikipedia, https://en.wikipedia.org/wiki/Moore%27s_law (noting transistors will eventually reach the limits of miniaturization at atomic levels). Rather, I refer to an analogous exponential growth in the sophistication of AI that not only results in improved functionality but also allows for self-replication, self-repair, and self-transformation to new functionalities. Some futurists, such as Elon Musk, view this prospect with alarm. See Camila Domonoske, Elon Musk Warns Governors: Artificial Intelligence Poses “Existential Risk,” NPR (July 17, 2017), https://www.npr.org/sections/thetwo-way/2017/07/17/537686649/elon-musk-warns-governors-artificial-intelligence-poses-existential-risk (predicting dire consequences from unregulated AI). If Musk is correct, the question will not be whether judges should use AI but whether we will need judges at all. For a different perspective, see Bill LaPlante & Katharyn White, Five Myths About Artificial Intelligence, Wash. Post, Apr. 27, 2018, at 3 (noting the positive impact of AI and questioning Musk’s doomsday prophecies).
16. See Owen D. Jones & Francis X. Shen, Law & Neuroscience: What, Why, and Where to Begin—A Knowledge Brief of the MacArthur Foundation Research Network on Law and Neuroscience 2 (2017) (opining the best approach to the use of neuroscience in court is to be both informed and cautious).
17. 18 U.S.C. § 3553(a) (“The court shall impose a sentence sufficient, but not greater than necessary, to [advance certain enumerated statutory goals].”).
18. Id. Often referred to as the statute’s parsimony phrase, 18 U.S.C. § 3553(a) is actually a double-edged sword that counsels a sentence “not greater than necessary” but also “sufficient.”
19. One of the enumerated goals of the federal sentencing statute is a sentence that “protect[s] the public from further crimes of the defendant[.]” Id. § 3553(a)(2)(C).