

The Prospects of Constitutional Challenges to COMPAS Risk Assessment

Christopher Miller


  • Northpointe, the original developer of COMPAS, has changed its name to Equivant and has repositioned its marketing: rather than confront the fact that COMPAS is a racially biased algorithmic risk assessment tool, it now presents itself as a company that develops “software for justice.”
  • This re-marketing is the sort of soft change that we can expect from limiting our social regulation of these technologies to constitutional challenges in court.
  • To ensure that functions traditionally assigned to the state do not fall into the hands of market-driven software developers, there needs to be more regulation by experts in law and technology prior to the induction of these products into our criminal legal system, where they will disproportionately affect the liberty interests of those most vulnerable.

State-Sponsored Market Fundamentalism in Technological Development

Ideas about privacy and the role of technology in fostering privacy are socially and culturally determined. It is extremely important that technology remain open to democratic intervention and regulation. Just as in any other sector of social life, technology should not be allowed to validate its own control over the flow of information. It is common knowledge that tech companies are largely self-regulated in their efforts to collect as much data about us as possible and sell those data to the highest bidders. However, it is not simply big business seeking the data that we create in our interactions with online platforms. The state is also a major client. By pandering to the state’s infatuation with surveillance and preemptive risk assessment, tech developers stand to make private fortunes building precision tools that help the state hold individuals captive. Systems of surveillance and risk assessment existed long before the on-screen form in which we know them today, but there is still something disheartening about seeing the power of technology squandered on perpetuating systems of oppression. There is no reason why technology cannot be harnessed to create alternatives; it is simply a matter of incentivizing such ingenuity.

Privacy in the Criminal Legal System

In our technological era, the association of privacy with protection against pesky advertising and public relations schemes is common. It is less common to associate privacy with the protections afforded to an individual charged with committing a criminal offense or convicted of such an offense under the U.S. Constitution. However, technologies are certainly being developed that implicate the rights afforded in criminal procedure.

One such technology is Correctional Offender Management Profiling for Alternative Sanctions (COMPAS). This is a commercially available predictive algorithm used in different jurisdictions across the country to assist judges in pretrial release, sentencing, and parole determinations. By providing the presiding judge with a risk assessment report, the proprietors claim that the judge is better equipped to incorporate mitigating factors and alternative sanctions while eliminating unwarranted sentencing disparities. This is a linear development building on the implementation of the Sentencing Reform Act of 1984, which outwardly claimed that its primary goal was to eliminate outright animus from sentencing (however, by failing to create exceptions for low-level offenders, this reform was a major force behind the harms enacted upon the Black community through mass incarceration and the war on drugs, thereby reinforcing racial disparities in prison populations). COMPAS holds itself out as a way for the judiciary to remedy the harm of mandatory minimums, ensuring that judges apply mitigating factors and alternative sanctions where they exist.
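COMPAS’s internal factors and weights are proprietary and have never been disclosed, so nothing below reproduces the actual instrument. As a purely hypothetical sketch of how actuarial tools of this general type tend to operate, though, one can picture a weighted sum over questionnaire answers that is then ranked against a norming sample to produce a decile score; every name and number here is invented for illustration:

```python
# Purely hypothetical sketch of an actuarial risk tool.
# COMPAS's actual factors, weights, and scaling are proprietary
# and are NOT reproduced here; all names and numbers are invented.
WEIGHTS = {
    "prior_arrests": 0.6,         # invented weight
    "age_at_first_arrest": -0.3,  # invented weight
    "employment": -0.4,           # invented weight
}

def raw_score(answers):
    """Weighted sum over questionnaire answers (all item names invented)."""
    return sum(WEIGHTS[item] * value for item, value in answers.items())

def decile(score, norming_sample):
    """Rank a raw score against a norming sample and map it to a 1-10
    decile. This is the group-data comparison the Loomis court cautioned
    about: the score describes groups, not a particular individual."""
    rank = sum(1 for s in norming_sample if s <= score) / len(norming_sample)
    return min(10, int(rank * 10) + 1)
```

A real instrument would also require validation studies and periodic re-norming against local populations, which is precisely the ongoing monitoring the Loomis court later said such tools demand.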

Studies have shown that COMPAS is no more effective at predicting recidivism than a human using traditional statistical methods for assessing recidivism. This may lead one to question whether we are simply taking offense at the image of a computer doing the work for which a statistical analyst would otherwise have been responsible. However, across various practice areas—like housing and credit and lending—it is clear that predictive algorithms statistically discriminate against minority defendants. See also Bernard E. Harcourt, “Risk as a Proxy for Race: The Dangers of Risk Assessment,” 27 Fed. Sent’g Rep. 237 (2015) (remarking on discriminatory effects of actuarial instruments in parole as an extension of the failures of selective incapacitation). In analyzing the risk of recidivism between Black and white defendants as measured by the COMPAS algorithm, researchers found that white defendants were more likely to be falsely categorized as low-risk, while Black defendants were more likely to be falsely categorized as high-risk.
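The disparity researchers identified is, in statistical terms, a difference in group-wise error rates: white defendants saw more false negatives (rated low-risk but reoffended), Black defendants more false positives (rated high-risk but did not reoffend). A minimal Python sketch, using invented records rather than any real data, shows how such rates are computed:

```python
# Illustrative only: synthetic records, not real COMPAS data.
# Each record: (group, predicted_high_risk, reoffended)
records = [
    ("A", True, False), ("A", True, True), ("A", False, True), ("A", True, False),
    ("B", False, False), ("B", True, True), ("B", False, True), ("B", False, False),
]

def group_rates(records, group):
    """Return (false positive rate, false negative rate) for one group."""
    rows = [r for r in records if r[0] == group]
    fp = sum(1 for _, pred, actual in rows if pred and not actual)
    fn = sum(1 for _, pred, actual in rows if not pred and actual)
    negatives = sum(1 for _, _, actual in rows if not actual)  # did not reoffend
    positives = sum(1 for _, _, actual in rows if actual)      # did reoffend
    fpr = fp / negatives if negatives else 0.0
    fnr = fn / positives if positives else 0.0
    return fpr, fnr

for g in ("A", "B"):
    fpr, fnr = group_rates(records, g)
    print(f"group {g}: false positive rate={fpr:.2f}, false negative rate={fnr:.2f}")
```

Even when overall accuracy is similar across groups, these two error rates can diverge sharply between them, which is the core of the published criticism of COMPAS.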

While the discriminatory impact of risk assessment tools may not be all that different from that of their analogue predecessors, within the criminal legal system, COMPAS reports concerning prisoners have become a veil behind which constitutionally fraught determinations are made. Criminal law jurisprudence generally provides a low standard of respect for privacy after an individual has been charged with a crime, which sinks even lower after that person has been convicted. However, some constitutional protections exist for people who have been charged with or convicted of a criminal offense. These opaque determinations, issued across the nation by an inaccessible algorithmic framework, have created friction with those constitutional principles that has yet to be resolved.

Constitutional Issues with COMPAS at Sentencing

The doctrine of sentencing exceptionalism—i.e., the idea that a sentencing court need not recognize the substantive constitutional limits placed on other aspects of a criminal proceeding because it would interfere with the judiciary’s ability to impose a proper individualized sentence—has been problematic in the past and continues to present an obstacle to due process in the era of predictive algorithms like COMPAS.

For instance, the Fifth Amendment’s guarantee against double jeopardy prohibits punishing, or attempting to punish, someone twice for the same offense. See Witte v. United States, 515 U.S. 389, 396–97 (1995). However, at sentencing, many jurisdictions allow judges to consider acquitted conduct and uncharged criminal conduct. See United States v. Watts, 519 U.S. 148, 156–57 (1997). COMPAS compounds this problem by further limiting the defendant’s capacity to challenge the sentencing decision: the acquitted conduct is not directly accessible to the defendant but is filtered through a computer-generated report that offers no explanation of how it reached its conclusions.

Moreover, in criminal law, it is a principle of due process that individuals be given notice of prohibited activity. The government may punish individuals only for past acts, and if a statute does not provide proper notice as to the conduct to be avoided, it may be unconstitutional. See Smith v. Goguen, 415 U.S. 566, 581–82 (1974). And yet, a major purpose of the COMPAS program is to assign sentencing enhancements on the basis of predictions of future dangerousness. The possibility of being labeled a habitual offender, with that status affecting a person’s sentence, existed before COMPAS; now, however, these biases are built into the programmatic architecture of sentencing itself.

Finally, when a sentencing judge departs upward from the applicable guidelines’ sentencing range, the judge must provide “the parties reasonable notice that it is contemplating such a ruling.” See Burns v. United States, 501 U.S. 129, 138–39 (1991). Failure to do so may conflict with due process. Given that COMPAS reports influence whether a judge enhances an individual’s sentence, defendants should have a right to notice when a report contributes to a decision to depart upward from the guidelines.

Loomis and Henderson

In July 2016, the Supreme Court of Wisconsin issued a decision in State v. Loomis. The defendant, Loomis, sought to challenge the use of a COMPAS risk assessment report at sentencing, claiming, based on the proprietary nature of COMPAS, that he was prevented from assessing the report’s accuracy in violation of due process. The court found that the defendant’s due process right to be sentenced on the basis of accurate information was not violated because (1) the COMPAS report contained information based on the defendant’s answers to questions, along with other publicly available information; and (2) the defendant had access to the presentence investigation (PSI) report, which contained the defendant’s assigned risk score. However, the court did set out certain limitations:

Any Presentence Investigation Report (“PSI”) containing a COMPAS risk assessment filed with the court must contain a written advisement listing the limitations. Additionally, this written advisement should inform sentencing courts of the following cautions as discussed throughout this opinion:

  • The proprietary nature of COMPAS has been invoked to prevent disclosure of information relating to how factors are weighed or how risk scores are to be determined.
  • Because COMPAS risk assessment scores are based on group data, they are able to identify groups of high-risk offenders—not a particular high-risk individual.
  • Some studies of COMPAS risk assessment scores have raised questions about whether they disproportionately classify minority offenders as having a higher risk of recidivism.
  • A COMPAS risk assessment compares defendants to a national sample, but no cross-validation study for a Wisconsin population has yet been completed. Risk assessment tools must be constantly monitored and re-normed for accuracy due to changing populations and subpopulations.
  • COMPAS was not developed for use at sentencing, but was intended for use by the Department of Corrections in making determinations regarding treatment, supervision, and parole.

State v. Loomis, 371 Wis. 2d 235, 264–65 (2016).

The court’s opinion in Loomis essentially implemented a mandatory disclaimer on the practice of using a COMPAS risk assessment at sentencing.

This disclaimer would take on more importance in the matter of Henderson v. Stensberg, No. 18-cv-555-jdp (W.D. Wis. 2020). The plaintiff, Henderson, is an inmate in Green Bay, Wisconsin. He claims that the defendants, Department of Corrections employees, violated his rights under the Equal Protection Clause of the Fourteenth Amendment when they continued to support the use of COMPAS for parole decisions even though they were aware that the program had been shown to be biased against African Americans. The Department of Corrections sought to update the software, but the developers of COMPAS would not perform the update without additional payment. Although the case’s posture is relatively undeveloped, the district court judge denied the Northpointe defendants’ motion to dismiss, stating that the Loomis decision raised “concerns regarding how a COMPAS assessment’s risk factors correlate with race.” This decision is significant because it could set precedent for recognizing equal protection violations in the use of racially biased predictive algorithms.

Demand for Regulation in Technology That Affects the Administration of Traditional State Functions

The regulation of new technology is a contested subject. Some believe that traditional command-and-control regulation would be inappropriate in emerging fields, creating a regulatory framework that could not keep pace with the development of technology. Advocates of this position see hybrid forms of self-regulation as optimally suited to overcome the resource asymmetry between the state and private industry regarding experience and expertise in these specialized fields. However, while this argument about the inability of the regulatory state to keep up with the pace of emerging science may hold true when it is applied to fields like nanotechnology and gene modification, the science behind programs like COMPAS is not so complex as to hinder legal professionals from understanding the architecture. It operates on socio-legal values that are not foreign to legal professionals; therefore, there is less likelihood of the regulatory disconnect that is feared in regulation of other new technologies.

To say that the state is prevented from regulating technology, such as COMPAS, because of a resource asymmetry is misleading because the state has a wealth of resources related to recognizing and implementing due process, an essential element of all criminal proceedings. In contrast, Northpointe/Equivant merely has the technical knowledge to design what is essentially a file management system. The state is not at the mercy of COMPAS’s developers to know what is wrong with such a system. This is a specific instance where those in traditional regulatory positions have the precise experience and expertise to devise a workable solution.

The Henderson matter is an example of the failure of a contractual model of regulation, where the state and the private service provider negotiate terms, and if the private service provider fails to perform, it is assumed that the state may either take legal action to enforce the contract or walk away from it without consequence. The state contracted with a company that provided a racially biased product for making sentencing and parole determinations; when the state sought to update the software to remove this bias, the contractor demanded more money. The state refused to pay, and it now faces an equal protection claim for continuing to use the program. The contractual model failed because, unless the state could show that Northpointe/Equivant breached the initial agreement by providing a racially biased product, the state would have no legal recourse to compel Northpointe/Equivant to perform the update without additional cost. Further, the state did not account for the coercive power held by the contractor: because all of the state’s criminal administrative records were already within the system, replacing it would impose a severe burden on the state.

Fault could be attributed to state authorities for failing to incorporate the proper protective conditions in the state’s contract, yet the contractor’s conduct could also be interpreted as a sort of extortion, holding criminal defendants’ liberty interests ransom while telling the state to pay up. This commercial practice may be regulated by the Federal Trade Commission (FTC), which already has an established Technology Enforcement Division dealing with antitrust violations in technology. In addition, associations of legal professionals such as the American Bar Association could intervene by drafting standards for the use of risk assessment software in criminal processes that are more narrowly tailored to protect against racial bias. Moreover, where the government sees a problem that requires the heavy hand of command-and-control regulation, it has created new agencies to address it. For instance, the Consumer Financial Protection Bureau was created in 2010 to address widespread discrimination in lending practices. The challenges presented in regulating COMPAS are demonstrative of a more systemic problem with technology developed to perform traditional public functions: a marked resistance to public accountability. This may be grounds for establishing a new agency to keep technologies that perform a public function in line with public trust and constitutional values.

