I. Introduction
The ability to predict and prevent crimes sounds like science fiction. It is, in fact, the premise of the 2002 science-fiction film Minority Report, which was directed by Steven Spielberg and based on Philip K. Dick’s 1956 short story “The Minority Report.” In the film, the audience is transported to the year 2054, where the federal government has implemented a system called “the Precrime Program” in Washington, D.C. This program, designed to respond to the high crime rates of the time, uses advanced technologies and three psychic humans, known as “precogs,” who act essentially as “pattern recognition filters” to predict future murders. A specialized police force is sent to preemptively apprehend the would-be criminals and detain them for an unknown amount of time.
Although the Precrime Program is advertised as infallible because the precogs are thought to “never [be] wrong,” the audience soon discovers that it has been sold a lie when the precogs predict that the protagonist, Precrime chief John Anderton, is going to murder an unknown man. In trying to prove his innocence, Anderton discovers that the precogs can, and sometimes do, disagree on the outcome of future events. These conflicting visions, called “minority reports,” are erased from the official records to preserve the system’s supposed credibility and dependability. Moreover, when Anderton comes face-to-face with the man he will supposedly murder, he realizes that he has been set up and that the system can be manipulated. The audience, following Anderton’s discoveries, comes to understand that the promise of a crime-free world comes at the expense of innocents and that, no matter how advanced technology becomes, it can still be defective or manipulated to the detriment of humanity. The increased use of surveillance systems, spurred by rapid technological advances and the privatization of services in the criminal justice system, has created the perfect conditions for the plot of Minority Report to become reality.
The use of electronic monitoring (EM) has increased in recent years in response to decarceration efforts, bail reform practices, the coronavirus disease 2019 (COVID-19) pandemic, and evolving technologies that have made such monitoring cheaper and more efficient. In the wake of the pandemic, concerns including overcrowding, unsanitary conditions, and institutional racism were at the forefront of conversations regarding mass incarceration. EM seems like a reasonable solution to these issues, and it is advertised as a more humane and desirable alternative to incarceration because eligible justice-involved individuals can remain in their communities instead of going to jail or prison. However, this freedom comes with considerable unintended costs.
Technological advances have greatly widened the possible scope of surveillance, and monitoring agencies now have access to an unprecedented and growing amount of sensitive data. This widening has not been accompanied by appropriate protective measures limiting the amount of data collected. At present, the National Institute of Justice (NIJ) is looking to expand the use of EM, and its reach into users’ private lives, by having Artificial Intelligence (AI) software analyze collected data to predict and prevent criminal behavior. The use of AI in this context is particularly problematic because EM devices have not only been reported as defective, but studies have also shown that African Americans are disproportionately impacted by EM. Relying on this type of faulty and biased technology to predict criminal behavior will likely lead to mistaken and discriminatory arrests. Moreover, the involvement of for-profit corporations in the EM market raises questions about those providers’ motivations in offering these services. As businesses, their financial interests may not align with the rehabilitative interests of the justice system, and, given the currently limited oversight of the providers’ modes of operation, abuses similar to those in Minority Report are a possibility.
Because EM is almost exclusively privatized, the public contracts governing the use of EM must include standardized requirements to mitigate the risks and abuses identified in this Note. This Note will first describe the promises and pitfalls of electronic monitoring. It will then highlight the shortcomings of the relationship between for-profit corporations and government entities with respect to EM, and it will analyze the ways in which this relationship increases the risk of unreasonable and arguably unlawful privacy rights violations. Finally, it will suggest that the United States adopt the provisions of the European Union General Data Protection Regulation to mitigate these risks. Adopting these provisions will ensure that EM of justice-involved individuals is accomplished in a way that fosters cooperation between private corporations and the government and appropriately regulates the management of private data.
II. Background
This Part will provide an overview of the state of EM in the United States. It will first discuss EM’s origins and intended uses. Although EM is often thought of as a better alternative to incarceration, justice-involved people face unreasonable and unintended consequences from the largely unregulated use of EM by for-profit corporations. The law affords limited protections against abuses of this technology, allowing for nearly limitless surveillance, collection, and use of data.
A. The Pervasiveness of Electronic Monitoring
EM devices are becoming increasingly common in the criminal justice system. They are used to monitor people on pretrial release, parole, and probation, as well as juveniles in the justice system and immigrants. According to a 2016 report by The Pew Charitable Trusts, more than 125,000 people were actively being monitored in 2015, compared to 53,000 in 2005. This figure likely understated the actual number of people under surveillance, since the report accounted only for those under pretrial release, parole, or probation supervision, not immigrants in government custody. Although the total number of people being monitored today is unknown, it has almost certainly grown substantially since the report. For instance, in San Francisco, the number of people on monitors has tripled since a 2018 ruling forced courts to release more defendants without bail. Similarly, in Harris County, Texas, the number of people on pretrial monitors increased from twenty-seven in 2019 to over 4,000 in 2021. In Marion County, Indiana, in response to jail overcrowding, the use of EM devices doubled between 2015 and 2020. In Massachusetts, the number of people on EM devices almost tripled in eight years, resulting in 4,100 people, whether on parole, probation, or pretrial release, being monitored in 2020. The increase in the use of EM devices may be linked to a change in the purpose behind its use.
B. The Promise of Electronic Monitoring
1. The Origin
EM devices originated in the 1960s and were spearheaded by two brothers, Robert and Ralph Schwitzgebel (who later changed their surname to Gable). While working with B.F. Skinner, an advocate of behaviorism, the brothers created a device that could apply Skinner’s “reward-and-punishment” theory to juveniles in the criminal justice system. The device, worn on the user’s waistband, allowed for two-way communications through which positive and negative reinforcement could be sent. The brothers viewed the device as a “temporary measure” whose goal was to correct “unwanted behavior.” Although the brothers obtained a patent for their technology in 1964, their idea for a behavior modification device was never widely embraced.
The promise of EM devices resurfaced in the late 1970s when Judge Jack Love of New Mexico stumbled upon two seemingly unrelated ideas: first, a Spider-Man comic strip where the hero tracks the villain’s movements via a wrist monitor, and second, an article about experimental radio transmitter implants in cows. Technology had sufficiently advanced for this idea to materialize, and technology entrepreneur Michael Goss developed the first ankle monitor in 1983 using radio frequency signals to restrict a user’s movement to a specific zone. Subsequently, Judge Love was the first to sentence three men charged with minor drug offenses to a month of electronic monitoring because he “didn’t want to send them to prison or let them off without penalty.”
The adoption of EM devices in the U.S. justice system was gradual at first, but EM is now used by all fifty states, the District of Columbia, and the federal government. Originally designed to modify and rehabilitate behavior, EM now serves as an alternative form of punishment. The technology has been reappropriated, and, as Robert Gable stated, “It has come to be used primarily as a punishment or restraint device,” which he acknowledged “was not our intention.” Today, emerging technologies could once again transform the ways in which EM is utilized.
2. Crime-Fighting Tool?
a. Technological Advances
In Judge Love’s time, the use of monitors was limited by the technology available. The first EM devices relied on Radio Frequency Identification (RFID) technology, which could only restrict a user to a specific geographical area. An RFID monitoring device is composed of two parts: a tag (the ankle monitor) and a reader (a radio device placed within the home). These devices are still used today, although far less frequently, to confine a person to their home. Noncompliance with the conditions imposed is detected when the tag moves out of range of the reader.
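The tag-and-reader mechanism amounts to a simple proximity check. The sketch below illustrates the logic in code; the signal-strength threshold and function names are illustrative assumptions for exposition, not any vendor’s actual implementation:

```python
# Hypothetical sketch of RFID-based home-confinement detection.
# The threshold below is an assumed value for illustration only.

IN_RANGE_THRESHOLD_DBM = -70.0  # assumed minimum signal strength for "at home"

def tag_in_range(signal_strength_dbm: float) -> bool:
    """The reader infers the tag's proximity from its radio signal strength."""
    return signal_strength_dbm >= IN_RANGE_THRESHOLD_DBM

def check_compliance(signal_strength_dbm: float) -> str:
    """Noncompliance is flagged when the tag moves out of the reader's range."""
    return "compliant" if tag_in_range(signal_strength_dbm) else "violation"
```

The point of the sketch is that an RFID system knows only whether the wearer is near the reader; unlike GPS, it records nothing about where the wearer goes otherwise.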
The use of RFID monitors has largely been displaced by the arrival of GPS technology. People are often most familiar with GPS-equipped ankle monitors. These devices can track people in real time without any location restrictions. They are also capable of defining “exclusion zones,” which prohibit users from accessing certain areas such as playgrounds, schools, or a victim’s home or place of work. In addition, some ankle monitors have other behavior tracking features such as audio recording or blood alcohol monitoring.
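The “exclusion zone” feature described above is, at bottom, a geofencing check: the system computes the wearer’s distance from each restricted location and raises a flag when the wearer comes within a zone’s radius. The following sketch illustrates the idea; the coordinates, radii, and helper names are hypothetical:

```python
import math

# Hypothetical sketch of a GPS "exclusion zone" (geofence) check.
# Zone definitions and the distance helper are illustrative assumptions.

EARTH_RADIUS_M = 6_371_000

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def in_exclusion_zone(lat, lon, zones):
    """Each zone is (center_lat, center_lon, radius_m), e.g. a school or a victim's home."""
    return any(haversine_m(lat, lon, zlat, zlon) <= r for zlat, zlon, r in zones)
```

Because the check requires the wearer’s continuous coordinates as input, every exclusion-zone test necessarily doubles as real-time location tracking.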
In recent years, smartphone surveillance applications have become increasingly common. These applications allow agents to track and communicate with those on monitors. Since the smartphone is not physically attached to the person in the same way an ankle monitor is, biometric identification technology is commonly used to ensure the identity and location of the user. Some examples include technology that can recognize a user’s fingerprints or voice.
Although GPS-equipped ankle monitors had some similar features, the growing use of smartphone applications has expanded the ways in which parole and probation are structured. For example, an application can remind users of upcoming court dates or drug tests, can perform virtual check-ins instead of in-person check-ins, and can allow offenders to request permission to move. The development of EM via smartphone applications has increased, and will continue to increase, the prevalence of EM for justice-involved individuals because it is convenient. A majority of people in the United States have smartphones, and monitoring offenders via mobile application is believed to be less stigmatizing than an ankle monitor or other body-attached device because it is less visible. However, because smartphones are “effectively mobile computers with immense processing power,” these applications can also access, collect, and use significantly more data than the older technologies.
b. The Introduction of Artificial Intelligence
In response to the growing use of EM, its net-widening effects, and overwhelming caseloads for correction officers, the NIJ is looking to use AI in the surveillance context. The hope is that machine learning technology can help “detect trends” and anticipate who is most at risk of offending or recidivating. Having access to data available through EM can help correction officers intervene more quickly and address the needs of justice-involved people and those in their communities. The NIJ’s project, which started in 2019, aims to develop AI tools that will improve EM and “(1) provide real-time [Risk-Need-Responsivity (“RNR”)] assessments; (2) promote intelligent offender tracking; and (3) enhance programming through mobile service delivery.”
First, using AI and EM devices, real-time RNR assessments could be obtained by monitoring biological data to assess how an offender’s mood and stress change based on their environment. An increase in blood pressure, heart rate, stress levels, or body temperature, for example, could indicate that the individual is in a risky situation, and an alert could be sent to the community supervisor to intervene or investigate. Second, AI can also be used to enhance the location tracking ability of electronic monitoring devices. Indeed, it can identify how risk is evaluated across different geographical spaces and alert someone if a justice-involved individual enters a zone categorized as “risky.” Finally, just as a person can have a conversation with a virtual assistant on their mobile device, AI as a rehabilitative tool can “engage with offenders, encouraging prosocial . . . behavior.”
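The real-time alerting described above can be understood, in its simplest form, as a thresholding scheme over biometric readings: deviations from a per-person baseline are counted, and a supervisor is notified when enough signals are elevated at once. The sketch below is purely illustrative; the baselines, thresholds, and field names are assumptions, not details of any NIJ or vendor system:

```python
# Hypothetical sketch of biometric "risk" alerting. All baselines,
# thresholds, and field names are illustrative assumptions.

BASELINE = {"heart_rate": 70.0, "blood_pressure": 120.0, "body_temp": 98.6}
DEVIATION_THRESHOLD = 0.15   # assumed: readings 15% above baseline count as elevated
SIGNALS_FOR_ALERT = 2        # assumed: two elevated signals trigger an alert

def elevated_signals(reading: dict) -> list:
    """Return the biometric signals that exceed their baseline threshold."""
    return [name for name, base in BASELINE.items()
            if reading.get(name, base) > base * (1 + DEVIATION_THRESHOLD)]

def should_alert_supervisor(reading: dict) -> bool:
    """Alert the community supervisor when multiple signals are elevated at once."""
    return len(elevated_signals(reading)) >= SIGNALS_FOR_ALERT
```

Even this toy version makes the policy stakes visible: the choice of baselines and thresholds, which here is arbitrary, determines who gets flagged and how often.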
Studies developing and testing the effectiveness of this technology are already underway. For example, in 2020, Purdue Polytechnic Institute, collaborating with Tippecanoe County Corrections, started to develop an “integrated smartphone and health-tracking device” called the AI-based Support and Monitoring System (AI-SMS). To test its impact, researchers will monitor a sample of 250 Tippecanoe County offenders who have been paroled and fitted with the AI-SMS. The researchers will use AI and measure biometric data to determine what type of biological data correlates with risky behavior and how one might intervene to mitigate that risk. As of the preparation of this Note, the project is ongoing and results have yet to be released.
Despite laudable publicized goals of decreasing recidivism rates and helping justice-involved individuals reenter or remain in their communities, Mike Nellis, emeritus Professor of Criminal and Community Justice at the University of Strathclyde, warns that such talk is “little more than seductive, Silicon Valley tech-talk.” As NIJ personnel make clear, the primary objective of using AI for supervision is not rehabilitative but managerial. Moreover, even though these studies may be well-intentioned, they do not seem to consider or mitigate the known biases engendered by machine learning. Using AI in the surveillance context could replicate the discriminatory patterns previously found in facial recognition technology and recidivism prediction software. Hence, because EM is dominated by private corporations and few regulations are in place, this technology is vulnerable to abuse and could further perpetuate discriminatory practices in the criminal justice system.
3. Arguments in Favor of Electronic Monitoring
EM is often thought of as a preferable alternative to incarceration, desired both by those subject to its use and by those seeking prison reform. Barry Latzer, a professor of criminal justice at the John Jay College of Criminal Justice, has argued in favor of increased use of EM devices, stating that “electronic monitoring can reduce incarceration rates while minimizing the risks to public safety—a win-win proposition.” This sentiment is, unsurprisingly, echoed by the leading suppliers of the devices on the market. One of the biggest providers, Sentinel, advertises the product as “the solution,” stating that it helps law enforcement be more efficient so they can focus on preventing crime instead of monitoring justice-involved people. Sentinel even promises that remote tracking “reduces recidivism rates, incarceration rates and therefore the need for additional prison capacity.” The risks accompanying broader use of EM, notably including the largely unregulated collection and use of sensitive data by these private corporations, are “the price someone pays to avoid incarceration.”
EM is also believed to be effective at saving costs. Compared to the daily costs associated with incarceration, which range from $60 to $164, the daily costs of EM range from $3 to $10. However, this assumes that people on EM would otherwise be incarcerated. As researcher Emmett Sanders highlights, “often, people put on EM would be free if it didn’t exist.” Moreover, this perspective ignores how costs are saved. In every state except Hawaii, the cost of EM is passed on to those being monitored. Supervision fees can range from $1.50 per day to $47 per day. Depending on the amount of time that a person is subject to EM, these costs can become a considerable burden and jeopardize a justice-involved individual’s financial stability when they need it the most.
C. The Reality of Electronic Monitoring
1. No Protections Found Within the Law
Although EM devices have been used in the U.S. justice system since the 1980s, the courts do not offer much protection regarding sensitive information gathered by EM. Despite electronic monitoring allowing “prosecutors and law enforcement, with the click of a mouse, access to immense amounts of personal, otherwise private, information at any time of day without notice to the defendant,” courts often find that this alone is not a violation of the Fourth Amendment.
The Supreme Court has recognized the intrusiveness of location tracking, stating that it “provides an all-encompassing record of the holder’s whereabouts . . . and provides an intimate window into a person’s life, revealing not only his particular movements, but through them his familial, political, professional, religious, and sexual associations.” However, despite placing constraints on the use of electronic devices to surveil law-abiding citizens through cases such as Riley v. California, Carpenter v. United States, and United States v. Jones, the Supreme Court has not extended the same protections to those on parole or probation.
Courts have placed very few limits on electronic searches for justice-involved people. For example, in Grady v. North Carolina, the Supreme Court held that a search under the Fourth Amendment occurred when GPS monitoring was used to continuously monitor those convicted of certain sex offenses. The Court remanded the case for the trial court to determine whether the search was reasonable. Aside from Grady, U.S. state and federal courts have either turned to the “reasonableness” doctrine or “consent” doctrine to justify warrantless electronic searches via EM. The former provides that, although the Fourth Amendment ordinarily requires “probable cause” for a constitutional search, a lesser degree of cause is acceptable when the balance of governmental and private interests makes such a standard reasonable. The latter stems from the fact that probationers and parolees either explicitly consent to electronic search conditions as part of their plea deal or parole/probation program, or implicitly consent to traditional search conditions, which include electronic searches.
2. Disparate Impact on Minorities
Consistent with the racial disparities evidenced by mass incarceration, the limited data available shows that African Americans are disproportionately subject to EM. For example, a 2020 report showed that in San Francisco, although six percent of the population is African American, African Americans represent almost forty-four percent of the people on electronic monitors. Similarly, in Cook County, Illinois, “23% of the population is Black, but over 74% of those on electronic monitoring (and in jail) are Black.” A 2018 and 2019 study in Wayne County, Michigan, showed that African Americans were twice as likely to be put on monitors as white people. In Ohio in 2018, even though African Americans made up only twelve percent of the general population, approximately forty-two percent of those on monitors post-prison were African American.
The use of EM devices perpetuates the oppression of African Americans and embodies what Professor Michelle Alexander calls “the newest Jim Crow.” This phrase refers to the use of seemingly progressive new technologies that, in reality, reproduce existing inequities and sustain the cycle of oppression. Testimony from individuals subject to EM echoes this sentiment: J. Jhondi Harrell described EM as “virtual slavery,” and Ernest Shepard described his experience as similar to being a “chattel slave.” Rayshard Brooks, too, found it impossible to lead his life, find work, or provide for his family while subject to EM, which made him feel like “an animal.” The ethicality of the widespread use of EM is called into question, especially since the scientific community disagrees over EM’s efficacy at reducing recidivism.
3. Ineffective at Reducing Recidivism
Even though the use of EM devices is increasing and EM is becoming the preferred alternative to incarceration, little is known about these devices. It is difficult to know the exact number of people under surveillance by EM devices because no government-sponsored studies account for its prevalence or offer accountability for the millions of taxpayer dollars spent each year on its use. The most recent national study, carried out by The Pew Charitable Trusts, is already almost a decade old. With respect to smartphone apps, despite their potential reach into the private lives of justice-involved individuals, there is no known “external audit evaluating their monitoring mechanisms, their accuracy, or user impact.”
Moreover, although a driving force behind the proliferation of EM devices is the belief in its ability to reduce recidivism while simultaneously protecting the community, no conclusive data supports this narrative. Research on the success of EM in reducing recidivism rates is sparse and conflicting. For instance, in 2020, the University College London Department of Security and Crime Science, after analyzing seventeen studies designed to measure the effectiveness of EM, reported that “EM does not have a statistically significant effect on reducing re-offending.”
In contrast, a 2010 study funded by the NIJ and conducted by Florida State University Professor William Bales found that EM reduced recidivism rates by thirty-one percent for people on parole. The study’s results, however, suffered from a flawed methodology. In establishing a control group, Bales adjusted for an excessive 122 possible variables contributing to recidivism rates. Having this many variables makes it impossible to determine whether the decrease in recidivism reported in Bales’s study is the result of EM or of one of those 122 variables. Moreover, Bales did not include among these variables the factors believed to most frequently lead to recidivism, such as the inability to secure employment, housing insecurity, an inadequate support network, or an overly strict parole or probation officer. Finally, whether the results of the study can be extrapolated to the general population is questionable, since the study was based on data drawn solely from the Florida Department of Corrections records of 2007.
Another Florida-based study, conducted in 2017 by J.R. Regan using a post-incarceration population, found that the causal relationship between EM and recidivism was weak. This study also suffered from major flaws that render its results questionable at best. Regan based his study on data collected more than ten years earlier and, whereas Bales accounted for too many variables, Regan accounted for none. Nevertheless, this lack of evidence has not stopped suppliers from guaranteeing EM’s success or government representatives from endorsing its use.
4. Faulty Technology
The technology available today, whether GPS-equipped ankle monitors or smartphone EM applications, is not infallible. There have been multiple reports of poor connectivity and false positive alerts. An analysis of the alerts received by Wisconsin’s GPS monitoring program revealed that the monitoring center lost GPS signals 56,853 times with 895 offenders during May 2017. In 2021, researchers in Chicago found that eighty percent of alerts were non-actionable or “false positives” triggered by factors such as low-quality signals or GPS drift. Shannan Davis, a justice-involved individual in Michigan, recounted to ACLU members the time “all three lights on the monitor were going off and it did that for two to three hours” and “[she] couldn’t get a hold of anybody.” Such malfunctions disrupt daily life and are a source of fear, as a single malfunction can lead to rearrest.
The devices themselves are also unreliable. In an analysis of sixteen smartphone EM applications, users reported deficiencies with the apps such as inabilities to perform check-ins through the app or drainage of the phone’s battery power “causing the entire smartphone to crash or freeze.” Problems with facial recognition, voice recognition, or the location detection system, arguably the most important features of the app, were often to blame for inabilities in performing check-ins, violations of which could result in re-incarceration. One reviewer wrote in the Google Play Store: “The facial recognition needs to be refined since I didn’t have makeup on when I took the first pictures, however when I put on makeup, facial recognition becomes much harder, even in adequate lighting.”
Flawed facial recognition technology compounded by the fact that there are “significantly higher error rates on darker-skinned people” means that problems related to EM devices could disproportionately impact people of color. If the technology available today is regularly malfunctioning in a way that disproportionately targets people of color, how can we reasonably expect that the introduction of AI to predict crime will be accurate and fair?
III. Analysis
Private corporations are omnipresent in monitoring justice-involved people. Such involvement jeopardizes the goals of the justice system especially because of the inadequate ways that government agencies contract for these services. Government agencies have relinquished too much power over the ways in which private EM data is collected, stored, and used. As such, the government needs to take remedial measures and include in its EM contracts standardized requirements to protect the management of sensitive data.
A. A For-Profit Invention
Despite the lack of evidence suggesting that EM is effective, its growing popularity is driven by the efforts of private corporations that profit from its use. What started off in the 1980s with the privatization of prisons has now become a multi-billion-dollar industry that provides any and all services that might be useful to the prison industrial complex. These services range from food, education, and health care needs to “specially made niche items such as rubber pencils that can’t be used to stab someone.” Part of the success of for-profit companies in this space has been their ability to predict shifts in public sentiment about criminal justice and to diversify the services offered. In response to changing attitudes and concerns about mass incarceration, two of the largest private prison corporations, CoreCivic and GEO Group, acquired smaller entities whose focus is on alternatives to imprisonment. For instance, CoreCivic bought out the company Correctional Alternatives, which specializes in reentry programs. GEO Group acquired Behavioral Interventions, Inc. (BI), which focuses on providing EM services for those awaiting trial or on parole or probation. Private corporations have been able to maintain a strong foothold in the prison industry by efficiently predicting and responding to needs as they arise.
Private corporations are indispensable to the practice of punitive surveillance, as no surveillance programs are entirely publicly run. Thus, private companies are usually involved in every step of the monitoring process. While privatization can facilitate efficiency and cost savings, an innate conflict of interest exists between profit-driven corporations and certain public goals of correction, particularly treatment and rehabilitation. For-profit corporations have no incentive to stop crime or curb incarceration rates. The more people arrested and imprisoned, the more profit these corporations can make by monitoring individuals awaiting trial or released on parole or probation.
Moreover, despite purporting to stay out of the conversation surrounding sentencing and detainment, CoreCivic and GEO Group carry out both federal and local lobbying efforts. In 2018, GEO spent $4.3 million on “expenditures to consultant government relations professions in direct lobbying.” Between 2006 and 2014, both CoreCivic and GEO spent at least $500,000 every four years on federal elections. They also spent $1.6 million to hire “revolving-door lobbyists,” former Capitol Hill workers who could help them navigate the political field. Their efforts have certainly been profitable, as neither the Justice Is Not for Sale Act of 2021 nor the Private Prison Information Act of 2021, bills limiting for-profit corporations’ role in the carceral system, garnered any votes in committee. It remains to be seen what will happen to the Fourth Amendment Is Not For Sale Act of 2023, which recently passed the House and would curb the government’s ability to obtain private records from EM vendors, or to the Private Prison Information Act of 2023.
The pervasive involvement of private companies in the EM market has resulted in an increase in the number of people under surveillance who otherwise might not be. Indeed, even though EM is typically assigned to people considered “high risk,” these companies are increasingly marketing their devices as effective for people assessed as “low risk” as well. Moreover, although EM is advertised as an alternative to incarceration, “[t]here is no empirical evidence that monitoring is used [solely] as an alternative to incarceration.” Hence, some people involved in the justice system may be subject to monitoring even though there is no compelling need for it. As Chris Albin-Lackey, a senior legal adviser with Human Rights Watch, explains, “There are a lot of judges who reflexively put people on monitors, without making much of a pretense of seriously weighing [the need for monitoring] at all.”
B. No Contractual Safeguards for Privacy
1. Types of Contracts
The private corporations’ diversification campaigns have successfully allowed them to insert themselves into the market for EM. They are involved in providing the equipment, collecting data, storing that data, supervising the people on parole or probation being monitored, and even “overseeing and approving schedule changes.” A study conducted by Kate Weisburd, a professor of law at the George Washington University, analyzed 247 records from 101 agencies in forty-four states and the District of Columbia and explains the ways in which the government contracts with these private corporations.
From the records collected, four private corporations dominated the market: BI Inc., Attenti, Satellite Tracking of People LLC, and Sentinel Offender Services LLC. Indeed, “[o]f the 76 contracts in this study, 64% involved one of these four companies.” These contracts lasted between one and three years, often with an option to renew, and involved several million dollars. They also typically included a monetary cap that private corporations could not exceed. While a monetary cap may be in place, most contracts that Weisburd reviewed did not set a fixed number of devices or services, meaning that the more the agencies used, the more they paid. This structure, again, might incentivize private corporations to push agencies toward purchasing more devices and widening the pool of people needing them. Fewer than half of the jurisdictions analyzed used cooperative purchasing agreements to contract for EM. Cooperative purchasing agreements enable agencies to use the same contract as a base or template to which they can add addendums. Most agencies add few or no modifications, indicating that their priority is saving costs rather than safeguarding the rights of justice-involved people.
The types of services or goods contracted for depend on the type of contract and the jurisdiction. The study reveals that twenty-two of these states not only contracted with private corporations to obtain the monitoring devices but also contracted out the monitoring, collection, and maintenance of the data of those being monitored. The data collected is often shared through a web-based system that allows public agencies to view the GPS location of those being monitored as well as past notifications and alerts. Sometimes, the public agency learns that a violation has occurred only after the private corporation does. Even where the agency is responsible for operating the program, the private company’s role is usually not limited to simply providing the devices but also includes various services. For instance, even though the Indiana Department of Corrections supervises those on electronic monitors, it is available to receive calls only during regular business hours; the private company is responsible for serving as the point of contact for the remaining sixteen hours a day.
2. Lack of Regulation as to the Use and Maintenance of Data
The ways in which location data is shared, and with whom, depend on the jurisdiction and what the contract provides. While some contracts allow certain location data to be shared with the public, others restrict its accessibility. For instance, the contract between the Los Angeles County Probation Department and the contractor Satellite Tracking of People LLC mandates that the data be accessible only to people with written authorization from the County Program Manager. In contrast, the agreement between the Denver Adult Probation Department and BI, Inc. provides that all adult GPS records be open to the public. This access extends to open and closed cases alike, and no justification is needed to access the data.
Location data is sometimes used and shared for purposes other than community monitoring. Some agreements allow private companies to share the location data directly with law enforcement agencies for purposes of “crime scene correlation and other police investigations,” without notifying those being monitored. In North Carolina, for example, a justice-involved individual’s location data can be shared with law enforcement to see if it matches an area where a crime has been committed.
Moreover, the privacy policies of EM Android apps show that data is sometimes sold to third parties. An analysis of sixteen Android app policies reveals that, while five of them explicitly stated they would not sell the data collected, seven stated “that data will be used for marketing, sometimes for marketing the company’s own product and advertisements.”
Similarly, how the location data is stored and how long it can be retained are unclear. None of the records analyzed by Weisburd told justice-involved people what would happen to the data collected, and of the jurisdictions using cooperative purchasing agreements, most of the contracts analyzed contained no provisions safeguarding the use of the private data. The few that did either allowed the private companies to store the data for several years after the contract ended or required the data to be destroyed or returned to the agency. Moreover, because they are private corporations, these companies are not subject to public records laws, which makes it very difficult to determine how they operate. In fact, research already shows that the practices of these private corporations are unreasonably invasive because they collect data that is not necessary to carry out their contractual obligations.
3. Collection of Unnecessary Sensitive Data
A privacy-focused analysis of sixteen EM Android apps used by thousands of people in the United States revealed that these apps, designed principally to track people’s location, request access to data that is not necessary to fulfill that function. Typically, when a user downloads any app, they must accept permission requests before the app can access certain data. For example, the weather app or the Google Maps app on an iPhone will request access to your location data to tell you the weather where you are or how to get from point A to point B. Similarly, people under EM apps are required to accept most of the permissions the apps request. However, while each app is designed to serve the same function, some request more permissions than others, and some request “dangerous” permissions that “allow apps to access otherwise restricted data and take otherwise restricted actions.”
Most of the apps requested “ACCESS_FINE_LOCATION,” which enables access to someone’s precise location, “sometimes as accurate as within 10 feet.” Most also requested access to “CAMERA” and “RECORD_AUDIO,” which reflects the apps’ ability to verify the user with biometric face and voice authentication or to perform check-ins with the EM supervisor. However, a few apps requested dangerous permissions such as “READ_PRECISE_PHONE_STATE,” which provides detailed information about the state of the phone, or “READ_CONTACTS” and “READ_PHONE_STATE,” which, combined, reveal whom someone might be talking with and how frequently. Since other apps fulfill the same EM functions without requesting these dangerous permissions, such requests are evidently unnecessary; in addition to invading the user’s privacy, they could violate the privacy rights of those who interact with the user.
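The comparison the study performs can be pictured as a simple check: given the permissions an app requests, flag those that are both “dangerous” under Android’s classification and outside what an EM app plausibly needs. The sketch below is only an illustration of that logic; the permission names are real Android identifiers, but the “necessary” baseline and the sample app are hypothetical assumptions, not drawn from the study itself.

```python
# Sketch: flag permission requests that go beyond what an EM app
# plausibly needs. Permission names are real Android identifiers;
# the EM baseline and the sample app below are hypothetical.

# Permissions arguably needed for EM's core function (assumption).
NECESSARY_FOR_EM = {
    "ACCESS_FINE_LOCATION",  # precise GPS tracking
    "CAMERA",                # biometric face check-ins
    "RECORD_AUDIO",          # voice verification
}

# A subset of permissions Android classifies as "dangerous."
DANGEROUS = {
    "ACCESS_FINE_LOCATION", "CAMERA", "RECORD_AUDIO",
    "READ_CONTACTS", "READ_PHONE_STATE", "READ_PRECISE_PHONE_STATE",
}

def excess_permissions(requested):
    """Return dangerous permissions requested beyond the EM baseline."""
    return sorted((set(requested) & DANGEROUS) - NECESSARY_FOR_EM)

# A hypothetical app requesting more than monitoring requires:
app = ["ACCESS_FINE_LOCATION", "CAMERA", "READ_CONTACTS", "READ_PHONE_STATE"]
print(excess_permissions(app))  # ['READ_CONTACTS', 'READ_PHONE_STATE']
```

An app whose request list matched the baseline would produce an empty result, which is the study’s point: the extra requests are separable from the monitoring function.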
Not only do these apps collect sensitive data beyond what is reasonably necessary to fulfill their purpose, but nearly all of them also contain third-party libraries that can themselves access that sensitive data. Third-party libraries are pre-built pieces of code or software that app developers can reuse to make development easier and quicker. They present privacy problems because they can access the host app’s sensitive data. For instance, two of these apps used Facebook’s Analytics and Login third-party libraries, which enable Facebook to access the user’s public profile and email address if the user logs into the platform through the app. Moreover, whether third-party libraries comply with the privacy policies outlined by the host app is outside the control of the app developers. Finally, depending on their quality, third-party libraries can be vulnerable to hackers.
4. Collateral Surveillance
In addition to unnecessarily and unreasonably invading the privacy of justice-involved people, electronic monitoring via ankle monitors, and to an even greater extent via smartphone applications, can invade the privacy of bystanders. With the audio-recording features of ankle monitors, conversations between the user and someone nearby can be overheard. For example, an ankle monitor developed by the corporation Track Group can audio-record those wearing the device and, even though it supposedly alerts users when recording starts, it can be turned on without notifying anyone. These monitors can thus infringe on the privacy rights of others who are simply in physical proximity to a justice-involved individual. EM via smartphone casts an even wider net, as “[a]nyone who texts, calls, or interacts via phone with a person who has a monitoring app installed is collaterally surveilled [and] anything they say could be stored in a database and used later for police investigation.” Video check-ins likewise risk capturing anyone in the background who enters the camera’s field of view. Justice-involved individuals understandably have a reduced expectation of privacy, but the same is not true of their friends, family, or mere bystanders.
IV. Looking at Europe to Find Protections for Our Data
Generally, data privacy rights for U.S. citizens are governed by statutory law. These privacy statutes are sectoral rather than universal, meaning that they only protect certain types of data, such as medical and financial data, or protect against unfair and/or deceptive trade practices that result in unreasonable breaches of consumer data. In contrast, the European Union (EU) has robust privacy laws that protect citizens, who have an “absolute right” to privacy. The EU utilizes an “omnibus regulatory scheme,” meaning all data is treated the same and afforded some level of protection. In light of the issues highlighted above, this section proposes that, in contracting for EM services with the private sector, governmental agencies should insert a Data Processing Agreement that reflects the privacy protections of the General Data Protection Regulation (GDPR).
A. The General Data Protection Regulation
In 2016, the EU Parliament finalized the GDPR, which went into effect on May 25, 2018. The GDPR regulates the ways in which personal data is processed, stored, and moved, as well as how responsibility for that data is divided between a processor and a controller. Personal data is defined as information that can be used to identify a person and includes data relating to a person’s name, location, biometric identifiers, and more. The GDPR distinguishes between two types of entities in possession of data: data processors and data controllers. An entity can be a controller and a processor simultaneously. Since specific provisions govern the duties and responsibilities of each, it is important to determine which type of entity an organization is.
When a contract is formed between a controller and a processor, the controller must ensure that the contract includes a list of provisions obligating the processor to do the following: (1) process only the data specified by the controller, (2) ensure that the data remains confidential, (3) adopt security measures, (4) assist the controller in meeting its GDPR obligations, (5) delete or return data promptly upon request, and (6) preserve information demonstrating compliance.
The GDPR also imposes on controllers what it terms “privacy by design” and “privacy by default.” Privacy by design requires an organization to account for privacy when it develops a new product. As such, privacy considerations can no longer be an afterthought and cannot be sacrificed for the benefit of something else. Privacy by default requires controllers to implement measures ensuring that only the minimum amount of personal data necessary for a specific purpose is collected. It also limits how long such personal data can be kept.
The processor, among other requirements, has notification obligations and must implement appropriate security measures. In case of a breach, the processor must notify the controller without undue delay or risk severe penalties. A processor is also required to implement security measures, including “pseudonymisation and encryption of personal data,” to ensure that personal data remains confidential.
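As a rough illustration of what pseudonymisation means in practice, the sketch below replaces a direct identifier in a location record with a keyed token, so records can still be processed and linked without exposing the person’s name. This is a minimal hypothetical sketch, not the GDPR’s or any vendor’s actual implementation: the field names, key, and record format are invented, and a real deployment would use managed key storage and separate encryption of the data at rest.

```python
import hashlib
import hmac

# Hypothetical key held only by the data controller; without it,
# tokens cannot be linked back to a named person.
SECRET_KEY = b"hypothetical-key-held-by-the-controller"

def pseudonymize(record):
    """Replace the direct identifier with a keyed, one-way token.

    The same person always maps to the same token, so records stay
    linkable for processing, but the name itself is never stored."""
    token = hmac.new(SECRET_KEY, record["name"].encode(), hashlib.sha256).hexdigest()
    return {
        "subject": token[:16],          # pseudonym instead of a name
        "gps": record["gps"],           # monitoring data kept as-is
        "timestamp": record["timestamp"],
    }

rec = {"name": "Jane Doe", "gps": (38.9, -77.0), "timestamp": "2024-01-01T12:00"}
out = pseudonymize(rec)
assert "name" not in out  # the direct identifier is gone
```

The design point is that pseudonymised data remains useful to the processor while re-identification requires something (here, the key) that only the controller holds.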
B. How to Apply the General Data Protection Regulation to the Use of EM in the United States Justice System
To safeguard the privacy of those being electronically monitored in the United States, and to prevent potential abuses by for-profit corporations, the government (at the local, state, and/or federal levels) should adopt the GDPR provisions in public contracts for EM services. In adopting these provisions, the parties should make clear who is the processor and who is the controller. In the context of electronic monitoring, a private corporation should be defined as a controller when it provides the EM devices but labeled a processor when it serves as a monitoring agency that collects the data. The inverse would apply to the government agency.
As such, when the private corporation serves as a controller, the privacy-by-design and privacy-by-default provisions will force it to rethink how it builds its devices so that they collect only information intimately related to monitoring. As a processor, meaning when it actively collects data, it will be contractually obligated to abide by the regulations prescribed by the GDPR. As a result, it will not be able to share private data for profit, nor will it be able to widen the net of individuals under its surveillance. The government, in turn, as a controller (meaning when it purchases the devices) will need to draft and update its contracts, ideally incentivizing it to stop using cooperative purchasing agreements.
Such regulations as applied to EM are neither out of reach nor unreasonable. First, many organizations within the U.S. are already subject to the requirements of the GDPR if they “(i) process . . . the personal data of individuals located in the EU; (ii) offer goods or services to individuals located in the EU; or (iii) monitor behavior of individuals located in the EU.” Second, templates of Data Processing Agreements are available online and can easily be added to a contract. Finally, privacy laws modeled after the GDPR have already been established in the United States: after the GDPR was passed, California enacted the California Consumer Privacy Act, which borrows many of its features from the GDPR.
V. Conclusion
Although EM offers some benefits, the way it is currently operated in the United States unreasonably infringes on the privacy interests of those being monitored. The government (federal or local) plays a very limited role in its implementation; instead, it has relinquished its powers to the private corporations that administer EM. This is concerning because, as businesses, these corporations’ interests may not align with the rehabilitative interests of the justice system. After contracting out these services, the government essentially becomes a passive observer, exercising limited oversight over the ways in which private corporations develop their devices, collect and store data, and monitor users. Technological advances have greatly increased the amount of sensitive data that is accessible, and, without proper safeguards, data that is not necessary for the purposes of monitoring can be collected. Initiatives to use AI in the context of EM to predict criminal behavior should serve as a call for urgent intervention to safeguard the privacy of those being monitored, because such initiatives could perpetuate discriminatory patterns and disproportionately harm African Americans. To prevent the plot of The Minority Report from becoming reality, the United States should adopt the EU’s GDPR provisions. Doing so would transform the EM landscape into a cooperative one between the government and private corporations and would ensure that those being monitored cannot have their data used against them.