Who Took the Survey?
More than 66 percent of respondents work in a small or large city, 11 percent work in multiple geographic settings, and 7 percent work in remote rural areas. In terms of clients served (respondents could select multiple options), 30 percent serve low-income individuals, 17 percent serve lower-middle-income individuals, 10 percent serve enterprises, and more than 96 percent serve individuals from marginalized backgrounds (racial or ethnic minority, LGBTQ+, disability, immigrant or refugee, or other). More than 25 percent of respondents work in consumer law, housing, or public benefits, while others work in criminal justice, immigration, procurement, education, or employment.
What the Survey Takers Said
Approximately one-third of survey respondents indicated they are familiar with each of the following: automated surveillance, fraud detection, generative AI, and risk scoring. Yet more than 70 percent said they are at least slightly uncomfortable explaining how any of those systems work. Nearly half indicated that they did not know when automated systems were even being used, although nearly 60 percent agreed that they knew how the systems affected their clients. More than half indicated that they had not taken training or CLEs on automated systems and did not know where to find such training. More than half questioned whether automated systems were accurate, and a similar percentage felt the systems were biased. As for their own work, more than 60 percent said they had not changed the way they work because of automated tools available to lawyers.
What the Results Mean
Although our 180+ survey respondents may not be a fully representative sample of the ABA or the profession more broadly, a few key points stand out to us. The ABA should offer training and CLEs on automated systems; it should advocate for greater transparency and accuracy, and less bias, in those systems; and it should consider what it can do to ensure that marginalized communities, such as those without reliable broadband access, are neither victimized by these systems without ever having a voice in their development or use nor overlooked while more affluent communities celebrate advances in technology whose benefits are not shared by everyone subject to the law. CRSJ will be working to advance these goals within the ABA, with support from the ABA president's newly minted Taskforce on AI (whose focuses include risk mitigation and access to justice) and other groups.