Many credit recent developments in generative artificial intelligence (AI) tools like ChatGPT with the explosion of interest in AI. From a civil rights and economic justice perspective, however, there has been considerable interest for years in how automated systems affect people’s lives, whether in tenant screening, criminal justice, surveillance, or other areas. In 2023, the Civil Rights and Social Justice (CRSJ) Section of the American Bar Association set out to better understand the impact of AI on low-income and other marginalized individuals and communities. After compiling a review of the literature, we developed and launched a survey, distributed to attorneys in various practice settings, which received nearly 200 responses. Here are a few high-level takeaways from the results that may inform how the ABA might better support its members and the clients they serve.
In short: many attorneys are familiar with the impact of automated systems on their clients but are not familiar with the details of how those systems work and are uncomfortable explaining them. In the employment context especially, many are not familiar with federal or state laws governing the use of automated systems, and the majority of respondents were not comfortable advising clients on the legal implications. Respondents noted that these systems are not accessible to the public and are biased, or at a minimum rely on biased datasets. They additionally noted a lack of access to the technology required to benefit from automated systems (e.g., reliable broadband access), and observed that automated denials often come without any transparency as to why an applicant was rejected and without any opportunity to provide context for the derogatory data point that caused that result.