Advancements in technology have enabled companies to analyze data more efficiently to identify and assess options and make decisions. In the context of recruitment and hiring, companies are using artificial intelligence (AI) and machine learning technologies to improve applicant screening and selection outcomes. This includes the use of automated tools such as resume scanners, “chatbots,” and video interviewing software to review applicant resumes, measure skills and abilities, determine qualifications, or otherwise assess an applicant’s suitability for a position.
Legislative Initiatives
With these technological developments have come concerns that AI employment tools will result in discriminatory treatment of applicants. Some state and local governments have responded by enacting legislation to regulate how AI is used in making employment decisions. For example, the Illinois Artificial Intelligence Video Interview Act requires employers relying on an AI analysis of video interviews to report applicant demographic data. New York City’s newly enacted AI law makes it unlawful for an employer to use an AI tool to screen applicants unless a bias audit has been conducted to determine whether the tool has a disparate impact on individuals based on their protected status.
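To illustrate the kind of arithmetic a bias audit of this sort typically involves, the sketch below compares selection rates across demographic groups and computes an impact ratio (each group’s selection rate relative to the highest-selected group). The group names, applicant counts, and the four-fifths threshold are hypothetical and are offered only as an illustration of the general concept, not as the methodology prescribed by the New York City law.

```python
# Hypothetical illustration of a simplified disparate-impact check.
# The data, group labels, and 0.8 threshold are assumptions for the example.

def selection_rates(outcomes: dict[str, tuple[int, int]]) -> dict[str, float]:
    """outcomes maps group -> (applicants selected, total applicants screened)."""
    return {group: selected / total for group, (selected, total) in outcomes.items()}

def impact_ratios(rates: dict[str, float]) -> dict[str, float]:
    """Each group's selection rate divided by the highest group's rate."""
    top = max(rates.values())
    return {group: rate / top for group, rate in rates.items()}

if __name__ == "__main__":
    # Illustrative applicant data: group -> (selected by the screening tool, total screened)
    outcomes = {"group_a": (48, 120), "group_b": (30, 110)}
    rates = selection_rates(outcomes)
    for group, ratio in impact_ratios(rates).items():
        # The "four-fifths rule" (0.8) is a common rule of thumb for flagging
        # possible adverse impact; it is used here only as an example threshold.
        flag = "review" if ratio < 0.8 else "ok"
        print(f"{group}: selection rate {rates[group]:.2f}, impact ratio {ratio:.2f} ({flag})")
```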
Federal Guidelines
The National Artificial Intelligence Initiative Act, which was enacted by Congress in 2020, created an advisory committee to provide recommendations on AI security, privacy, and civil-rights standards. In October 2021, the Equal Employment Opportunity Commission (EEOC) announced an initiative to ensure that AI and other emerging applicant-screening technologies conform to federal civil-rights laws. This includes developing guidelines and best practices on the use of AI in making employment decisions.
The EEOC and U.S. Department of Justice, Civil Rights Division (DOJ) each issued guidance in May 2022 on the use of AI and related technologies to assess job candidates. The agencies explain that an AI hiring tool that screens out an individual with a disability who can perform the essential functions of a job with a reasonable accommodation may violate the Americans with Disabilities Act (ADA). The EEOC and DOJ therefore advise employers using AI tools to provide a reasonable accommodation to applicants who have a physical or mental condition that may make it more difficult to take a test or that may result in a less favorable assessment. The agencies offer examples of practices that employers can implement to ensure that applicants receive needed accommodations:
- explaining the type of AI technology being used and how applicants will be evaluated
- notifying applicants that reasonable accommodations (including alternative testing modalities) are available to individuals with disabilities
- providing sufficient information to applicants so they can decide whether to seek an accommodation and implementing procedures for requesting an accommodation