Are you using artificial intelligence (AI) tools to make hiring and other employment decisions? Better tread carefully – or run the risk of violating federal anti-discrimination law.
Two federal agencies have issued important new guidance explaining how using AI and other software tools to make employment decisions can violate the Americans with Disabilities Act (ADA).
The Equal Employment Opportunity Commission (EEOC) and the Department of Justice (DOJ) separately released new technical assistance documents that explain how the use of these tools can result in unlawful discrimination against people with disabilities.
There are several ADA-related dangers lurking for employers using AI tools, the EEOC advises.
First, they may be on the hook if they do not provide reasonable accommodations when using AI tools.
Second, AI tools may improperly screen out applicants or employees with disabilities.
Third, AI tools that seek information about disabilities or medical conditions may violate the ADA rules regarding medical exams and inquiries.
The EEOC guidance, which is the more detailed and expansive of the two new documents, presents a series of questions and answers that are supplemented by practical tips for employers.
As to accommodation, it offers several examples that may be appropriate. These include allowing a test taker whose disability limits manual dexterity to input answers by a means other than a keyboard. Other possible reasonable accommodations: extended time to take the test or an alternative version of it.
The duty to accommodate may still apply even if the test is administered by another entity, the EEOC adds. An employer that contracts with a software vendor to administer a pre-employment test on its behalf likely would be responsible if the vendor does not comply with all applicable requirements relating to the ADA, the EEOC says. Vendors should make the user interface accessible to as many individuals as possible, it advises.
As an example of an AI tool improperly screening out an individual with a disability, the EEOC guidance says video interviewing software that analyzes speech patterns may unfairly screen out an applicant with a speech impediment.
The ADA bans medical exams and inquiries before a conditional job offer is made, the guidance explains. Thus, AI questions that are likely to elicit information about a disability (and are asked before a job offer is extended) can violate the statute.
Not all AI tools that seek health-related information are prohibited by the ADA, the guidance says. For example, it says, a personality test may ask whether the individual taking the test would be described by friends as optimistic; such a question is not likely to elicit information about a disability. The guidance notes that once employment has begun, medical exams and inquiries may be required of an employee if the ADA legally justifies them.
To help employers stay compliant, the EEOC guidance offers some specific tips:
- Train staff to recognize and process accommodation requests relating to testing.
- Use AI tools that have been designed to be accessible to individuals with disabilities.
- Make sure AI tools measure only abilities that are truly needed to do the job.
- Before an AI tool is used, provide all applicants and employees who will undergo the assessment with as much information about the tool as possible.
The DOJ guidance specifically focuses on the use of AI tools in the hiring process. It similarly warns against improperly screening out applicants with disabilities, and it echoes the EEOC’s points regarding the provision of accommodations.
As the use of AI tools grows, so does the importance for employers to understand how to utilize them without running afoul of applicable ADA requirements. This new guidance is required reading for any employer using AI tools to make employment decisions.