The EEOC and iTutorGroup, Inc. have reached an agreement to resolve the first AI discrimination lawsuit.
The EEOC’s complaint alleged the company’s hiring software automatically rejected older applicants in violation of the Age Discrimination in Employment Act (ADEA). Specifically, the lawsuit claimed the software rejected “female applicants age[d] 55 or older and male applicants age[d] 60 or older,” effectively screening out more than 200 applicants.
The problem was discovered when one applicant submitted two applications that were identical with one exception, the suit alleged. The first application listed the woman’s real date of birth. On the second, she entered a more recent date of birth. Only after she submitted the second application, showing a younger age, was she contacted for an interview.
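The rule the EEOC alleged can be illustrated with a toy sketch. This is a hypothetical reconstruction based only on the complaint’s description (women 55 and older, men 60 and older rejected); the actual software and its field names are not public:

```python
from datetime import date

def alleged_age_screen(birth_date: date, gender: str, today: date) -> bool:
    """Illustrates the rule alleged in the complaint: reject female
    applicants aged 55+ and male applicants aged 60+.
    Hypothetical sketch, not the actual software."""
    # Compute age, adjusting if this year's birthday hasn't occurred yet
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    cutoff = 55 if gender == "female" else 60
    return age >= cutoff  # True means the applicant is screened out

# The applicant's own test: identical applications, different birth dates
today = date(2022, 4, 1)
print(alleged_age_screen(date(1962, 1, 15), "female", today))  # True: rejected
print(alleged_age_screen(date(1975, 1, 15), "female", today))  # False: interview
```

Submitting two otherwise identical applications, as the applicant did, is effectively a controlled test of exactly this kind of rule: the only input that changed was the date of birth.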
The suit sought back pay and damages for more than 200 applicants who were “denied jobs because of their age.”
The company agreed to pay $365,000 to be distributed to a group of applicants rejected on the basis of age. It also agreed to comprehensive injunctive relief. Among other things, the company:
- Is enjoined from screening applicants based on age
- Is enjoined from requesting dates of birth before a job offer is made
- Must provide four hour-long training sessions conducted by EEOC-approved third parties to all supervisory and management level employees focusing on the ADEA, Title VII and other federal EEO laws
- Must post a notice about employees’ rights
- Must review and revise anti-discrimination policies
- Must incorporate the updated policies into the employee handbook
- Must implement a complaint process for employees and applicants, and
- Must submit to EEOC monitoring for the duration of the agreement.
Why is this settlement such a big deal?
1. Trailblazing settlement
At the risk of sounding obvious: This is the first-ever AI discrimination settlement. Clearly, AI has been on the EEOC’s radar.
As you may recall, the agency launched its Artificial Intelligence and Algorithmic Fairness Initiative in January. And in May, it issued a new resource that outlines important considerations when incorporating AI tools into employment decisions, which we covered as soon as it was released.
This settlement shows the agency is prioritizing AI-related discrimination and is committed to directing its enforcement resources to AI compliance.
“This case is an example of why the EEOC recently launched an Artificial Intelligence and Algorithmic Fairness Initiative,” the EEOC said in a press release announcing the settlement. “Workers facing discrimination from an employer’s use of technology can count on the EEOC to seek remedies.”
2. Steep AI learning curve
As new and improved tech emerges, many companies have recognized AI’s potential and jumped on the bandwagon, especially since productivity paranoia makes workers feel pressured to do more with less.
But there’s a learning curve for using AI effectively and in compliance with employment laws, as this case shows. And it’s not the only one. In a similar case, Workday, a popular human capital management (HCM) platform, is facing a lawsuit alleging its AI screening system discriminates against Black applicants.
AI discrimination settlement: 3 takeaways specifically for HR
First things first: This is not about the general use of AI at work. Of course, you need to consider an AI policy. (Click here for a sample policy.)
Here, we’re focusing on key lessons specifically for HR pros who use (or plan to use) AI at work:
- Employers and HR pros must talk AI. Open communication is essential. The use of AI at work is exploding, but it’s still new technology, and employees are navigating the learning curve. The truth is, HR has questions about AI even as the department implements AI tools to streamline and automate specific tasks. Employees, especially those in HR, must be trained on the legal risks of inappropriate AI use.
- Conduct audits. When navigating the learning curve, it’s helpful to look at relevant legislation. In the case of AI, New York City passed the first law regulating the use of AI at work, and that law requires companies to arrange bias audits conducted by an independent third party. Even if you aren’t in NYC, the legislation provides insight into what’s considered a “best practice” in an evolving area of law. Want even more help? Check out the city’s final rule, which provides additional guidance for compliance with the law.
- Stay tuned to EEOC for more info. This area of employment law is in flux – and it likely will be for a while as we navigate this new technology. On the federal level, the EEOC is the best source to stay up to date on evolving AI guidance. And even if you find yourself stumbling a bit, you can show good-faith efforts by being aware of and trying to implement EEOC guidance.
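The core calculation behind a bias audit is simple enough to sketch: compare each group’s selection rate against the most-selected group’s rate (the “impact ratio” used in NYC’s final rule). The groups and counts below are invented for illustration, and age bands are used to fit this case; an actual audit under the NYC rule has its own required categories and procedures:

```python
def impact_ratios(selected: dict, applied: dict) -> dict:
    """Selection rate per group divided by the highest group's rate.
    A ratio well below 1.0 for a group is a red flag worth investigating.
    Sample numbers are hypothetical."""
    rates = {g: selected[g] / applied[g] for g in applied}
    best = max(rates.values())
    return {g: round(r / best, 2) for g, r in rates.items()}

# Hypothetical applicant pools by age band
applied = {"under_40": 200, "40_plus": 150}
selected = {"under_40": 60, "40_plus": 15}
print(impact_ratios(selected, applied))  # {'under_40': 1.0, '40_plus': 0.33}
```

Here the 40-plus group is selected at a third of the younger group’s rate, which is the kind of disparity an independent audit is designed to surface before a regulator or plaintiff does.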