HR has embraced artificial intelligence (AI) across most major functions, according to Eightfold AI’s annual Talent Survey. For example, 73% of HR leaders reported that they used AI “to some extent” during the recruiting and hiring process.
But AI, like most new tools, also comes with a learning curve, as a new lawsuit shows.
Applicant files AI discrimination lawsuit
The complaint against Workday alleges the platform’s AI systems and screening tools discriminate against applicants based on race, age and disability. (As an FYI, Workday is a popular HCM platform used by many companies.)
Plaintiff Derek Mobley is a Black man who is over the age of 40 and suffers from anxiety and depression.
According to his complaint, Mobley has “applied for at least 80-100 positions” since 2018 at companies that used Workday as a screening tool during the hiring process. He claims he was “denied employment each and every time.”
He further alleges he was not hired due to “systemic discrimination” in the screening process and that Workday’s AI has a disparate impact on applicants who are Black, over the age of 40 and/or disabled.
The suit alleges Workday’s AI components discriminate against applicants based on:
- Race in violation of Title VII
- Age in violation of the Age Discrimination in Employment Act, and
- Disability in violation of the ADA Amendments Act of 2008.
The complaint seeks class-action status, proposing to represent applicants in each of the protected classes (i.e., applicants who are Black, over the age of 40 and/or have a disability) who applied for jobs at companies using Workday as part of the hiring process but were not hired, from June 3, 2019 to the present.
The suit also seeks back pay, front pay, damages and attorneys’ fees.
Mobley v. Workday, Inc., No. 23-cv-00770 (N.D. Cal. filed 2/21/23).
AI in the workplace
The use of AI in the workplace isn’t an inherently bad thing. In addition to using AI in hiring and recruiting, companies can also:
- use AI data to measure productivity, increase efficiency and streamline decision-making, and
- point to AI tools to help defend business decisions in court.
Learning curve: 3 lessons from EEOC hearing
And even though more companies are jumping on the AI bandwagon and recognizing its potential, we’re still learning how to use AI effectively.
For example, earlier this year, the EEOC held a public hearing focusing on the benefits and risks of using AI in employment decisions. Experts discussed ways to reduce AI-based discrimination, such as:
- removing questions that ask for data about protected classes
- conducting regular audits on algorithms, and
- allowing candidates to opt out of AI-based screening methods.
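One common way to run the kind of algorithm audit the experts described is the EEOC’s “four-fifths rule” from the Uniform Guidelines on Employee Selection Procedures: if any group’s selection rate is less than 80% of the highest group’s rate, the tool may be having an adverse impact. Below is a minimal sketch of that check in Python; the group names and numbers are hypothetical, for illustration only.

```python
# Sketch of a disparate-impact audit using the EEOC's "four-fifths rule":
# a group's selection rate below 80% of the highest group's rate may
# indicate adverse impact. All data here is hypothetical.

def selection_rates(outcomes):
    """outcomes maps group name -> (candidates advanced, candidates screened)."""
    return {group: sel / total for group, (sel, total) in outcomes.items()}

def four_fifths_check(outcomes, threshold=0.8):
    """Flag groups whose selection rate falls below threshold * highest rate."""
    rates = selection_rates(outcomes)
    top = max(rates.values())
    return {group: rate / top < threshold for group, rate in rates.items()}

# Hypothetical screening outcomes from an AI resume screener
outcomes = {
    "group_a": (60, 100),  # 60% advance rate
    "group_b": (30, 100),  # 30% advance rate -> ratio 0.5, flagged
}
print(four_fifths_check(outcomes))  # {'group_a': False, 'group_b': True}
```

In practice, an audit like this would be run regularly on real screening outcomes, broken out by the protected classes at issue, and any flagged group would prompt a closer review of the tool.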
And last spring, the EEOC and the DOJ released guidance to help employers use AI tools in ways that comply with the ADA.
As an FYI, the EEOC enforces disability discrimination laws for private-sector and federal employers, while the DOJ enforces those laws with respect to state and local government employees. Despite the distinction between the agencies, both guidance documents contain solid info.
For example, the EEOC guidance recommends that employers have a process in place to provide reasonable accommodations when using AI-based decision-making tools and warns that without such protections, workers with disabilities may be “screened out” of consideration for a job or promotion that they could perform, either with or without reasonable accommodation.
And the DOJ guidance provides a reminder that AI screening technology must “evaluate job skills, not disabilities.” It also warns that employers are not permitted to seek medical or disability-related information or conduct medical exams through the use of AI hiring tools.