Like most new tools, artificial intelligence (AI) comes with a learning curve.
And for HR pros learning to use AI, time is of the essence — because rookie mistakes can lead to expensive payouts. For example, in what’s believed to be the first settlement over alleged AI hiring bias, a company recently paid $365,000 to a group of applicants who claimed they weren’t hired due to age discrimination.
Now another lawsuit is putting AI hiring bias in the spotlight. The defendant – Workday, a widely used HCM platform – won the early round, but it’s not quite out of the woods yet. Here’s what HR needs to know.
Applicant claims AI hiring bias
Derek Mobley is a Black man who is over the age of 40 and suffers from anxiety and depression.
Mobley said he has “applied for at least 80-100 positions” since 2018 at companies that used Workday as a screening tool during the hiring process. He claimed he was “denied employment each and every time.”
He filed a lawsuit alleging Workday provided companies with algorithm-based applicant screening tools that discriminated against him and other similarly situated job applicants based on race, age and disability in violation of Title VII, the ADEA and the ADA, respectively.
Workday filed a motion to dismiss.
Can Workday be held liable?
The first hurdle Mobley had to clear was showing that Workday could be held liable as an employer.
At a hearing on Workday’s motion to dismiss, Mobley asserted three theories on which Workday could be held liable: 1) as an employment agency, 2) as an indirect employer, and/or 3) as an agent of the companies that use Workday’s screening tools.
However, Mobley only alleged the “employment agency” theory in his complaint, the court noted.
Was Workday an ‘employment agency’ here?
As to the “employment agency” theory, the court said the allegations were insufficient to state a valid claim. Specifically, Mobley failed to allege details about the application process – other than the fact that he applied through Workday and was unsuccessful in landing a role. The complaint also did not allege that Workday helped to recruit or select applicants, the court added.
Addressing the AI angle, the court noted that Mobley alleged Workday offered “discriminatory applicant screening tools to companies seeking to hire employees,” but he failed to allege whether Workday’s screening tools used AI algorithms “solely based on employer criteria, Workday’s own algorithm(s), or some combination of the two.”
As such, Mobley failed to allege Workday was “significantly involved enough in procuring employees to face liability as an ‘employment agency’,” the court explained.
Mobley then argued that he could amend the complaint to cure these deficiencies.
The court granted Workday’s motion to dismiss but gave Mobley leave to amend.
What about those tacked-on theories?
Turning to the other two theories Mobley raised at the hearing, the court said the complaint didn’t contain sufficient allegations to allow the claims to proceed.
As to the “indirect employer” theory, the court explained what was missing to state a valid claim: The complaint “contains few factual allegations about Workday’s control over, participation in, or interference with the job application process.”
Regarding the “agency” theory, the court said the complaint fails to assert the companies delegated “control to Workday over the hiring process, as required for an agency theory of liability.”
It granted Mobley leave to amend these as well.
If Mobley successfully amends his complaint to show Workday can be held liable under one of these theories, he must still allege facts to support his discrimination and disparate impact claims.
What’s needed to prove his claims
The court outlined some of the things that were missing from Mobley’s complaint. As to the discrimination claim, the court said Mobley did not:
- allege that he disclosed any of his protected traits to Workday or that Workday knew about – or suspected – his protected traits, and
- plausibly allege he was qualified for the 80 to 100 jobs he unsuccessfully applied for: The complaint doesn’t include any info about the types of jobs, their requirements, or his qualifications for them, outside of his education.
Without this info, it would be impossible to conclude whether Mobley was not hired due to his protected characteristics or some valid business reason, the court said.
Regarding the disparate impact claim, the court said Mobley’s complaint was missing two of three elements needed to state a valid claim.
First, plaintiffs alleging a disparate impact claim must show “a significant disparity with respect to employment for the protected group.” Here, Mobley relies on his own personal experience, and the allegations “are quite sparse,” the court said.
He alleged that he applied to 80 to 100 jobs, but he didn’t allege what kind of jobs they were, what his qualifications were and whether he was more successful applying to similar jobs that did not use Workday, the court pointed out.
Second, plaintiffs asserting a disparate impact claim must also show “a causal relationship between the identified practice and the disparity.” Here, Mobley did not “cite any data, scientific literature, or personal experiences to support the theory that [Workday’s screening tools] had a discriminatory impact on a specific protected group,” the court said.
Thus, it granted Workday’s motion to dismiss the claim. It granted Mobley leave to amend by Feb. 20. If he fails to file an amended complaint by that date, the clerk will close the case. We’ll keep you posted.
Early case offers lessons for HR
As companies continue to recognize AI’s potential and jump on the AI bandwagon, it will fall to HR to apply AI tools correctly when making employment decisions, like which applicants to hire.
Interestingly, in this case, the court wanted to:
- know whether the AI had been given info about the candidate’s protected classes
- get clarification on where the AI’s algorithm came from (the individual employers or Workday), and
- see research or data that supported the allegations that the AI discriminated against a protected class.
In the months, and probably years, to come, we will undoubtedly see more legislation addressing AI hiring bias – and other workplace concerns. In the meantime, federal agencies like the EEOC and the DOJ have provided guidance on AI in the workplace.
Among other things, the feds recommend:
- removing questions that ask for data about protected classes
- conducting regular audits on algorithms
- allowing candidates to opt out of AI-based screening methods, and
- ensuring that AI tools only measure abilities necessary for the job.
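For the “regular audits” recommendation, one common starting point is the EEOC’s longstanding four-fifths rule for adverse impact: compare each group’s selection rate to the highest group’s rate, and flag ratios below 0.8 for review. Here’s a minimal Python sketch of that check – the function names and numbers are hypothetical, and a real audit would involve counsel and proper statistical testing:

```python
def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants in a group who passed the screening."""
    return selected / applicants

def adverse_impact_ratio(group_rate: float, highest_rate: float) -> float:
    """Ratio of a group's selection rate to the highest group's rate.
    Under the four-fifths rule, a ratio below 0.8 suggests adverse impact."""
    return group_rate / highest_rate

# Hypothetical screening outcomes for two applicant groups
group_a = selection_rate(selected=50, applicants=100)  # 0.50
group_b = selection_rate(selected=30, applicants=100)  # 0.30

ratio = adverse_impact_ratio(group_b, group_a)  # 0.60
flagged = ratio < 0.8  # True – this result would warrant a closer look
print(f"impact ratio = {ratio:.2f}, flagged for review = {flagged}")
```

A ratio below 0.8 isn’t automatically illegal – and a ratio above it isn’t a safe harbor – but running this kind of check on an AI screening tool’s outputs is exactly the sort of data the court said was missing from Mobley’s complaint.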
Mobley v. Workday Inc., No. 23-cv-00770-RFL, 2024 U.S. Dist. LEXIS 11573 (N.D. Cal. 1/19/24).