A class action lawsuit against Pleasanton-based Workday is set to move forward in federal court after a judge denied the company’s motion to dismiss the complaint, which alleges bias and discrimination in its hiring software.
During a case management conference on July 12, U.S. District Judge Rita Lin granted in part and denied in part Workday’s motion to dismiss the case, siding with the cloud-based enterprise software company on some claims but ultimately determining that Workday could be liable for discrimination in hiring induced by its platform.
“We’re pleased that the majority of claims in this case were dismissed, and we’re confident that the remaining allegations will be easily refuted as we move to the next phase where we’ll have an opportunity to directly challenge their accuracy,” a Workday spokesperson said in a statement.
Lin granted Workday’s motion to dismiss the allegations of intentional discrimination and agreed with its attorneys’ argument that the company cannot rightfully be classified as an employment agency subject to federal laws on fair employment practices. But she concluded that Workday could still be held liable for alleged discrimination arising from its customers’ use of its hiring software, given that the screening it performs might otherwise have been done manually.
In the initial complaint, filed on Feb. 21, 2023, in the U.S. District Court for the Northern District of California, attorneys for lead plaintiff Derek Mobley allege that Workday’s AI-driven hiring software replicates patterns of human discrimination and allows employers to discriminate against job candidates via its platform.
The complaint argues that Mobley, described in court documents as a Black man over 40 years old diagnosed with anxiety and depression, was subject to racism, ageism and ableism across more than 100 job applications he submitted to employers using Workday’s hiring software. On behalf of Mobley and “all others similarly situated,” the lawsuit seeks monetary damages as well as restitution in the form of modifications to Workday’s software, with a halt on its use in the meantime.
The subclasses Mobley’s attorneys seek to represent include all Black applicants or former applicants who were not hired for jobs that used Workday’s hiring platform from June 3, 2019 to the present, as well as all applicants using the platform who are more than 40 years old and all applicants with disabilities.
“Mr. Mobley in the case at bar challenges systemic discrimination by, and seeks classwide relief against, Workday for its administration and dissemination of discriminatory screening products as part of its employment policies and procedures which constitute a pattern and practice of discrimination on the basis of race, age, and disability with respect to selections,” Mobley’s attorneys wrote.
“These policies and procedures have been continuously utilized by the Defendant since at least 2018, and their implementation and use has personally harmed the named Plaintiff, and the putative class members he seeks to represent,” they continued. “Moreover, the selection tools marketed by Workday to its customers allows these customers to manipulate and configure them in a discriminatory manner to recruit, hire, and onboard employees. Workday’s products process and interpret an applicant’s qualifications and recommend whether the applicant should be accepted or rejected.”
In their initial response to the complaint and an earlier motion to dismiss — which was granted in full in that instance, but left room for Mobley’s attorneys to file an amended complaint — attorneys for Workday argued that the original complaint lacked factual allegations, including specifics about what kind of hiring software Workday offers and how that software is developed and implemented in a discriminatory way. They also argued the complaint failed to make the case that Workday itself can be held liable for the hiring practices of its customers, much less for intentional discrimination by the company.
“Even were the Court able to fashion a coherent challenged practice, Plaintiff provides no detail concerning how the challenged employment practice could possibly cause any unlawful disparate impact. Plaintiff says nothing of how Workday’s products work, what mechanism or mechanisms they use to ‘screen’ applicants, what inputs are relevant, and what outputs the products provide,” attorneys for Workday wrote in their response on July 17, 2023.
“He also says nothing of how Workday’s customers make use of whatever it is he is claiming Workday provides,” they continued. “Without this information, it is impossible for the Court to reach the conclusion that the challenged practice caused the alleged disparate impact.”
In her ruling on the initial motion to dismiss from Workday on Jan. 19, Lin said that while Mobley’s initial complaint could not withstand the arguments in Workday’s motion to dismiss, it would be possible to amend the complaint in such a way that it could move forward in the court.
“At the motion hearing and in his opposition brief, Mobley also identified two other potential legal bases for Workday’s liability: as an ‘indirect employer’ and as an ‘agent,'” Lin wrote in January. “Although the current version of the complaint does not plead those theories or facts supporting them, it appears that Mobley could potentially amend the complaint to do so.”
The amended complaint filed by Mobley’s attorneys includes more details about Mobley’s job application process at the root of the allegations, as well as information about how Workday’s hiring software is implemented, arguing that Workday is an agent liable for discriminatory practices stemming from the use of its software – given that the hiring process would otherwise be conducted by a human agent who could be held liable for discrimination under the same circumstances.
“Using their ‘AI,’ ‘ML’ assessments, tests, and pymetrics to make job recommendations (algorithmic decision-making tools) or control access to jobs (equitable or otherwise), makes Workday an agent for its client-employers,” Mobley’s attorneys wrote in the amended complaint. “Client-employers delegate to Workday certain aspects of the employers’ selection decisions as to Mobley and the putative Class Member.”
At last week’s hearing on Workday’s subsequent motion to dismiss the amended complaint, Lin pointed to a “gap” in existing case law for the “agency” argument made in Mobley’s amended complaint.
“Accepting Workday’s argument, and consequently ignoring this gap, would allow companies to escape liability for hiring decisions by saying that function has been handed over to someone else (or here, artificial intelligence),” Lin wrote in her order on Workday’s second motion to dismiss on July 12.
Lin went on to assert that should the agency argument hold, Workday would be subject to the same anti-discrimination laws that recruiters or human-driven hiring agencies would be.
“Workday’s role in the hiring process is no less significant because it allegedly happens through artificial intelligence rather than a live human being who is sitting in an office going through resumes manually to decide which to reject,” Lin wrote. “Nothing in the language of the federal anti-discrimination statutes or the case law interpreting those statutes distinguishes between delegating functions to an automated agent versus a live human one.”
Lin ordered that Mobley’s attorneys would be allowed to file a second amended complaint within three weeks of the July 12 order, scheduling a virtual case management conference for Aug. 7 at 10 a.m. and a July 31 deadline for case management statements by both parties.

AI algorithms learn from historical data, which may contain biases. If the training data reflects existing inequalities (such as gender and race), the AI system might perpetuate these biases when evaluating job applicants.
AI models consider various features (such as education, work experience, and skills) to make hiring decisions. If certain features are unfairly weighted or if irrelevant features are included, discrimination can occur.
The design of the algorithm itself can introduce bias. For example, if an AI system favors certain qualifications over others without valid reasons, it can lead to discriminatory outcomes.
Some AI models are complex and lack transparency. When decisions are made without clear explanations, it becomes challenging to identify and address discriminatory patterns. If an AI system is used in real-world hiring, its decisions can also shape future training data: when biased hiring decisions are fed back in to retrain the system, existing biases can be reinforced.
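The mechanism described above can be sketched in a few lines of code. This is a deliberately simplified illustration with invented group labels and hiring rates, not a depiction of Workday’s actual software: a toy screening model “learns” from historical records in which equally qualified candidates in one group were hired less often, and then reproduces that disparity in its recommendations.

```python
# Illustrative sketch only: hypothetical data showing how a model trained
# on biased historical hiring records can reproduce that bias.

# Historical records as (group, qualified, hired) tuples. All candidates
# here are equally qualified, but group "B" was hired half as often.
history = (
    [("A", True, True)] * 80 + [("A", True, False)] * 20 +
    [("B", True, True)] * 40 + [("B", True, False)] * 60
)

def learn_hire_rates(records):
    """'Train' by estimating the historical hire rate for each group."""
    rates = {}
    for group in {g for g, _, _ in records}:
        outcomes = [hired for g, _, hired in records if g == group]
        rates[group] = sum(outcomes) / len(outcomes)
    return rates

rates = learn_hire_rates(history)  # {"A": 0.8, "B": 0.4}

def screen(group, threshold=0.5):
    """Recommend candidates whose group's learned rate clears the threshold."""
    return rates[group] >= threshold

# Equally qualified candidates receive different recommendations:
print(screen("A"))  # True  -> recommended
print(screen("B"))  # False -> rejected, replicating the historical bias
```

If the model’s rejections of group “B” candidates were then added back into `history` as new training data, the learned rate for that group would fall further, which is the feedback loop described above.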
AI cannot police itself, and opaque systems can be difficult to hold accountable. The U.S. Equal Employment Opportunity Commission (EEOC) has launched an initiative to guide employers, employees and job applicants in using AI fairly and consistently with equal employment opportunity laws.