Mobley v. Workday: A Wake-Up Call for AI Accountability in Hiring
In 2023, a groundbreaking lawsuit began challenging how artificial intelligence is used in hiring. Derek Mobley, a Black man over 40 with a disability, applied for more than 100 jobs and never received a single interview. He suspected the problem wasn't his qualifications but an invisible gatekeeper.
That gatekeeper was Workday, one of the most widely used HR software providers in the U.S. Its AI-driven tools promise to streamline hiring by automatically screening resumes, ranking candidates, and managing application flows. Mobley’s lawsuit alleges those tools did something far more harmful: systematically filtering out applicants like him before their resumes ever reached a human.
The case poses a critical question: Can a tech company be held legally responsible for discrimination if it isn't the employer, but builds the tools employers use to decide? Traditionally, anti-discrimination laws such as Title VII of the Civil Rights Act, the Americans with Disabilities Act (ADA), and the Age Discrimination in Employment Act (ADEA) target employers, not software providers. Mobley argues that Workday's AI effectively acted as the decision-maker, making the company liable as an agent of the employers that use its tools.
In a major procedural step, the court allowed parts of the case to move forward in 2024, reasoning that Workday could potentially be held liable as an agent of its employer clients. If Mobley prevails, it could open the door to lawsuits against other hiring platforms like Greenhouse, Lever, and iCIMS, and pressure lawmakers to demand more algorithmic transparency and fairness. Even if the case settles, it marks a turning point: courts are beginning to treat AI bias as a legal reality, not just a hypothetical risk.
Where the TJAAA Comes In
The Truth in Job Advertising and Accountability Act (TJAAA) is designed to close the very gaps in accountability and transparency that make cases like Mobley’s so difficult to adjudicate.
Here’s how the TJAAA would have changed the game:
Clear Third-Party Liability – The Act explicitly classifies job platforms and AI providers as “Posting Mediums” or “Authorized Representatives” when they post listings, collect candidate data, or participate in screening. This would eliminate ambiguity about whether companies like Workday can be held accountable.
Mandatory AI Disclosures – Employers and their representatives would have to notify applicants if AI tools screened them out, disclose the criteria used, and make those processes auditable for bias.
Proactive Oversight – The Department of Labor could investigate suspicious rejection patterns before lawsuits become necessary, potentially stopping discriminatory screening before it harms applicants.
By updating definitions, responsibilities, and reporting standards, the TJAAA modernizes outdated hiring laws for the AI era—making it harder for bias to hide inside “neutral” algorithms.
Why This Matters Now
Mobley v. Workday shows us the stakes: outdated laws written for human decision-makers don’t fit a world where machines increasingly control access to jobs. Without transparency, job seekers may never know why they were rejected—or even that a machine made the decision.
The TJAAA ensures that fairness, visibility, and accountability apply to all players in the hiring process, whether human or machine. It’s not just about protecting job seekers—it’s about restoring trust in the hiring market itself.
If we want to prevent the next Mobley case, we need to act now.
Read our full article here.