Tuesday, December 23, 2025

The ATS Isn't Blocking You — Option B Is (And Mobley v. Workday Is About to Crack It Open)

Every few months, someone goes viral teaching you how to "beat the ATS." Same myths, recycled endlessly:

  • "The ATS rejected your résumé before a human saw it"
  • "Your formatting broke the system"
  • "Robots are filtering you out"

Let's clear something up.

The ATS is not the villain. Option B is.


Two Layers, One Scapegoat

Most people don't realize there are two layers in modern hiring systems:

Option A: The ATS (Applicant Tracking System)

This is what everyone blames. But here's what it actually does:

  • Stores your résumé
  • Tracks your application
  • Lets recruiters search and route candidates

It's a database with a workflow engine. It's not rejecting you because you used the wrong font.

I've applied through Workday hundreds of times. The ATS parses clean text just fine.

The ATS myth persists because it's comforting. It gives people a technical scapegoat.

Option B: The AI-Driven Screening Layer

This is what almost nobody talks about — and what actually creates risk.

Option B includes:

  • AI scoring and matching
  • AI ranking systems
  • Employer-configured filters
  • Pattern-based recommendations that learn from historical hiring data (a rough sketch of this layer follows)
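
None of this is exotic. Here's a rough, hypothetical sketch of the shape of that layer. Every rule, field name, and weight is invented for illustration; this is not Workday's (or any vendor's) actual code:

    # Hypothetical sketch of Option B's two moving parts: employer-
    # configured knockout filters, then a learned ranking score.
    def passes_knockout_filters(candidate):
        # Configured filters run first, before any human review
        return (candidate["years_experience"] >= 3
                and candidate["employment_gap_months"] < 12)

    def learned_score(candidate):
        # Stand-in for a model trained on historical hires; the weights
        # here are arbitrary, but this is where inherited patterns live
        return 1.0 / (1.0 + candidate["years_experience"] / 10.0)

    def screen(candidate):
        if not passes_knockout_filters(candidate):
            return None  # silently dropped; no recruiter ever sees it
        return learned_score(candidate)

    print(screen({"years_experience": 25, "employment_gap_months": 2}))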

This is where bias creeps in. Where age inference happens. Where disparate impact emerges.

And this is exactly what's being tested in Mobley v. Workday — not résumé formatting. Not templates. Not ATS parsing.


The Filter Nobody Talks About

Here's the part that should terrify every employer using these tools: Option B doesn't just filter for the requirements in the job posting. It filters for "candidates like the ones you hired before."

If your company has historically hired younger candidates, the AI doesn't need an age filter. It just learns the pattern — career length (15 years vs 25 years), job mobility (4 jobs in 10 years vs 2 jobs in 20 years), salary expectations, even LinkedIn activity — and recommends "similar" profiles.
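
You can watch that happen in a dozen lines. A minimal sketch, assuming invented training data in which past hires skewed toward shorter careers (an illustration, not Workday's model):

    # A screener that never sees age can still learn an age proxy.
    # Features: [career_length_years, jobs_in_last_10_years].
    # Labels: 1 = historically hired, 0 = historically rejected.
    from sklearn.linear_model import LogisticRegression

    X = [[8, 4], [10, 3], [12, 4], [15, 3],    # past hires
         [22, 2], [25, 1], [28, 2], [30, 1]]   # past rejections
    y = [1, 1, 1, 1, 0, 0, 0, 0]

    model = LogisticRegression().fit(X, y)

    # Two candidates with identical skills but different career lengths:
    # the 9-year career scores high, the 27-year career scores low.
    print(model.predict_proba([[9, 3], [27, 2]])[:, 1])

Nothing in that code mentions age. Career length does all the work.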

That's not a bug. That's the feature. And it's exactly what Mobley is challenging.

Someone configured those filters. Someone decided what "qualified" looks like. Someone chose which patterns the AI should prioritize. The AI didn't go rogue. A human set the parameters. And when those parameters create discriminatory outcomes, "the algorithm did it" isn't a legal defense.


Why You Can't See Inside the Black Box

Can you request access to see how the AI scored you? Can you audit the historical hiring data that trained the model now ranking you?

Not unless you're part of a class action lawsuit.

The system is opaque by design. Vendors call it a trade secret. Employers call it proprietary. And candidates are left guessing — with no right to know, no right to appeal, and no way to prove the pattern without a legal war.

But Mobley is forcing the issue.

Once a judge orders discovery, "proprietary algorithm" becomes "produce the documents or face contempt." Workday can claim trade secrets all it wants, but trade secret protection doesn't override civil rights enforcement.


What Happens When the Black Box Cracks Open

Here's what should keep every HR tech vendor and every employer using these tools up at night:

If Workday refuses discovery and the judge finds for Mobley anyway — using adverse inference or default judgment — that's not just a loss for Workday.

That's legal precedent every company using Workday's AI just inherited.

Because now:

  • Other plaintiffs can cite Mobley in their cases
  • "We didn't know" stops being a defense
  • The EEOC can systematically target employers that use Workday
  • Employment practices liability insurance premiums skyrocket
  • Plaintiffs' attorneys have a blueprint for suing every other AI hiring vendor

You're on notice. You're liable. And ignorance isn't a defense anymore.

And here's the uncomfortable truth employers will have to face:
Option B didn't invent the bias. It inherited it. The AI learned from who you hired before, who you promoted, who you retained, who you called "culture fit." If the algorithm is biased, it's because the data is biased. And if the data is biased, it's because the organization is biased.

Discovery won't just reveal model weights and feature importance. It will reveal the hiring patterns you never wanted to admit you had. The black box isn't just the algorithm. It's your institutional memory — now documented, timestamped, and admissible in court.

The precedent won't just hit Workday. It'll cascade across every company that bought the tool and every vendor selling similar ones.


The Real Conversation

If we want to fix hiring, we need to stop obsessing over résumé templates and start talking about:

  • AI governance and transparency
  • Disparate impact testing (see the sketch after this list)
  • Vendor accountability
  • Mandatory adverse impact audits
  • Explainability requirements
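
At least one of those audits is arithmetic anyone can run today. A minimal sketch of the EEOC's four-fifths rule, which flags adverse impact when one group's selection rate falls below 80% of the highest group's rate (the counts below are invented):

    # The EEOC four-fifths (80%) rule, a standard adverse impact check.
    # Applicant and selection counts are invented for illustration.
    def selection_rate(selected, applicants):
        return selected / applicants

    rate_under_40 = selection_rate(120, 400)   # 0.30
    rate_over_40 = selection_rate(30, 200)     # 0.15

    ratio = rate_over_40 / max(rate_under_40, rate_over_40)
    print(f"impact ratio: {ratio:.2f}")        # 0.50

    if ratio < 0.8:
        print("Adverse impact indicated: audit the screening layer.")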

The black box is about to crack open. And when it does, the entire AI hiring industry is going to have to answer for what its algorithms have been learning, and from whom.

That's not just a legal risk. That's an industry-wide reckoning.

And it starts with discovery.