Job Applicants Unite to Crack Open Mysterious AI Systems That Treat Résumés Like Bad Credit

For many job seekers, the first step in applying is often getting past an AI system that screens their résumés and decides whether they are a good fit.

The process resembles the way credit bureaus assign people a numerical score based on their financial and borrowing history.

Now, a group of job seekers has filed a lawsuit arguing that some AI tools for screening job applications should follow the same rules that credit companies must follow under the Fair Credit Reporting Act. The goal of the lawsuit is to force AI companies to share more details about what information they collect on applicants and how they rank them.

The lawsuit is against a company called Eightfold AI. This company sells its tech to help employers save time and money. It uses sources like LinkedIn to build a big data set. Eightfold says this set includes more than “1 million job titles, 1 million skills, and the profiles of more than 1 billion people working in every job, profession, industry, and geography.”

When someone applies for a job, Eightfold's software looks at their skills and what the employer needs. Then it gives the applicant a score from one to five.

Job seekers say this tool can act like a gate that stops them from reaching a real human hiring manager. It gives no feedback on scores or how they were made. If the tool makes errors, people can't fix them.

“I think I deserve to know what’s being collected about me and shared with employers,” Erin Kistler, one of the plaintiffs in the lawsuit, said in an interview. “And they’re not giving me any feedback, so I can’t address the issues.”

Ms. Kistler has a computer science degree and many years of experience in tech. In the past year, she applied to thousands of jobs and kept careful track. Only 0.3 percent led to a follow-up or interview. Some of her applications went through Eightfold's system.

A spokesperson for Eightfold, which is based in Santa Clara, California, did not reply to requests for comment.

The lawsuit was filed in Contra Costa County Superior Court in California. It is one of the first efforts to challenge how AI is used in hiring. Employers and their lawyers think more challenges like this will come.

David J. Walton is a lawyer in Philadelphia who helps employers with AI issues. He is not part of this lawsuit. He said companies might argue that these tools are not the same as credit scoring. The hiring software could be seen as just sorting applicants into groups of good fits and not-so-good fits, like a human recruiter would do.

Still, Mr. Walton said that as companies use AI in new ways, they often operate in unclear legal territory. This is especially true for data privacy and for technology that may treat people unfairly, even unintentionally.

“These tools are designed to be biased. I mean, they’re designed to find a certain type of person,” he said. “So they are designed to be biased but they’re not designed to improperly be biased. And that’s a very fine line.”

Ms. Kistler's lawsuit was filed by Outten & Golden and Towards Justice, a nonprofit law firm in Denver. They got help from former lawyers at the Consumer Financial Protection Bureau and the Equal Employment Opportunity Commission. This case uses a new way to fight AI tech.

It is one of the first to use credit reporting laws to protect job applicants from "black box" decisions. This means decisions where the applicant doesn't know why they were turned down.

Congress passed the Fair Credit Reporting Act in 1970. This was soon after credit companies started using computers to collect personal info and turn it into scores. To protect people from mistakes in those records, the law requires companies to share the info with consumers and let them fix errors.

The law covers more than just credit. It defines a "consumer report" as any collection of info on someone's "personal characteristics" used to decide if they qualify for financial services or jobs.

“There is no A.I. exemption to our laws,” said David Seligman, executive director of Towards Justice. “Far too often, the business model of these companies is to roll out these new technologies, to wrap them in fancy new language, and ultimately to just violate peoples’ rights.”

Mr. Seligman argues that the law requires Eightfold and the companies using its technology to tell applicants what data is collected, and to let them challenge and correct inaccurate information. The complaint seeks class-action status. It asks for unspecified monetary damages and a court order requiring Eightfold to comply with state and federal consumer reporting laws.

Other lawsuits have accused AI systems of violating federal and state anti-discrimination laws. A prominent case, filed in 2023 against Workday in federal court in San Francisco, says Workday's system, another widely used applicant-screening tool, discriminates against older people, people with disabilities, and Black applicants.

Judge Rita F. Lin turned down Workday's request to throw out the case. She found that the plaintiffs' evidence—like one applicant getting rejected at 1:50 a.m., less than an hour after applying—suggests the AI tools might reject people based on things like race, age, or disability, not just skills.

In May, she gave early approval for the case to go forward as a group action that could include millions of rejected applicants. A Workday spokesperson said the claims are not true. “Workday’s A.I. recruiting tools are not trained to use — or even identify — protected characteristics like race, age, or disability,” the company said in a statement.

In 2024, the Consumer Financial Protection Bureau issued guidance saying that files and scores created for hiring decisions are covered by the Fair Credit Reporting Act, and that the companies producing them count as consumer reporting agencies under the law. Such guidance signals to companies how regulators intend to enforce the rules.

Under President Trump, the bureau changed its view. Russell T. Vought, the acting director—who has tried to weaken and shut down the agency—withdrew the guidance memo in May.

Lawsuits against companies usually take years, and the case against Eightfold probably won't move quickly. But the underlying concerns have been building for a long time.

Jenny Yang was chair of the Equal Employment Opportunity Commission during the Obama administration. She is one of the lawyers for the plaintiffs. She said the commission started looking at algorithmic hiring systems more than 10 years ago.

“We realized they were fundamentally changing how people were hired. People were getting rejected in the middle of the night and nobody knew why,” she said.
