Is Your AI Vendor Violating the Fair Credit Reporting Act?


Class action lawsuit raises questions about use of AI vendors in recruiting

A new California class action raises questions about the use of AI vendors in the recruiting and hiring process. Thank you to Martyn Redstone for bringing this case to my attention with his timely LinkedIn post. The case (copy attached below) alleges that AI recruiting services supplied by Eightfold AI violate the Fair Credit Reporting Act (“FCRA”) and its state law equivalents. The FCRA imposes various obligations upon background check companies and employers who use them. In this case, the Plaintiffs allege that the actions of Eightfold effectively turn it into a background check company, known as a Consumer Reporting Agency (“CRA”) under the FCRA.

When the FCRA applies:

If an employer uses a third party to gather information about a person for the purpose of deciding whether to consider them for employment, the FCRA applies to that employer. Any third party who supplies such information to employers can be a CRA, and use of such a third party will trigger an employer’s FCRA obligations. These obligations include specific statutory disclosures as well as allowing individuals to review and dispute the information obtained before an adverse employment decision is made. Failure to comply can result in substantial financial exposure, whether through class actions brought by prospective employees, government agency fines, or both. And employers may be liable even when the initial non-compliance is that of the CRA.

Courts and the Federal Trade Commission interpret the FCRA to cover any evaluation of a person for a potential employment opportunity by a third party, even if:

• the person has not yet submitted a formal application
• the employer is simply deciding whom to invite to apply
• the employer is building a candidate pipeline
• the employer is screening large volumes of potential candidates

Key allegations in the Eightfold Complaint are as follows:

• The technology may be new, but the practice violates laws that have been on the books since the 1970s because it creates consumer reports to evaluate job applicants without complying with longstanding federal and California requirements. There is no AI-exemption to these laws, which have for decades been an essential tool in protecting job applicants from abuses by third parties—like background check companies—that profit by collecting information about and evaluating job applicants.
• Specifically, this class action arises from Eightfold’s unlawful practice of gathering, assembling, and evaluating information about job applicants through opaque machine learning processes and closely guarded algorithms, producing unreviewable reports that a growing number of employers rely on for employment decisions such as hiring.
• Eightfold generates these reports using AI-powered tools that assemble and evaluate information about prospective employees to determine their “suitability,” purportedly based on factors like work history, projected future career trajectory, culture fit, and other personal characteristics. Eightfold then sells these reports to employers for use in making employment decisions, which can have profound consequences for thousands of people across the country.

The takeaway for staffing and recruiting firms

This is a new frontier of developing law and will need to be followed as the law evolves (sometimes over years). In the meantime, employers should weigh this issue when selecting any HR-related vendor that uses AI. If this theory succeeds, I would expect employers using the services of noncompliant vendors to be next on the target list.