Job seekers increasingly fear that AI hiring systems are screening them out before a human ever evaluates their applications. A new class-action lawsuit in California seeks to force transparency onto AI hiring tools, arguing that automated applicant scores should be regulated like credit checks under consumer protection law. Filed by two STEM professionals, the case highlights growing frustration with AI systems that filter out qualified candidates; according to the World Economic Forum, roughly 88% of companies already use AI for initial candidate screening.

At the center of the lawsuit is Eightfold, an AI HR company that generates candidate match scores from a range of data sources. The plaintiffs argue that these scores constitute “consumer reports” under the Fair Credit Reporting Act (FCRA) and therefore warrant the same legal protections. Eightfold denies the claims have merit, but the suit seeks both compliance with consumer reporting laws and financial damages, stressing that qualified candidates are being unfairly denied opportunities on the basis of automated assessments they never see.