Auditors are testing hiring algorithms for bias, but big questions remain


ORCAA and HireVue focused their audit on one product: HireVue's hiring assessments, which many companies use to evaluate recent college graduates. In this case, ORCAA didn't evaluate the technical design of the tool itself. Instead, the company interviewed stakeholders (including a job applicant, an AI ethicist, and several nonprofits) about potential problems with the assessments and gave HireVue recommendations for improving them. The final report is published on HireVue's website, but it can be read only after signing a nondisclosure agreement.

Alex Engler, a fellow at the Brookings Institution who has studied AI hiring tools and who is familiar with both audits, believes Pymetrics’s is the better one: “There’s a big difference in the depths of the analysis that was enabled,” he says. But once again, neither audit addressed whether the products really help companies make better hiring choices. And both were funded by the companies being audited, which creates “a little bit of a risk of the auditor being influenced by the fact that this is a client,” says Kim.

For these reasons, critics say, voluntary audits aren’t enough. Data scientists and accountability experts are now pushing for broader regulation of AI hiring tools, as well as standards for auditing them.

Filling the gaps

Some of these measures are starting to pop up in the US. Back in 2019, Senators Cory Booker and Ron Wyden, along with Representative Yvette Clarke, introduced the Algorithmic Accountability Act, which would make bias audits mandatory for large companies using AI. Congress has not passed the bill.

Meanwhile, there's some movement at the state level. Illinois's Artificial Intelligence Video Interview Act, which went into effect in January 2020, requires companies to tell candidates when they use AI in video interviews. Cities are taking action too: in November, Los Angeles city council member Joe Buscaino proposed a fair hiring motion for automated systems.

A bill under consideration in New York City, in particular, could serve as a model for cities and states nationwide. It would make annual audits mandatory for vendors of automated hiring tools. It would also require companies that use the tools to tell applicants which characteristics their system used to make a decision.

But the question of what those annual audits would actually look like remains open. For many experts, an audit along the lines of what Pymetrics did wouldn't go very far toward determining whether these systems discriminate: that audit didn't check for intersectional bias (whether, say, Black women are screened out at disproportionate rates even when neither Black applicants nor women are overall), and it didn't evaluate the tool's ability to accurately measure the traits it claims to measure for people of different races and genders.
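
To make the intersectionality concern concrete, here is a minimal Python sketch of the kind of check critics have in mind. The group labels and counts are invented for illustration, and are not drawn from either audit; the "four-fifths rule" referenced in the comments is the EEOC's common screening heuristic for disparate impact, not a definitive test of discrimination.

```python
# Hypothetical screening outcomes: (passed, total applicants) per
# intersectional subgroup. All numbers are invented for illustration.
outcomes = {
    ("white", "woman"): (9, 10),
    ("white", "man"):   (8, 10),
    ("Black", "woman"): (6, 10),
    ("Black", "man"):   (10, 10),
}

def selection_rates(groups):
    """Selection rate (passed / total) for each group key."""
    return {g: p / t for g, (p, t) in groups.items()}

def four_fifths_flags(rates):
    """Flag any group selected at under 80% of the best group's rate
    (the EEOC's four-fifths heuristic)."""
    best = max(rates.values())
    return {g: r / best < 0.8 for g, r in rates.items()}

def aggregate(groups, axis):
    """Collapse the intersectional table onto one attribute
    (axis 0 = race, axis 1 = gender)."""
    agg = {}
    for g, (p, t) in groups.items():
        key = g[axis]
        p0, t0 = agg.get(key, (0, 0))
        agg[key] = (p0 + p, t0 + t)
    return agg

# Checking race and gender separately shows no four-fifths violation...
for axis, name in [(0, "race"), (1, "gender")]:
    print(name, four_fifths_flags(selection_rates(aggregate(outcomes, axis))))

# ...but the joint, intersectional check flags Black women specifically.
print("intersectional", four_fifths_flags(selection_rates(outcomes)))
```

In this toy table, neither the race-only nor the gender-only comparison trips the 80% threshold, but the joint check flags Black women. And even a joint check only covers selection rates; a fuller audit would also have to test whether the underlying traits are measured accurately across groups, which no selection-rate arithmetic can show.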