Hiring Decisions Are Human Decisions: When It Comes to AI in Recruiting, Use Your Judgment

Recently, an application of artificial intelligence (AI) in recruiting received some attention, much of it negative. This particular application analyzes video interviews, judging — and rejecting — candidates based on facial movements, speaking voice, choice of words, and other factors.

While automation that helps recruiters source and evaluate talent is a welcome development, we must proceed with some caution. We need to draw clear lines between where AI helps us and where we need to assert our human judgment.

That said, the attention on this particular application threatens to obscure the more productive and positive AI innovations that are genuinely helping recruiters hire more top-tier talent.

Rather than relying on AI to make qualitative judgments, we can — and should — use AI primarily to free up human recruiters to apply their own judgment. Where AI for recruiting really shines is in the realms of sourcing, mitigating bias, and automating communications.

AI Should Help Candidates Stand Out, Not Fade Into the Background

Sourcing talent isn’t exactly like finding a needle in a haystack — it is more comparable to panning for gold. Locating the treasure means sifting through many seemingly identical grains of sand to find the perfect hire.

When it comes to resumes and candidate profiles, AI can do that sifting at scale, using preset parameters and key terms to surface positive matches from an otherwise unmanageable pile of candidates. By the time the pan gets to the recruiter’s hands, the gold has already been separated out, and the recruiter can move more quickly to make further assessments.

Video assessments can be useful in sourcing, but they are only one small piece of a rich set of AI features that can help staffing firms make positive matches. One practical application already in use is natural language search, which finds best-fit candidates semantically by grouping like terms, allowing recruiters to uncover candidates whose resumes and skills do not match the exact search keywords. For example, a recruiter looking for a Java developer may want results that include related terms such as J2EE, Java EE, or Jakarta EE, each of which indicates a specific skill within the original search. This kind of AI application serves the intent of a candidate search, rather than the letter.
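To make the idea concrete, here is a minimal sketch of how grouping like terms might work under the hood. The term groupings, function names, and data are illustrative assumptions, not a description of any particular product:

```python
# Illustrative sketch: expand a search term with related skill terms so that
# candidates who describe the same skill in different words are not missed.
# The synonym groups below are hypothetical examples.

RELATED_SKILLS = {
    "java": {"java", "j2ee", "java ee", "jakarta ee"},
    "javascript": {"javascript", "ecmascript", "node.js"},
}

def expand_query(term: str) -> set:
    """Return the search term plus any related terms we know about."""
    return RELATED_SKILLS.get(term.lower(), {term.lower()})

def matches(candidate_skills: list, query: str) -> bool:
    """True if any of the candidate's listed skills fall within the expanded query."""
    expanded = expand_query(query)
    return any(skill.lower() in expanded for skill in candidate_skills)

# A resume listing "Jakarta EE" still surfaces for a "Java" search.
print(matches(["Jakarta EE", "SQL"], "Java"))  # True
```

The point of the sketch is simply that the search serves intent rather than exact wording, which is what lets strong candidates surface instead of being filtered out by vocabulary.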

AI Should Mitigate Bias Rather Than Reinforce It

One concern about video assessments is that, used improperly, they could reinforce bias rather than mitigate it. This possibility is under intense scrutiny in law enforcement circles, with a recent study from the UK noting that algorithmic bias could lead to “discrimination on the grounds of protected characteristics” and “outcomes and processes which are systematically less fair to individuals within a particular group.”

This fear (along with privacy concerns) is likely part of what motivated the state of Illinois to pass the Artificial Intelligence Video Interview Act, which goes into effect January 1, 2020, and requires employers to notify candidates, obtain their consent, and explain how the AI works in the hiring process.


Rather than using AI to find reasons to disqualify candidates, recruiters would be better off using it to find ways for candidates to stand out through their skills, former roles, and other positive attributes. Applying AI in this way can help avoid the unconscious bias that even the most well-intentioned recruiters might bring into evaluating talent.

Algorithms that sort candidates without using indicators of race, gender, age, or other demographic factors can surface the candidates who are often most detrimentally affected by bias. Even then, the humans implementing the AI need to be careful not to bake bias into the algorithm itself, in which case the effort is self-defeating.
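As a simple illustration of what keeping demographic indicators out of the sorting step might look like, consider the following sketch. The field names are hypothetical, and this is a toy example rather than a recommended compliance approach:

```python
# Illustrative sketch: strip demographic fields from a candidate record before
# it is passed to any ranking or matching step. Field names are hypothetical.

PROTECTED_FIELDS = {"name", "age", "gender", "race", "photo", "date_of_birth"}

def redact(candidate: dict) -> dict:
    """Return a copy of the candidate record without protected attributes."""
    return {k: v for k, v in candidate.items() if k not in PROTECTED_FIELDS}

candidate = {
    "name": "Jane Doe",
    "age": 42,
    "skills": ["Java", "Jakarta EE"],
    "years_experience": 12,
}
print(redact(candidate))  # {'skills': ['Java', 'Jakarta EE'], 'years_experience': 12}
```

Removing explicit fields is not sufficient on its own, since other attributes can act as proxies for them; that is exactly the risk of baking bias into the algorithm mentioned above.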

This is not to say the video assessment referenced at the top of this article is abetting discriminatory hiring practices. Judging word choice and facial expressions may not necessarily cross that line, but any type of bias that removes qualified candidates from the pipeline can harm the recruiting process.

AI Should Automate Communications Without Cutting Recruiters Out

One of the key benefits of AI in recruiting is the chance to automate necessary administrative tasks recruiters must often perform in great quantity. No aspect of hiring is more ripe for this than communication.

Chatbots can engage and vet candidates initially. Recruiters can set up personalized emails, text messages, and other notifications to automatically update candidates at each stage of the recruiting process. With AI handling all of these communications, recruiters can use their newfound time to focus on the relationships they need to build to successfully find and attract the right talent.
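Here is a minimal sketch of what templated, stage-based candidate updates might look like. The stage names, templates, and notify function are assumptions for illustration; a real system would hand the message off to an email or SMS service:

```python
# Illustrative sketch: templated status updates sent automatically as a candidate
# moves through hiring stages. Stage names and templates are hypothetical.

STAGE_TEMPLATES = {
    "applied": "Hi {name}, thanks for applying for the {role} role. We're reviewing your application.",
    "interview": "Hi {name}, we'd like to schedule an interview for the {role} role.",
    "offer": "Hi {name}, good news about the {role} role: an offer is on its way.",
}

def notify(name: str, role: str, stage: str) -> str:
    """Fill in the template for the candidate's current stage and return the message."""
    return STAGE_TEMPLATES[stage].format(name=name, role=role)

print(notify("Priya", "Java Developer", "interview"))
```

Automating this kind of routine update is what frees recruiters to spend their time on the conversations that actually require judgment.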

Hiring decisions are human decisions. While AI can help recruiters make more of these decisions with better results at a faster rate, it cannot replace human hiring professionals entirely.

At its best, AI solves problems for recruiters, giving them more time to nurture relationships and exercise good judgment — and we must use that judgment to determine when technology is actually solving problems rather than simply exacerbating them.

Matt Fischer is president and CTO of Bullhorn.
