The Very Real Dangers of AI — and How HR Tech Vendors Can Fight Them

After working with many organizations in the recruitment artificial intelligence (AI) space, one thing I’ve realized is that HR and talent acquisition pros are not necessarily welcoming these tools with open arms. In fact, many of them are actively against using AI for recruiting and hiring until some of their fears are addressed.

One commonly cited fear is that AI will cost recruiters their jobs, but that isn’t really the top concern for most hiring professionals. Rather, these pros are mainly worried about a number of other, well-founded fears, which we’ll address below.

Bias in the Algorithm

Wait, wasn’t AI supposed to remove bias from the hiring process? In a perfect world, yes, but we don’t live in a perfect world. Who creates the algorithms that power AI? People do — which means all our preconceived notions about gender, race, and socioeconomic status can totally make it into the batter. Even one rotten egg can ruin a cake.

How to Fight This

Obviously, people aren’t going to disappear anytime soon. What we can do, however, is adopt a set of anti-bias recruiting-industry standards that apply to AI tools across the board. Organizations should also use the purchase of an AI tool as a chance to reevaluate their recruiting practices for any signs of bias.

Vendors of AI tools should partner with bias-busting services or platforms to ensure their algorithms are really neutral. They can also help lead the charge on creating ethical coding standards and codes of conduct for programmers and employers using AI.

Candidate FOMO

In a tight labor market, the fear of missing out (FOMO) on candidates is a thing. The very same candidate you shortlisted but did not choose last time around could be the candidate you would die to have today. However, AI’s strict algorithms could mean that degree requirements, felonies, drug tests, and skills tests weed out candidates who could, with a little spit and polish, be great for the role.

How to Fight This

Start by revisiting your job requirements. Some are just plain dumb. A four-year degree is only truly necessary for a few white-collar jobs these days, so why put it there? If you have an AI-powered assessment filtering candidates, why the heck are you worried about unqualified folks applying? Get rid of conviction histories, drug tests, education requirements, and unrealistic experience levels as disqualifying criteria. I have only worked with one person over the last few years who didn’t have to cut some unnecessary requirements from his job ads — and he was recruiting eye surgeons. Are you?

Vendors, you don’t have to be just salespeople. At this stage in the game, AI recruitment platform sales are consultative anyway, so help your clients audit their processes and forever transform the way they hire.

The Same Old Thing, Over and Over

The way we did things before cannot be the way we do things now. When you grab a shiny new AI tool, it’s tempting to plug it into your existing processes and keep going. However, that’s a bad use of AI.

It’s time to stop using the same hiring practices we’ve always used simply because we’ve always used them. For many smaller companies, this is a no-brainer, but the longer an organization has been around, the harder it can be to overcome the inertia.

How to Fight This

Past practice is the bane of change management. Find yourself a champion at the top of the organization who can make real changes to the way you’ve always done things.

Many vendors aim to integrate with other, more commonly used systems, but if employers are going to make changes, your new mandate is to win over the change-makers. Focus on creating content and processes that help organizations make the technology change, since your clients will have to make the people and process changes anyway.

It’s a Crutch

We can’t talk about AI without discussing what it means for the critically thinking mind of the recruiter, hiring manager, or HR professional. It makes no sense to use AI if your recruitment team doesn’t understand the basics of its own recruiting process.

One common vendor marketing tactic — one I’ve used myself — focuses on the fact that AI frees up recruiters to do more strategic work. If the recruiting team doesn’t even fully understand its recruiting processes, how can it possibly make smart strategic decisions about them? Put a different way, if every candidate is being engaged, screened, and ranked by a chatbot, what’s the point of a recruiter?

How to Fight This

This is on you, I’m afraid. There are lots of great resources for recruiters and sourcers who want to be better at their crafts. Social Talent and SourceCon Academy are two off the top of my head, but there are plenty of others as well. There’s really no excuse not to invest in becoming the kind of recruiter who can smartly leverage AI instead of the kind of recruiter who just lets AI do what it will.

Vendors can support learning communities and create spaces for them to flourish. Sponsor a webinar, hold an online conference, or co-present with a customer who’s doing it right. There are lots of ways to support excellence in recruiting and sourcing.

Privacy and Control

Lots of job seekers who are interacting with chatbots today have no idea their information is being recorded during these conversations. It raises an important question: Are we obligated to disclose the data we collect via artificial intelligence to job seekers?

It’s a valid question, and one the General Data Protection Regulation (GDPR) went some way in answering. However, I think we all know we’re still not in the clear. For instance, some algorithms claim to predict candidates’ skills, intelligence levels, and personalities by scanning Facebook or LinkedIn profiles. Is this data under the job seeker’s control? What if the candidate is passive?

How to Fight This

Don’t use those tools unless the person is actively applying for a job at your company and you’ve informed them you are using said tools. Feel like that disclosure will adversely affect applications? Then don’t use the tools at all.

Vendors that provide these tools should consider whether they’re solving a problem or creating one. If it’s the former, make your case in your sales literature and educate your clients on the proper use of products that could infringe on the privacy rights of candidates. If it’s the latter — well, why?

AI-enabled recruiting tools don’t have to be the end of recruiting excellence or a privacy nightmare. In fact, they can help us do our jobs better, faster, and with more attention to candidates’ needs. Vendors and practitioners alike just have to be aware of the hurdles AI poses and plan for them.

A version of this article originally appeared on the Red Branch Media blog.

Maren Hogan is founder and CEO of Red Branch Media. You can read more of her work on Forbes, Business Insider, Entrepreneur, and her blog, Marenated.

By Maren Hogan