Can you recruit for diversity with machine learning? A provocative chat with HiringSolved

Jon Reed · November 16, 2016
Sometimes PR pitches gone wrong turn right. This one did. A chat on AI recruiting trends turned into a provocative discussion on machine learning ethics. CEO Shon Burton made his case for how HiringSolved helps to recruit diverse candidates - but he didn't duck the difficult AI issues.

One of the best ways to get me into an interview is to tick me off. That's how this piece on AI and diversity began. I got a PR pitch on behalf of HiringSolved.

It was about the benefits of AI to recruiting - but little on the algorithmic discrimination we've covered on diginomica (“You’re not our kind of people” – why analytics and HR fail many good people).

So I asked if the CEO of HiringSolved, Shon Burton, would be up for a hard look at machine-assisted recruiting. He was - and it was on. Before we dig in: HiringSolved is focused on making it "faster and easier to find the right person for any job." They do this by enabling customers to search millions of profiles in "any profession, anywhere in the world."

This is done via their machine-learning-powered "people aggregator," which gathers data from across the web, and filters that information into candidate profiles.

HiringSolved is definitely in growth mode. They recently crossed the 66,000 active user threshold. They are a best-of-breed player in a market dominated by HR cloud suites, powered by 20 employees in Phoenix and New York.

HiringSolved's role in recruiting - "a Google for talent"

At interview time, HiringSolved had just announced the beta launch of RAI, "the first artificial intelligence assistant for recruiting." The goal of RAI? Free up recruiters from the tedium of sourcing candidates.

I know a bit about that tedium. I got my start in enterprise software in 1995 running a recruiting desk, with the constant groan of fax machines in the background. Burton and I had a laugh about that. My gosh, change is accelerating. But: "You'd be surprised. There's still some fax machines out there," Burton told me.

To address the diversity issue, I needed to understand HiringSolved's place in the hiring process. Burton:

The easiest way to describe what we do is "Google for talent". When we started the company, we thought: "What if we had access to all the same data that Google has, which is all the public web data? What if we wrote a software layer on top of that data, so that anything you put into the query, people came out? If you put in 'sous chef' or 'financial analyst' or 'JavaScript developer', instead of what Google does, which is tell you what those things sort of are, what if we just brought out people?"

The next twist? Add internal hiring data:

Once we wrote the software, people started sharing data with us. It's not just public web data anymore. One of our newest approaches is to turn the software we've written inward on a company's internal data sources. That addresses a huge data problem.

The magnitude of candidate records here is big:

If you take a company that's been in business for a while, they may have five or ten million applicants. I'm talking about a bigger company like a Cisco... They'll have interacted with literally five to ten million people over the last twenty years.

The diversity imperative - where should a company start?

But how does diversity fit into the picture? For many companies, diverse hiring is now an imperative:

Diversity's a huge driver in our field today. It's a huge directive in a lot of these companies.

Burton believes companies must start by separating out policies and hiring procedures:

Policy defines what we ought to be doing... Procedure defines how we actually do that.

Procedures are where the platitudes are put to the test:

It's great to say, "We should be hiring diverse candidates"... That's wonderful. It's kind of easy, frankly. Everybody feels good about that one.

To make the procedures stick, you need a big financial investment. Burton cited Google as an example, which committed in 2015 to spending $150 million a year on increasing diversity, planning to boost its share of female engineers up from 17 percent.

Using machines for diverse recruiting

If a company invests in diverse hiring, that starts with a recruiting initiative:

If we need to hire more than 17 percent females in the engineering core, the recruiting order comes down. The new procedure might be: "The next ten engineers that cross my desk - five of them better be female."

Once that order comes down, cue the machines:

That's where software comes in. A machine today, and again - this may be horrifying to people - but a machine today is better at figuring out if someone is male or female than a human.

Credit neural networks and pattern matching:

If you put 1,000 in front of each, the machine's going to be more accurate and orders of magnitude faster. You can start to say something like, "I only want to see female Javascripters." That's something that we do.

Burton's word of caution: "Some people think that that's discrimination, right? We're discriminating against all of the male JavaScripters in that scenario."

Burton uses the phrase "touchy-feely" to describe HiringSolved's goal of an easy, intuitive search UX. It's designed for HR pros who get frustrated with strict search protocols and Boolean logic. HiringSolved avoids the legal quagmire by surfacing all candidates:

Procedurally, if we're going to hire more women JavaScripters, then that's exactly what we have to do. In our world, we try to be as touchy-feely as we can. We don't write it as a filter. If you say, "Show me all the JavaScripters in San Francisco," we say, "There's a million." Then you say, "Just show me the female ones." The number never changes. We say, "There's still a million JavaScripters, but we stack-ranked the feed. The females show on the top of the list."

The legalities of discrimination can be complex. But as Burton explains, HiringSolved isn't narrowing the applicant pool. These aren't formal applicants. These are candidate searches, where casting a wide net is the requirement:

For us, they're not applicants. If you're discriminating in an applicant flow, then you're in trouble. If you're searching, it's actually the reverse. You have to show that you're searching for a diverse workforce. We're able to implement that.
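The rank-don't-filter behavior Burton describes can be sketched in a few lines. This is a hypothetical illustration, not HiringSolved's actual implementation: the `Candidate` fields and the `rank_candidates` function are invented names, and the key point is simply that a boosted attribute re-orders the pool without ever shrinking it.

```python
from dataclasses import dataclass, field

@dataclass
class Candidate:
    name: str
    skills: set = field(default_factory=set)
    inferred_gender: str = "unknown"  # machine-inferred, may be wrong

def rank_candidates(candidates, required_skill, boost_gender=None):
    """Return ALL candidates matching the skill query.

    The boosted attribute is never used as a filter -- it only
    stack-ranks the results, so the pool size stays constant.
    """
    pool = [c for c in candidates if required_skill in c.skills]
    if boost_gender is None:
        return pool
    # Stable sort: boosted profiles float to the top, order otherwise kept.
    return sorted(pool, key=lambda c: c.inferred_gender != boost_gender)

candidates = [
    Candidate("A", {"javascript"}, "male"),
    Candidate("B", {"javascript"}, "female"),
    Candidate("C", {"python"}, "female"),
]

everyone = rank_candidates(candidates, "javascript")
boosted = rank_candidates(candidates, "javascript", boost_gender="female")
# Same pool either way; only the ordering differs.
```

The design choice this illustrates is the legal distinction Burton draws: a filter would narrow the candidate flow, while a stack-rank provably returns the same set regardless of the boost.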

The ethics of automated sorts and recruiting AI

Gender identification is one example of how machines can change recruiting. Burton issued a caution: we all need to grapple with how this tech is used. Thanks to open source, it won't just be large companies wrestling with AI:

As we continue to go down this path, I think the question becomes, "Should we?" Should we implement these things, and how should we do it? Certainly there's tons of ways that it can go wrong.

Burton doesn't think machines can displace humans from the recruiting process anytime soon:

It would be a horrible experience to have to talk to a machine in a recruiting context. To me, those are very human issues. If my wife has a pre-existing medical condition, that's a very personal thing. I don't want to talk to a machine about that.

AI is a design opportunity, mixed with ethical dangers:

It's an interesting time. It's kind of like the 1950s. "Hey, we can build nukes. Should we?" That's kind of where we're at right now. It's a fun time to be a product designer.

My take

Burton recognized that in the wrong hands, machine-powered tools like these can be used for harm as well as good:

Like any tool, it can be used for good or evil. That's the scary part.

I wouldn't trust an AI provider that didn't acknowledge their tools are double-edged. They require transparent use - and some type of regulatory oversight. Our legal system is struggling with the latter; see Facebook's recent about-face on discriminatory capabilities in ad targeting.

I talked with HiringSolved about competing as a best-of-breed in the era of HR cloud suites. They also sent me a short customer view from TrueBlue. Those are topics I can return to. For now, I'd like to keep the focus on the problem of diversity. HiringSolved thinks they have a piece of the AI recruiting puzzle. But when we step back, there are plenty of difficult questions yet to answer. I'll take that over AI sugar-coating any day.
