Understanding hiring algorithms: Career coaching in 2019

“Today, hiring technology vendors increasingly build predictive features into tools that are used throughout the hiring process. They rely on machine learning techniques, where computers detect patterns in existing data (called training data) to build models that forecast future outcomes in the form of different kinds of scores and rankings.” (Help Wanted: An Examination of Hiring Algorithms, Equity, and Bias)
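To make the quoted idea concrete, here is a minimal sketch of "detecting patterns in training data to forecast outcomes." All of the data, the feature name, and the hire rates below are hypothetical, and real vendor models are far more complex and opaque; the point is only that the model echoes whatever patterns (and biases) the past data contains.

```python
from collections import defaultdict

# Hypothetical training data: past applicants, one feature, and the outcome.
training_data = [
    {"major": "cs", "hired": True},
    {"major": "cs", "hired": True},
    {"major": "cs", "hired": False},
    {"major": "history", "hired": True},
    {"major": "history", "hired": False},
    {"major": "history", "hired": False},
]

# "Learning" here is just the historical hire rate per feature value.
counts = defaultdict(lambda: [0, 0])  # major -> [hires, total]
for row in training_data:
    counts[row["major"]][0] += row["hired"]
    counts[row["major"]][1] += 1

def predict(applicant):
    """Forecast a score for a new applicant from past hire rates."""
    hires, total = counts[applicant["major"]]
    return hires / total if total else 0.0

print(predict({"major": "cs"}))       # scores simply mirror past decisions
print(predict({"major": "history"}))
```

If past hiring favored one group, this toy "model" dutifully reproduces that preference, which is exactly the bias-amplification concern the report examines.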

The hiring process is changing faster than most people realize. Automated tools that use machine learning to source, analyze, and rank candidates are already integrated into the hiring process, sometimes without candidates knowing it. As we move into 2019, the adoption of predictive hiring tools shows no sign of slowing down.

Career coaches and university career services departments have a responsibility to understand these hiring algorithms and their impact on job seekers. They need to create new strategies and update career workshops to help job seekers navigate hiring algorithms. (Spoiler alert: my career workshops cover this.)

If you don’t work in tech, artificial intelligence and machine learning can seem like intimidating topics. To make matters worse, hiring algorithms are a bit of a black box. The shift to an automated hiring process is happening behind the scenes. It’s hard to figure out which companies use this technology and exactly how these tools work. It’s even harder to know which automated tools rely on questionable data or cement bias into the hiring process.

Luckily, there’s a new report to help career coaches get up to speed on new HR technology and hiring algorithms. The report, Help Wanted: An Examination of Hiring Algorithms, Equity, and Bias, was produced by the nonprofit Upturn. Upturn’s mission is to promote “equity and justice in the design, governance, and use of digital technology,” and that mission shines through in the report.

The report offers career coaches a comprehensive review of new hiring technology. It’s well written and accessible, and, more importantly, it isn’t filled with the marketing promises that dominate HR tech press. Instead, it dives deep into the issues and impacts that hiring algorithms can have on the hiring process, including the ways bias can be baked into these tools. It’s a refreshing piece in a sea of HR tech coverage that relentlessly praises new hiring technology as efficient and transformative, often ignoring the impact on candidates.

Honestly, I could fill this post with quotes from the report. But I’ll stick to these two bits from the executive summary. If you don’t have the time to read the report, at least read the executive summary. Then carve out 30 minutes to read the rest before 2019. Consider this your last professional development activity of 2018.

Hiring is rarely a single decision point, but rather a cumulative series of small decisions. Predictive technologies can play very different roles throughout the hiring funnel, from determining who sees job advertisements, to estimating an applicant’s performance, to forecasting a candidate’s salary requirements.

Hiring tools that assess, score, and rank jobseekers can overstate marginal or unimportant distinctions between similarly qualified candidates. In particular, rank-ordered lists and numerical scores may influence recruiters more than we realize, and not enough is known about how human recruiters act on predictive tools’ guidance.
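The second quoted passage, about rank-ordered lists overstating marginal distinctions, can be sketched in a few lines. The weights, features, and candidate data below are entirely hypothetical, not any vendor's actual model; the point is that two nearly identical candidates still land in a strict rank order.

```python
def score(candidate, weights):
    """Weighted sum of features -- a toy stand-in for a trained scoring model."""
    return sum(weights[f] * candidate[f] for f in weights)

# Hypothetical weights a tool might assign to candidate features.
weights = {"years_experience": 0.5, "skills_match": 0.4, "assessment": 0.1}

candidates = {
    "A": {"years_experience": 5, "skills_match": 0.90, "assessment": 0.80},
    "B": {"years_experience": 5, "skills_match": 0.88, "assessment": 0.82},
    "C": {"years_experience": 2, "skills_match": 0.95, "assessment": 0.90},
}

# The ranked list presents a crisp ordering even where scores barely differ.
ranked = sorted(candidates, key=lambda n: score(candidates[n], weights), reverse=True)
for name in ranked:
    print(name, round(score(candidates[name], weights), 3))
```

Candidates A and B differ by a fraction of a point, yet a recruiter skimming the ranked list sees A strictly above B, which is exactly the "marginal distinctions" problem the report flags.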

After you read the report, go further and read How to learn about ML/AI as a non tech person.