Will black-box algorithms be the reason you don’t get your next job?

A good example is today’s workplace, where hundreds of new AI technologies are already influencing hiring processes, often without proper testing or notice to candidates. New AI recruitment companies offer to analyze video interviews of job candidates so that employers can “compare” an applicant’s facial movements, vocabulary and body language with the expressions of their best employees. But with this technology comes the risk of invisibly embedding bias into the hiring system by choosing new hires simply because they mirror the old ones.

– Artificial Intelligence—With Very Real Biases
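To make that mirroring concrete, here is a minimal sketch of how such a scoring system could work. Everything in it is invented for illustration, feature names and numbers included; it is not any vendor’s actual method. The key point: candidates are scored by similarity to a template averaged from current “best employees,” so any trait those employees happen to share gets rewarded, whether or not it has anything to do with the job.

```python
# Hypothetical candidate-screening sketch. All feature names, values, and the
# scoring rule are invented for illustration; no vendor's real system is shown.
import numpy as np

# Each row is a (made-up) applicant feature vector:
# [speech_pace, smile_rate, formal_vocabulary, regional_accent_marker]
# Suppose the current "best employees" happen to share the accent marker,
# a trait with no plausible link to job performance.
top_performers = np.array([
    [0.80, 0.60, 0.90, 1.0],
    [0.70, 0.70, 0.80, 1.0],
    [0.90, 0.50, 0.85, 1.0],
])

# The "ideal" template that new applicants get compared against.
ideal_profile = top_performers.mean(axis=0)

def similarity_score(candidate: np.ndarray) -> float:
    """Cosine similarity between a candidate and the incumbent template."""
    return float(
        np.dot(candidate, ideal_profile)
        / (np.linalg.norm(candidate) * np.linalg.norm(ideal_profile))
    )

# Two candidates identical on every job-relevant trait, differing only in the
# irrelevant accent marker:
candidate_a = np.array([0.80, 0.60, 0.90, 1.0])
candidate_b = np.array([0.80, 0.60, 0.90, 0.0])

print(similarity_score(candidate_a))  # ~1.0: mirrors the incumbents
print(similarity_score(candidate_b))  # lower: penalized for not mirroring them
```

Nothing in that code mentions a protected trait, yet the second candidate ranks lower for a reason unrelated to the job. That is how bias gets embedded invisibly: the model rewards resemblance to the past, not competence.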

Beyond bias, we should be asking serious questions about the data these algorithms are built on: what evidence links facial movements, vocabulary, and body language to job performance in the first place?

More from the article above:

“New systems are also being advertised that use AI to analyze young job applicants’ social media for signs of “excessive drinking” that could affect workplace performance. This is completely unscientific correlation thinking, which stigmatizes particular types of self-expression without any evidence that it detects real problems. Even worse, it normalizes the surveillance of job applicants without their knowledge before they get in the door.”

How was this algorithm designed?

Algorithms are everywhere. They make decisions for us, and most of the time we don’t realize it. Remember the United story where a passenger was violently ripped out of his seat? The decision to remove that specific passenger was the result of an algorithm.
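Nobody outside the airline has published the exact rule, but a decision like that typically reduces to a scoring function over passenger records. Here is a purely hypothetical sketch, with invented fields and weights, of what such a bump-selection rule might look like:

```python
# Hypothetical involuntary-bump selection. Real airline systems are
# proprietary; every field and weight below is invented for illustration.
from dataclasses import dataclass

@dataclass
class Passenger:
    name: str
    fare_paid: float             # cheaper tickets get bumped first
    frequent_flyer_tier: int     # 0 = none; higher tiers are more protected
    checked_in_minutes_ago: int  # later check-in gets bumped first

def bump_priority(p: Passenger) -> float:
    """Lower score = bumped first. The weighting here is made up."""
    return p.fare_paid + 500 * p.frequent_flyer_tier + p.checked_in_minutes_ago

passengers = [
    Passenger("A", fare_paid=220.0, frequent_flyer_tier=0, checked_in_minutes_ago=35),
    Passenger("B", fare_paid=480.0, frequent_flyer_tier=2, checked_in_minutes_ago=120),
    Passenger("C", fare_paid=310.0, frequent_flyer_tier=1, checked_in_minutes_ago=90),
]

# The "decision" is just a sort: the lowest-scoring passenger is selected.
selected = min(passengers, key=bump_priority)
print(f"Selected for involuntary removal: {selected.name}")
```

The point is not the specific weights; it is that a very human-looking decision (“you, off the plane”) can be the mechanical output of a formula nobody on board ever sees.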

As more algorithms shape our lives, we must ask: who is designing these algorithms, what assumptions do those designers make, and what are the implications of those assumptions?

So I’m giving a huge shout-out to the podcast 99% Invisible for their episode on how algorithms are designed.

The Age of the Algorithm

Featuring Cathy O’Neil, the author of Weapons of Math Destruction, the episode looks at the subjective data behind algorithms that predict recidivism and reject job applicants. The examples used and questions raised in this episode should have us asking more questions about the people and companies designing the algorithms that run in the background of our online and offline lives.

“Algorithms … remain unaudited and unregulated, and it’s a problem when algorithms are basically black boxes. In many cases, they’re designed by private companies who sell them to other companies. The exact details of how they work are kept secret.”
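When the internals are secret, one of the few checks available is probing from the outside: send the system matched inputs that differ in a single attribute and compare the outputs. Here is a minimal sketch of that idea; the model is an invented stand-in, since the real ones are exactly the black boxes at issue:

```python
# Outside-in "black box" audit sketch. The model below is a made-up stand-in
# for a proprietary scoring API we cannot inspect.

def opaque_model(applicant: dict) -> bool:
    """Stand-in for a vendor's secret scorer (pretend we can't read this)."""
    score = applicant["years_experience"] * 2
    if applicant["zip_code"].startswith("9"):  # a hidden proxy variable
        score -= 5
    return score >= 8

def audit(model, base_applicant: dict, attribute: str, values: list) -> dict:
    """Hold everything else fixed, vary one attribute, record each decision."""
    return {v: model(dict(base_applicant, **{attribute: v})) for v in values}

base = {"years_experience": 5, "zip_code": "60601"}
print(audit(opaque_model, base, "zip_code", ["60601", "90210"]))
# {'60601': True, '90210': False} -- identical applicants, different outcomes:
# evidence that the black box keys on zip code (or whatever zip code proxies).
```

Paired probes like this are one way outside auditors can detect disparate impact without ever seeing the code, and they only work when someone bothers to look.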