Between 2014 and 2017, e-commerce giant Amazon tried using an Artificial Intelligence (AI) tool to rate job candidates much as it rates the products in its online catalog, giving each applicant a score of one to five stars, as reported by Reuters.
One year into the deployment of the AI tool, the programmers realized that their ‘holy grail’ recruitment tool was biased against women. The system was not judging all job candidates on their potential: while male applicants were assessed fairly, female applicants consistently received lower ratings.
The problem of Garbage In, Garbage Out (GIGO)
An in-depth study of why Amazon's recruitment AI consistently favored male candidates while dismissing female ones reveals that the problem was not in the algorithm itself, but in the data fed into it, which had taught the system to distinguish between male and female applicants.
An AI system learns from the data it has been given. To teach an AI to distinguish between a rat and an elephant, it must first be shown many labeled examples of both.
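The garbage-in, garbage-out effect can be sketched with a toy scorer. The data, keywords, and numbers below are entirely hypothetical; the point is only that a model trained on historically skewed hiring decisions will reproduce that skew when scoring new applicants.

```python
# Toy illustration of "garbage in, garbage out": a scorer trained on
# synthetic, biased hiring records simply learns to replicate the bias.
from collections import Counter

# Hypothetical past hiring decisions: (resume keyword, was_hired).
# The historical data strongly disfavors one keyword.
history = (
    [("chess_club", True)] * 40 + [("chess_club", False)] * 10 +
    [("womens_chess_club", True)] * 5 + [("womens_chess_club", False)] * 45
)

def train(records):
    """Estimate P(hired | keyword) directly from past decisions."""
    hired, total = Counter(), Counter()
    for keyword, was_hired in records:
        total[keyword] += 1
        hired[keyword] += was_hired  # True counts as 1
    return {k: hired[k] / total[k] for k in total}

scores = train(history)
print(scores)
# {'chess_club': 0.8, 'womens_chess_club': 0.1}
```

The "algorithm" here is trivially correct arithmetic, yet its output is biased, because the historical decisions it learned from were biased. Fixing the model without fixing the data cannot remove the skew.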