Do you play netball, have an unusual name, or have you been made redundant during the coronavirus pandemic? If the answer to any of these questions is yes, you could face an uphill battle to secure a job over the next six months, especially if the company you’re applying to uses artificial intelligence for recruitment.
Companies are using flawed historical data sets to train their AI, which means that women, Black people and people of colour could find themselves discriminated against before they’ve made it to the interview room. According to Frida Polli, a former academic neuroscientist at Harvard and MIT, and CEO of Pymetrics, AIs are akin to toddlers in that they learn from the humans around them. “They look at the world and say, ‘I’m gonna learn from that’,” she explains. “AIs are learning from the origins of bias – the human brain.”
Polli argues that companies do not audit their data before training the AI – or in real time when it’s live. “I’d say over 90 per cent of programmers are not auditing their data,” she continues. “Humans are perpetuating bias and are unchecked, resulting in unchecked algorithms.”
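The kind of audit Polli says most programmers skip can be illustrated with a simple check: compare selection rates across demographic groups and flag large gaps. A minimal sketch, using the "four-fifths rule" common in employment-discrimination analysis; the data and function names here are invented for illustration, not drawn from any real system.

```python
# Hypothetical audit sketch: compare the rate at which a screening model
# advances candidates from two groups, using the four-fifths rule.
# All data below is fabricated for illustration.

def selection_rate(outcomes):
    """Fraction of candidates in a group who were shortlisted."""
    return sum(outcomes) / len(outcomes)

def adverse_impact_ratio(group_a, group_b):
    """Ratio of the lower selection rate to the higher one.
    Values below 0.8 are conventionally treated as evidence of
    disparate impact (the four-fifths rule)."""
    low, high = sorted([selection_rate(group_a), selection_rate(group_b)])
    return low / high

# 1 = shortlisted by the model, 0 = rejected (invented example data)
men = [1, 1, 1, 0, 1, 1, 0, 1]      # 75% shortlisted
women = [1, 0, 0, 1, 0, 0, 1, 0]    # 37.5% shortlisted

ratio = adverse_impact_ratio(men, women)
print(f"Adverse impact ratio: {ratio:.2f}")  # 0.375 / 0.75 = 0.50, below 0.8
```

A check this simple can run both before training, on the historical data itself, and in real time on the live system's decisions.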
The coronavirus crisis has already cost hundreds of thousands of jobs. According to the Insolvency Service, employers were planning to make at least 139,000 redundancies in June. In July, out-of-work benefit claims reached 2.7 million, according to the Department for Work and Pensions; 45 per cent of these were a result of people losing their jobs during the pandemic. When furlough support ends in October, companies looking to hire staff could face a tsunami of applicants.
But if they use AI to lighten the load on HR departments, they risk replicating existing bias at scale. Back in 2018, Amazon scrapped a machine learning program it had been using to sift through job applications: trained on historical data shaped by the traditional hiring choices of the past, the system had learned to penalise women. This could happen again if companies do not act quickly.
Job applicants during the pandemic could also be discriminated against for gaps in their CVs caused by redundancies and career breaks, says Raluca Crisan, co-founder of AI bias analysis company ETIQ. “If an AI has a timeline feature, which evaluates the timescale of being at a job, people could be penalised due to redundancy or shorter stints in a job caused by Covid-19,” she explains. “The discrepancies in the data could mean that top talent could be culled, resulting in smaller talent pools of candidates.”
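A "timeline feature" of the kind Crisan describes might look something like the following sketch: a screen that docks points for short stints and employment gaps, which would mechanically penalise anyone made redundant during the pandemic. The thresholds and penalties here are invented assumptions, not taken from any real product.

```python
# Hypothetical timeline-feature sketch (all thresholds invented):
# deduct points for stints under 12 months and gaps over 3 months.

from datetime import date

def tenure_months(start, end):
    """Whole months between two dates."""
    return (end.year - start.year) * 12 + (end.month - start.month)

def timeline_score(jobs):
    """jobs: list of (start, end) date pairs, oldest first."""
    score = 100
    for i, (start, end) in enumerate(jobs):
        if tenure_months(start, end) < 12:
            score -= 20  # short-stint penalty
        if i > 0 and tenure_months(jobs[i - 1][1], start) > 3:
            score -= 15  # employment-gap penalty
    return score

steady = [(date(2015, 1, 1), date(2018, 1, 1)),
          (date(2018, 2, 1), date(2021, 1, 1))]
# Redundancy in March 2020, eight-month gap, then a five-month contract.
made_redundant = [(date(2015, 1, 1), date(2020, 3, 1)),
                  (date(2020, 11, 1), date(2021, 4, 1))]

print(timeline_score(steady), timeline_score(made_redundant))  # 100 65
```

Two candidates with identical skills end up 35 points apart purely because one was made redundant during Covid-19.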
AIs that screen CVs for keywords based on previous hires could also single out demographic groups such as women or younger adults. “There are sports that are only played by women or men,” she says. “If this appears in the hobbies or soft skills section of your CV, the AI could remove you from the application process.”
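A toy illustration of how that happens (invented data, not any real vendor's system): if a model scores new CVs by how often their keywords appeared in past successful applications, and past hires skewed male, then a female-coded hobby like netball drags the score down even when the skills are identical.

```python
# Toy keyword-frequency screen trained on fabricated historical hires.

from collections import Counter

# Past hires skewed male, so male-coded terms dominate the training data.
past_hires = [
    {"rugby", "java", "leadership"},
    {"cricket", "python", "sales"},
    {"rugby", "python", "finance"},
    {"netball", "java", "marketing"},   # one woman among four hires
]

keyword_counts = Counter(kw for cv in past_hires for kw in cv)

def score(cv_keywords):
    """Average historical frequency of an applicant's keywords."""
    return sum(keyword_counts[kw] for kw in cv_keywords) / len(cv_keywords)

applicant_a = {"rugby", "python"}    # matches the historical majority
applicant_b = {"netball", "python"}  # identical skill, different sport

print(score(applicant_a), score(applicant_b))  # 2.0 1.5
```

No one coded "prefer men" anywhere; the bias arrives entirely through the training data, which is Polli's point about AIs learning from the human brain.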
While discussions are happening across all industries to attempt to close the gender gap and foster better representation of BAME demographics, author and political analyst Saurav Dutt says bias still exists in recruitment.
Born in Kolkata, India, and raised in the UK, Dutt has found that something as simple as his name can skew job opportunities. “The more foreign-sounding your name is, the greater the assumption that your English might not be to the same level as a ‘native’,” he explains. “Recruiters have looked for tell-tale clues to validate this assumption such as poor syntax, grammar, broken sentences – elements where a white person would be given the benefit of the doubt.”
Dutt also admits to “whitening” or “white-washing” his CV, changing his name to “Rav”, which he says has brought him more job opportunities, and removing anything linking to India or his religion, Hinduism. “It is more a case of actually including ‘whiter’ skills such as writing political columns, editing, golfing, tennis and running,” he explains. “I also included well-rounded languages such as German or French, as opposed to Bengali and Hindi, which are the languages associated with my upbringing.”
While some companies may not yet have implemented an AI solution for their recruitment drives, Deloitte’s technical director of privacy, Ivana Bartoletti, believes that the pandemic will mean more will do so at “rocket speed”, which she finds concerning.
“Transforming with AI is a complex process that requires checks and balances and proper involvement of employees and workers,” she explains. “In HR, issues related to algorithmic racism and inequality must be taken seriously and companies need to ask themselves what they need and why it is necessary.
“Automated decisions can lock people out of jobs and I am afraid current General Data Protection Regulation (GDPR) legislation is unfit to deal with the problems we face.”