Artificial intelligence-driven recruitment platforms discriminate against people who wear glasses or sit in front of bare walls, scientists have warned, urging companies to stop relying on “pseudoscientific” software.
Many companies now use algorithms to sift through large pools of candidates, screening not only for the right qualifications but also for the best personality.
AI recruitment firms claim the programs bypass unconscious human bias and eliminate discrimination while focusing on people who are likely to be conscientious team players.
However, an analysis by Cambridge University revealed that they often discriminate against people on flimsy grounds such as home decor, dress or lighting.
In video interviews, AI programs tended to favor people sitting in front of bookshelves or with art on the wall. They also rated applicants wearing headscarves as less neurotic, while rating those who wore glasses as less conscientious.
“All too often, the hiring process is lopsided and confusing,” says Euan Ong, an algorithm developer at Cambridge University.
“These tools are trained to predict personality based on shared patterns in images of people they’ve seen before, and often end up finding false correlations between personality and seemingly unrelated characteristics of the image.”
The researchers warned that AI is being used unscientifically to infer personality traits from tiny cues such as head tilt, intonation and vocabulary.
Algorithm developers often claim that their programs can recognize the “Big Five” personality traits: conscientiousness, extraversion, openness, neuroticism and agreeableness.
But the researchers point out that people can show conscientiousness or agreeableness in different ways, and there is no universal interpretation of what someone means or intends with a gesture or phrase.
The team, which published its findings in the journal Philosophy and Technology, said using AI to assess personality is “automated pseudoscience” reminiscent of physiognomy and phrenology — the discredited belief that character traits can be inferred from facial features and skull shapes.
“We’re concerned that some vendors will wrap ‘snake oil’ products in shiny packaging and sell them to unsuspecting customers,” said co-author Dr. Eleanor Drage.
“While companies may not act in bad faith, there is little accountability for how these products are made or tested.
“As such, this technology and the way it is being marketed could become a dangerous source of misinformation about how to make hiring ‘unbiased and fairer.’”
A 2020 study of 500 companies across various industries in five countries found that 24 percent had implemented AI for recruitment purposes.
Another survey of 334 HR executives, conducted in April 2020 as the pandemic hit, found that 86 percent of companies were incorporating new virtual technologies into their hiring practices.
Telegraph Media Group Limited 
Source: “How your decor, your clothes and your gestures could be ruling you out of that big job, thanks to ‘unscientific’ bias of AI” — https://www.independent.ie/world-news/how-your-decor-clothing-and-gestures-could-be-ruling-you-out-of-that-big-job-thanks-to-unscientific-bias-of-ai-42052938.html