|Pymetrics uses an AI-based algorithm to remove bias in hiring and make the process more efficient / Credits: LeoWolfert via Shutterstock|
Hiring today relies heavily on artificial intelligence, as recruiters believe it can eliminate bias. Reports show that a candidate's chances of getting the job they applied for are systematically biased: according to the World Economic Forum, an international organization that engages business, political, and academic leaders to shape global, regional, and industry agendas, candidates with African-American names in the US labor market face systematic discrimination.
This is the problem tech startup Pymetrics set out to address: it uses an AI-based algorithm to remove bias from hiring and make the process more efficient. Companies such as McDonald’s, Accenture, and Unilever have already made Pymetrics part of their job application process. Unilever, for instance, used the platform to assess more than 280,000 applications in 68 countries, cutting hiring time by 75%.
According to Fortune, an American business magazine, the startup’s algorithms not only scan a job candidate’s resume but also learn about cognitive and behavioral traits that can determine whether they are suited to the position. Frida Polli, CEO and co-founder of Pymetrics, said the process enables blind hiring, meaning the system collects no demographic information such as gender or ethnicity.
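The blind-hiring idea described above can be sketched in a few lines: demographic fields are stripped from each candidate record before any scoring happens. The field names, trait values, and scoring function below are illustrative assumptions, not Pymetrics' actual pipeline.

```python
# Hypothetical demographic fields a blind-screening step would drop.
DEMOGRAPHIC_FIELDS = {"name", "gender", "ethnicity", "age"}

def blind(candidate: dict) -> dict:
    """Return a copy of the record with demographic fields removed."""
    return {k: v for k, v in candidate.items() if k not in DEMOGRAPHIC_FIELDS}

def score(candidate: dict) -> float:
    """Toy score based only on behavioral/cognitive measurements."""
    traits = candidate.get("traits", {})
    return sum(traits.values()) / max(len(traits), 1)

applicant = {
    "name": "Jane Doe",
    "gender": "F",
    "traits": {"attention": 0.8, "risk_tolerance": 0.6, "memory": 0.7},
}

anonymized = blind(applicant)
# The scorer never sees name or gender; only the trait measurements remain.
print(round(score(anonymized), 2))  # → 0.7
```

The point of the design is that downstream scoring code physically cannot condition on demographics, because those fields never reach it.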
“We have to build algorithms and technology in a way that mitigates those [algorithmic] biases. But I think if we just throw out A.I., or we throw out technology as a solution altogether, we're really throwing away the baby with the bathwater,” Polli said.
However, some experts doubt that AI tools like Pymetrics can accurately assess traits while avoiding bias. Manish Raghavan, a Ph.D. candidate at Cornell University who studies bias in machine learning, argued that such tools inherit the limitations of their training data. “[AI tools] are only as good as the data we give them. Ultimately, if we are feeding them data that we’ve created through our historically biased hiring processes, they are going to struggle to be significantly better than that without intervention and painstaking testing to make sure they're not incorporating our human biases,” he said.
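Raghavan's "only as good as the data" point can be made concrete with a toy example: a model fit to historically biased hiring decisions simply reproduces the disparity. The groups and hire rates below are fabricated for illustration only.

```python
# Fabricated history: equally qualified candidates, but group "A" was
# hired far more often than group "B".
history = [("A", True)] * 80 + [("A", False)] * 20 \
        + [("B", True)] * 30 + [("B", False)] * 70

def fit_hire_rate(data):
    """'Learn' per-group hire rates by counting past outcomes."""
    rates = {}
    for group in {g for g, _ in data}:
        outcomes = [hired for g, hired in data if g == group]
        rates[group] = sum(outcomes) / len(outcomes)
    return rates

model = fit_hire_rate(history)
print(model)  # → {'A': 0.8, 'B': 0.3}: the learned rates mirror the bias
```

Without the intervention and testing Raghavan describes, any predictor trained on such records will favor group "A", not because of merit but because the labels encode past human decisions.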