Imagine for a moment that you are a recruiter. You have looked at the CV of the candidate who has just entered the room, and even though the interview has not happened yet, you are already bound to reach a conclusion based on your perception: you like him because of something he loves, or the opposite for the same reason, or it reflects well or badly on that person. More than 50 different hiring biases can tip the balance one way or another, and if artificial intelligence (AI) is trained on the same parameters, the tool will inevitably inherit them.
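As a rough illustration of that inheritance (a hypothetical sketch, not the project's code), the following Python snippet trains a simple model on invented historical hiring decisions that penalised one group; the model then scores two candidates with identical skill differently, purely because of group membership.

```python
# Hypothetical sketch: a model trained on biased historical hiring
# decisions simply reproduces that bias. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5000
skill = rng.normal(size=n)               # genuinely job-relevant signal
group = rng.integers(0, 2, size=n)       # 0 / 1: a protected attribute
# Historical decisions: same skill bar, but group 1 was penalised.
hired = (skill - 0.8 * group + rng.normal(scale=0.3, size=n)) > 0

model = LogisticRegression().fit(np.column_stack([skill, group]), hired)

# Two candidates with identical skill, differing only in group membership.
same_skill = np.array([[0.5, 0], [0.5, 1]])
print(model.predict_proba(same_skill)[:, 1])  # group 1 gets a lower score
```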
That is the starting point of IA+Igual, a project born in the field of innovation to contribute to the definition of an algorithm audit model that ensures these tools operate ethically and reliably. There are three companies behind the name: the technology company IN2, the communication agency CVA and the Observatory for Human Resources (ORH). The initiative will be developed over the next two years, until June 2025, and has received a subsidy of €999,370.91 from the Community of Madrid, together with European funds from the Next Generation plan, to make this happen.
During this period, IA+Igual will develop and implement the project in three complementary phases. It is coordinated by a multidisciplinary council made up of experts ranging from history and psychology to trade unions and data protection, in order to fine-tune its results as much as possible.
In the first phase, piloted by IN2 and currently under way, the team will analyze the artificial intelligence tools used by 10 companies, including some listed on the Ibex 35, in order to build a theoretical framework. Then the audits of these tools will come into play, and ORH will create training content for human resources personnel.
Finally, a solution will be offered for this technology's main stumbling block: misinformation. CVA will be responsible for communicating, informing and raising awareness about how the algorithms designed to improve the human resources field work, what benefits they bring and how to manage them. The project will culminate in a white paper containing recommendations for an algorithm certification model.
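To give a sense of what one check inside such an audit could look like (purely a hypothetical sketch; the project's actual model and certification criteria are not described here), the snippet below computes selection rates per group and applies the widely used "four-fifths rule" threshold to flag possible adverse impact.

```python
# Hypothetical illustration of one audit check: compare selection rates
# across groups and flag results that fail the four-fifths rule.
# Group names, data and the 0.8 threshold are assumptions for this sketch.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, selected) pairs -> rate per group."""
    totals, picked = defaultdict(int), defaultdict(int)
    for group, selected in decisions:
        totals[group] += 1
        picked[group] += int(selected)
    return {g: picked[g] / totals[g] for g in totals}

def passes_four_fifths(decisions, threshold=0.8):
    rates = selection_rates(decisions)
    return min(rates.values()) / max(rates.values()) >= threshold

sample = [("A", True)] * 40 + [("A", False)] * 60 + \
         [("B", True)] * 20 + [("B", False)] * 80
print(selection_rates(sample))     # {'A': 0.4, 'B': 0.2}
print(passes_four_fifths(sample))  # False -> flag for human review
```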
Boss’s co-pilot
Although the idea was born in the healthcare field five years ago, the focus is now on human resources. “With good use of artificial intelligence, the human resources department can dedicate itself to generating value as a strategic partner of the business,” explains Maite Sáenz, managing partner of ORH.
And if it is not used correctly, it can lead to discrimination. “What scares me is when a group of engineers get together and create AI algorithms without taking the human factor into account,” says IN2 partner Félix Villar. At the end of the day, boiled down to its definition, this technology is “a super advanced probability and statistics tool that learns from the experiences I can offer it.”
Right now, some human capital managers are already using AI tools, for example to identify unwanted turnover or to evaluate employee performance. When this technology reaches its full potential, artificial intelligence will become, in Sáenz’s words, “the boss’s co-pilot in human management.”
But there are many more uses. In a context like the current one, where talent is scarce and also highly changeable, artificial intelligence can be applied to large-scale processes. If it were used in a public body like SEPE, effective training programs could be created for the unemployed, and cross-referencing data could connect them with companies looking for their profiles. “Our project is trying to open up another way of thinking, getting people in social services out of their comfort zones and getting them to think in a different way,” says CVA CEO Marisa Cruzado.
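As a hypothetical sketch of that cross-referencing idea, and not anything SEPE or the project actually uses, the snippet below matches invented unemployed profiles to vacancies by simple skill overlap.

```python
# Hypothetical sketch: match candidate profiles to vacancies by skill
# overlap (Jaccard similarity). Profiles, vacancies and skills are invented.
def jaccard(a, b):
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

profiles = {
    "candidate_1": {"python", "sql", "reporting"},
    "candidate_2": {"welding", "forklift", "logistics"},
}
vacancies = {
    "data_analyst": {"python", "sql", "dashboards"},
    "warehouse_operator": {"forklift", "logistics"},
}

for cand, skills in profiles.items():
    best = max(vacancies, key=lambda v: jaccard(skills, vacancies[v]))
    print(cand, "->", best, round(jaccard(skills, vacancies[best]), 2))
```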