The use of artificial intelligence in recruitment is quickly gaining popularity as more companies adopt the technology. However, most companies rely on third-party service providers such as LinkedIn and Monster to meet their hiring needs; these platforms collect and screen the resumes of millions of potential employees.
These platforms also use artificial intelligence to tailor their services to the organizations they serve. Most of their matching engines are optimized to generate job applications. The engines' recommendations are based on three types of data: attributes assigned to users based on other people with similar skills, information users provide directly to the platform, and behavioral data, such as how often users reply to messages or interact with job postings.
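To make the three data sources concrete, here is a minimal, hypothetical sketch of how a matching engine might blend them into a single score. All names, fields, and weights are illustrative assumptions, not any platform's actual implementation.

```python
# Hypothetical sketch: combine declared skills, inferred skills, and
# behavioral signals into one match score. Weights are made up.
from dataclasses import dataclass

@dataclass
class CandidateSignals:
    profile_skills: set      # information the user supplied directly
    inferred_skills: set     # attributes assigned from similar users
    reply_rate: float        # behavioral: fraction of messages answered
    job_click_rate: float    # behavioral: interaction with job postings

def match_score(candidate: CandidateSignals, job_skills: set) -> float:
    """Weighted blend of declared fit, inferred fit, and engagement."""
    declared = len(candidate.profile_skills & job_skills) / max(len(job_skills), 1)
    inferred = len(candidate.inferred_skills & job_skills) / max(len(job_skills), 1)
    engagement = 0.5 * candidate.reply_rate + 0.5 * candidate.job_click_rate
    return 0.5 * declared + 0.2 * inferred + 0.3 * engagement

job = {"python", "sql", "etl"}
alice = CandidateSignals({"python", "sql"}, {"etl"},
                         reply_rate=0.9, job_click_rate=0.6)
print(round(match_score(alice, job), 3))  # → 0.625
```

Note how the behavioral term rewards engagement independently of skills, which is one way such engines can drift toward recommending based on how groups behave rather than what they know.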
LinkedIn excluded a person's name, age, gender, and ethnicity from its algorithm, since these attributes could introduce bias into the automated system. But John Jersin, LinkedIn's former vice president of product management, and his team found that even so, the service's algorithm can still detect the behavior patterns of groups with particular gender identities.
Jersin said, “For example, you might recommend more senior positions to one group of people than another group, even if they have the same level of qualifications.” He added that those people would not be exposed to the same opportunities, which is a matter of concern.
As a solution, Jersin and his team developed a second artificial intelligence that generates more representative results. It is an entirely separate algorithm designed to counter recommendations biased toward certain groups. The new AI ensures that the recommendation set reflects a representative distribution of users across genders before referencing the matches curated by the original engine.
Irina Novoselsky, CEO of CareerBuilder, said she intends to use the data gathered by the service to train employees to eliminate bias from recruitment materials.