Companies are now turning to AI to reduce human bias in recruitment

Photo Credit: Pixabay

To reduce bias in hiring, a number of companies are now relying on artificial intelligence to make recruitment decisions, a Bloomberg report stated.

San Francisco-based recruiting automation platform Entelo relies on machine learning to identify skills and match them with the appropriate candidates. The firm launched a tool called Unbiased Sourcing Mode to allow recruiters to source new talent without bias. It blocks information about a candidate such as name, school, photos, employment gaps and age indicators, and also replaces gender-specific pronouns, the report stated.
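
A blinding step like the one described can be sketched roughly as follows. This is an illustrative assumption, not Entelo's actual implementation; the field names, pronoun list and function names are hypothetical.

```python
import re

# Hypothetical blinding step (not Entelo's code): hide identifying fields
# and neutralize gendered pronouns before a recruiter sees the profile.
BLOCKED_FIELDS = {"name", "school", "photo_url", "age", "employment_gaps"}
# Naive mapping; it ignores the possessive/object ambiguity of "her".
PRONOUN_MAP = {"he": "they", "she": "they", "him": "them", "her": "them",
               "his": "their", "hers": "theirs"}

def blind_profile(profile: dict) -> dict:
    """Return a copy of the profile with identifying fields removed
    and gender-specific pronouns replaced in free-text fields."""
    blinded = {k: v for k, v in profile.items() if k not in BLOCKED_FIELDS}
    pattern = r"\b(" + "|".join(PRONOUN_MAP) + r")\b"
    for key, value in blinded.items():
        if isinstance(value, str):
            blinded[key] = re.sub(
                pattern,
                lambda m: PRONOUN_MAP[m.group(1).lower()],
                value,
                flags=re.IGNORECASE,
            )
    return blinded

profile = {"name": "Jane Doe", "school": "State University",
           "summary": "He led a team and shipped his projects on time."}
print(blind_profile(profile))
# → {'summary': 'they led a team and shipped their projects on time.'}
```

In practice such a tool would need far more than string substitution (photos, age inferred from graduation dates, and so on), but the sketch shows the basic idea of redaction before review.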

Another firm, Stella.ai, has trained its algorithm to choose candidates based only on the data it is provided, such as skills, industry or tiers of companies, the firm's founder Rich Joffe told Bloomberg.

While Stella.ai and Entelo deal with external hires, CorpU, a leadership development platform, focuses on internal candidates. The firm has tied up with the University of Michigan’s Ross School of Business to develop a course to select high-potential employees, the Bloomberg report said.

“Identifying high-potential candidates is very subjective. People pick who they like based on unconscious biases,” said Alan Todd, CEO of CorpU.

Using AI to make hiring decisions is being seen as a substitute for employee referrals, currently the most preferred method of recruitment for most companies. However, even machine learning algorithms are not free from bias, as they can be influenced by those who programmed them. Algorithms can fail to assess candidates if a programmer does not encode certain qualities and skills associated with them, the report said.

“If the examples you’re using to train the system fail to include certain types of people, then the model you develop might be really bad at assessing those people,” said Solon Barocas, an assistant professor studying fairness in machine learning at Cornell's Information Science department.

To reduce the effects of algorithmic bias, some companies employ people to monitor the machines, the Bloomberg report added.  

