
Wednesday, October 28, 2020

MIT: Hiring algorithm design could impact candidate diversity, quality | Talent - HR Dive

An algorithm that takes into account details such as a unique college major may improve diverse candidate representation, researchers said, writes Ryan Golden, associate editor for HR Dive.

Dive Brief:

  • Employers may be able to improve the diversity and quality of job candidates by using hiring algorithms that value exploration, or that account for candidates' unique backgrounds and work histories, according to an August working paper published by the National Bureau of Economic Research.
  • Researchers built three resume screening algorithms for first-round interviews for high-paying jobs in industries often criticized for a lack of diversity, according to an emailed statement from the MIT Sloan School of Management. The third algorithm implemented an "upper confidence bound," or UCB, model that included "exploration bonuses," which account for details such as an unusual college major, a less common geography or a unique work history. These bonuses tend to be higher for groups of candidates who are underrepresented, the statement said (a simplified sketch of this scoring idea follows the list).
  • Using the third algorithm, researchers more than doubled the share of candidates who were Black or Hispanic, whereas the first two algorithms, which used a typical supervised learning approach, actually decreased Black and Hispanic representation. But while all of the algorithms increased the share of applicants who were women compared to human recruiting, the UCB model selected fewer female candidates than the two supervised learning models.
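
The bullets above describe the UCB model only at a high level, and the working paper's actual implementation is not reproduced here. The following minimal Python sketch illustrates the general idea of an exploration bonus, assuming a standard UCB-style bonus of c * sqrt(ln(N) / n) for a candidate profile seen n times in a pool of N applicants; the profiles, quality estimates and constant c are invented for illustration.

```python
import math
from collections import Counter

def ucb_scores(candidate_profiles, predicted_quality, c=1.0):
    """Score each profile as predicted quality plus an exploration bonus.

    candidate_profiles: list of hashable feature profiles, e.g. (major, city).
    predicted_quality: estimated hiring potential per profile (0-1 scale here).
    The bonus shrinks as a profile appears more often in the pool, so
    candidates with rarer backgrounds receive a larger boost.
    """
    counts = Counter(candidate_profiles)
    pool_size = len(candidate_profiles)
    return {
        profile: predicted_quality[profile]
        + c * math.sqrt(math.log(pool_size) / n)
        for profile, n in counts.items()
    }

# Hypothetical pool: a common profile appears 90 times, a rare one 10 times.
pool = [("finance", "New York")] * 90 + [("art history", "Boise")] * 10
quality = {("finance", "New York"): 0.70, ("art history", "Boise"): 0.65}

for profile, score in ucb_scores(pool, quality).items():
    print(profile, round(score, 3))
# The rare profile's larger bonus can outweigh its lower quality estimate,
# which is the intuition behind favoring "exploration" in screening.
```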

Dive Insight:

    Employers are increasingly automating recruiting processes during the COVID-19 pandemic, according to sources who previously spoke to HR Dive. Virtual skill assessments, screening tools, on-demand interviews and simulations, in particular, have grown in adoption during this time.

    Yet it is algorithms, which many employers adopted prior to the pandemic, that have been a particular source of controversy in the HR space in recent years. A notable incident occurred in 2018, when Amazon scrapped an artificial intelligence-based hiring tool that assigned scores to job candidates after company officials determined it was biased against female candidates, Reuters reported.

    Some research has shown that such tools may be opaque about the ways in which they evaluate candidates. A 2019 Cornell University analysis of pre-employment algorithms found that vendors did not disclose how they defined terms such as "fairness" and "bias," despite some claiming that their algorithms were "fair."

    The design of a hiring algorithm can impact diversity outcomes, according to Danielle Li, an MIT professor and co-author of the working paper. "In our study, the supervised learning approach – which is commonly used by commercial vendors of machine learning based hiring tools – would improve hiring rates, but at the cost of virtually eliminating Black and Hispanic representation," Li said in the statement. "This underscores the importance of algorithmic design in labor market outcomes."

    As Li noted, however, the UCB model algorithm also selected fewer women than the supervised learning models. "Although there are fewer women in our data set, increases in female representation under the UCB model were blunted because men tend to be more heterogeneous on other dimensions like geography, education, and race, leading them to receive higher exploration bonuses on average," Li said.
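
Li's point about heterogeneity can be made concrete with a deliberately simplified, hypothetical calculation (the group sizes and feature-cell counts below are invented and do not come from the paper): if one group's applicants are spread thinly across many distinct combinations of geography, education and race, each combination is seen fewer times, so a UCB-style bonus of the form c * sqrt(ln(N) / n) is larger on average for that group.

```python
import math

def average_bonus(cell_counts, pool_size, c=1.0):
    """Average exploration bonus across a group's applicants, assuming each
    applicant in a feature cell observed n times gets c * sqrt(ln(N) / n).
    All counts here are hypothetical, for illustration only."""
    applicants = sum(cell_counts)
    total_bonus = sum(n * c * math.sqrt(math.log(pool_size) / n) for n in cell_counts)
    return total_bonus / applicants

pool_size = 1000         # hypothetical applicant pool
men_cells = [20] * 35    # 700 men spread across 35 feature cells of 20 each
women_cells = [60] * 5   # 300 women concentrated in 5 feature cells of 60 each

print("average bonus, men:  ", round(average_bonus(men_cells, pool_size), 3))
print("average bonus, women:", round(average_bonus(women_cells, pool_size), 3))
# The thinner spread gives men smaller per-cell counts and therefore larger
# average bonuses, mirroring Li's explanation of the blunted gains for women.
```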

    There are other considerations for employers to note as they weigh AI-based solutions. For example, a 2019 survey of adults by outsourcing company Yoh found that 42% said AI should not have a role in selecting the candidate who is hired for a position, and 22% objected to AI's use in screening resumes. Sources have also previously advised that, even if AI doesn't make the final call on which candidates are hired, it may still reject candidates for reasons that could be considered discriminatory.

    Read more...

    Source: HR Dive