Investigations of Performance and Bias in Human-AI Teamwork in Hiring

Authors

  • Andi Peng, Massachusetts Institute of Technology
  • Besmira Nushi, Microsoft Research
  • Emre Kiciman, Microsoft Research
  • Kori Inkpen, Microsoft Research
  • Ece Kamar, Microsoft Research

DOI:

https://doi.org/10.1609/aaai.v36i11.21468

Keywords:

Humans And AI (HAI), AI For Social Impact (AISI Track Papers Only)

Abstract

In AI-assisted decision-making, effective hybrid (human-AI) teamwork depends not only on AI performance but also on how the AI affects human decision-making. While prior work studies the effects of model accuracy on humans, here we investigate the complex dynamics of how both a model's predictive performance and its bias may transfer to humans in a recommendation-aided decision task. We consider the domain of ML-assisted hiring, where humans, operating in a constrained selection setting, can choose whether to use a trained model's inferences to help select candidates from written biographies. We conduct a large-scale user study on a re-created dataset of real bios from prior work, in which participants predict the ground-truth occupation of given candidates with and without the help of three different NLP classifiers (random, bag-of-words, and deep neural network). Our results demonstrate that while high-performance models significantly improve human performance in a hybrid setting, some models mitigate hybrid bias while others accentuate it. We examine these findings through the lens of decision conformity and observe that model architecture choices affect human-AI conformity and bias, motivating the explicit need to assess these complex dynamics prior to deployment.
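
The abstract does not specify the authors' implementation, but a minimal sketch can illustrate the kind of bag-of-words occupation classifier it describes. The snippet below uses scikit-learn and a few hypothetical biography strings in place of the re-created Bios dataset; all variable names and example texts are illustrative assumptions, not the study's actual code or data.

```python
# A minimal sketch (not the authors' implementation): a bag-of-words occupation
# classifier of the kind described in the abstract, using scikit-learn and a few
# hypothetical biography snippets in place of the re-created Bios dataset.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

bios = [
    "She is a pediatric nurse with ten years of clinical experience.",
    "He works as a registered nurse in an intensive care unit.",
    "She practices corporate law and advises technology startups.",
    "He is an attorney specializing in intellectual property litigation.",
]
occupations = ["nurse", "nurse", "attorney", "attorney"]

# Bag-of-words features feeding a linear classifier.
model = make_pipeline(CountVectorizer(), LogisticRegression(max_iter=1000))
model.fit(bios, occupations)

# In the study design described above, predictions like these would be offered
# (or withheld) as recommendations while participants select candidates.
print(model.predict(["She has argued appellate cases for a decade."]))
```

In the paper's setup, such model predictions serve only as optional recommendations; the human participant makes the final selection, which is what allows the study to measure how model performance and bias transfer to the hybrid team.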

Published

2022-06-28

How to Cite

Peng, A., Nushi, B., Kiciman, E., Inkpen, K., & Kamar, E. (2022). Investigations of Performance and Bias in Human-AI Teamwork in Hiring. Proceedings of the AAAI Conference on Artificial Intelligence, 36(11), 12089-12097. https://doi.org/10.1609/aaai.v36i11.21468