
Abstract

Domains such as law, healthcare, and public policy often involve highly consequential decisions that are predominantly made by human decision-makers. The growing availability of data pertaining to such decisions offers an unprecedented opportunity to develop machine learning models that can help humans make better decisions. However, the applicability of machine learning to such scenarios is limited by several fundamental challenges:

a) The data is selectively labeled, i.e., we only observe the outcomes of the decisions made by human decision-makers, not the counterfactuals.

b) The data is prone to a variety of selection biases and confounding effects.

c) The successful adoption of the models that we develop depends on how well decision-makers can understand and trust their functionality; however, most existing machine learning models are optimized primarily for predictive accuracy and are not very interpretable.

In this dissertation, we develop novel computational frameworks that address the aforementioned challenges, thus paving the way for large-scale deployment of machine learning models and algorithms on problems of significant societal impact. We first discuss how to build interpretable predictive models, as well as explanations of complex black-box models, which can be readily understood and consequently trusted by human decision-makers. We then outline novel evaluation strategies that allow us to reliably compare the quality of human and algorithmic decision-making while accounting for challenges such as selective labels and confounding effects. Lastly, we present approaches that can diagnose and characterize biases (systematic errors) in human decisions and algorithmic predictions.
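To make the selective labels challenge concrete, the sketch below builds a toy dataset in the spirit of a bail-style setting: outcomes are observed only for the cases a human decision-maker approved, because the human can see a factor that the recorded data does not capture. The variable names, coefficients, and thresholds are illustrative assumptions for this sketch, not quantities from the dissertation.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000

    # Each case has a recorded feature x and an unobservable u that the
    # human decision-maker can see but the dataset does not record.
    x = rng.normal(size=n)
    u = rng.normal(size=n)

    # Human decision: approve (e.g., release) when the case looks low-risk;
    # the human uses both x and the unobservable u.
    decision = (0.7 * x + 0.7 * u + rng.normal(scale=0.5, size=n)) < 0.0

    # A "good" outcome exists for every case in principle...
    outcome = (0.5 * x + 0.5 * u + rng.normal(scale=1.0, size=n)) < 0.0

    # ...but it is observed only for approved cases: the selective labels problem.
    observed_outcome = np.where(decision, outcome, np.nan)

    print(f"fraction approved:     {decision.mean():.2f}")
    print(f"fraction with a label: {np.isfinite(observed_outcome).mean():.2f}")
    print(f"mean outcome, all cases:     {outcome.mean():.2f}")
    print(f"mean outcome, labeled cases: {outcome[decision].mean():.2f}")

Because labels exist only where the human chose to act, the labeled cases form a non-random (and here systematically easier) subset, so naively evaluating a predictive model on them misstates its quality; this is the gap that the evaluation strategies summarized above are designed to close.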

Details

Title
Human-Centric Machine Learning: Enabling Machine Learning for High-Stakes Decision-Making
Author
Lakkaraju, Himabindu
Publication year
2018
Publisher
ProQuest Dissertations & Theses
ISBN
9798662539532
Source type
Dissertation or Thesis
Language of publication
English
ProQuest document ID
2434567602
Copyright
Database copyright ProQuest LLC; ProQuest does not claim copyright in the individual underlying works.