Preventing Discrepancies Between Indicated Algorithmic Certainty and Actual Performance: An Experimental Solution

Authors: Esther Borowski, Konstantin Zähl, Ingrid Isenhardt, Johanna M. Werz
Year of publication: 2021
Source: HCI International 2021 - Posters. ISBN: 9783030786410
HCI (38)
DOI: 10.1007/978-3-030-78642-7_77
Description: Demands for transparency in algorithms and their processes increase as the use of algorithmic support in human decision-making grows. At the same time, algorithm aversion – abandoning algorithmic advice after seeing an algorithm err – persists [1]. The current paper proposes a way to investigate the effect of transparency, i.e., disclosing an algorithm's certainty about its future performance, on the use of algorithms even when they err. Such an experimental setting requires varying the indicated algorithmic certainty while keeping the algorithm's error rate constant. However, experiencing a discrepancy between the certainty information and the actual performance could distort participants' behavior. The paper therefore proposes an answer to the question: how can a study design prevent a discrepancy between the indicated success rate and the observable performance?
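The abstract poses the design problem without giving the solution, but one building block such a design plausibly needs is deterministic control over what participants observe: with genuinely stochastic errors, an algorithm that is correct with probability 0.8 shows exactly 16 of 20 correct trials only about 22% of the time, so the displayed certainty and the experienced performance would diverge in most sessions. The Python sketch below illustrates deterministic outcome scheduling under assumed details; the function build_trial_schedule, its parameters, and the trial counts are illustrative assumptions, not the authors' published design, and how the paper reconciles such scheduling with a constant error rate across conditions is not specified in the abstract.

```python
import random

def build_trial_schedule(n_trials: int, stated_certainty: float,
                         seed: int = 0) -> list[bool]:
    """Pre-schedule advice outcomes (True = correct advice) so that the
    accuracy every participant observes equals the certainty displayed.
    Illustrative sketch only, not the design from the paper."""
    n_correct = round(n_trials * stated_certainty)  # fix the error count
    outcomes = [True] * n_correct + [False] * (n_trials - n_correct)
    # Shuffle the *positions* of the errors, never their number, so the
    # sequence looks random while the observed rate stays exact.
    random.Random(seed).shuffle(outcomes)
    return outcomes

# An algorithm introduced as "80% certain" errs on exactly 4 of 20
# trials, so indicated and observable success rates coincide.
schedule = build_trial_schedule(n_trials=20, stated_certainty=0.80)
assert sum(schedule) / len(schedule) == 0.80
```

Fixing the error count while randomizing only the error positions keeps the observed performance identical to the indicated certainty for every participant, while the placement of errors still appears random within a session.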
Database: OpenAIRE