A Systematic Approach to Group Fairness in Automated Decision Making

Authors: Christoph Heitz, Corinna Hertweck
Contributors: University of Zurich
Year of publication: 2021
Source: 2021 8th Swiss Conference on Data Science (SDS).
DOI: 10.1109/sds51136.2021.00008
Description: While the field of algorithmic fairness has brought forth many ways to measure and improve the fairness of machine learning models, these findings are still not widely used in practice. We suspect that one reason for this is that the field of algorithmic fairness has produced a large number of fairness definitions, which are difficult to navigate. The goal of this paper is to provide data scientists with an accessible introduction to group fairness metrics and to give some insight into the philosophical reasoning for caring about these metrics. We do this by considering the sense in which socio-demographic groups are compared when making a statement about fairness.
Comment: Accepted full paper at SDS2021, the 8th Swiss Conference on Data Science
Database: OpenAIRE
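
As a rough illustration of the kind of group fairness metric the abstract refers to, the sketch below computes the statistical parity difference, i.e. the gap in positive-decision rates between two socio-demographic groups; a value of zero corresponds to demographic parity. The function name, the toy data, and the choice of this particular metric are illustrative assumptions and are not taken from the paper itself.

```python
import numpy as np

def statistical_parity_difference(y_pred, group):
    """Difference in positive-decision rates between two groups.

    y_pred: array of 0/1 model decisions
    group:  array of 0/1 protected-group membership
    A value of 0 means both groups receive positive decisions
    at the same rate (demographic parity / independence).
    """
    y_pred = np.asarray(y_pred)
    group = np.asarray(group)
    rate_g1 = y_pred[group == 1].mean()  # positive-decision rate in group 1
    rate_g0 = y_pred[group == 0].mean()  # positive-decision rate in group 0
    return rate_g1 - rate_g0


# Hypothetical example: decisions of a binary classifier for two groups.
decisions = [1, 0, 1, 1, 0, 0, 1, 0]
groups    = [1, 1, 1, 1, 0, 0, 0, 0]
print(statistical_parity_difference(decisions, groups))  # 0.75 - 0.25 = 0.5
```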