A Systematic Approach to Group Fairness in Automated Decision Making
Author: | Christoph Heitz, Corinna Hertweck |
---|---|
Contributors: | University of Zurich |
Year of publication: | 2021 |
Subject: | Algorithmic fairness; Group fairness; Statistical parity; Independence; Separation; Sufficiency; FOS: Computer and information sciences; Computer Science - Machine Learning (cs.LG); Computer Science - Computers and Society (cs.CY); 10009 Department of Informatics; 000 Computer science, knowledge & systems; 006: Special computer methods; 170: Ethics; 1702 Artificial Intelligence; 1710 Information Systems; 1706 Computer Science Applications; 1801 Decision Sciences (miscellaneous); 1802 Information Systems and Management |
Source: | 2021 8th Swiss Conference on Data Science (SDS) |
DOI: | 10.1109/sds51136.2021.00008 |
Description: | While the field of algorithmic fairness has brought forth many ways to measure and improve the fairness of machine learning models, these findings are still not widely used in practice. We suspect that one reason for this is that the field has produced a large number of fairness definitions, which are difficult to navigate. The goal of this paper is to provide data scientists with an accessible introduction to group fairness metrics and to give some insight into the philosophical reasoning for caring about these metrics. We do this by considering in which sense socio-demographic groups are compared in order to make a statement about fairness. Comment: Accepted full paper at SDS2021, the 8th Swiss Conference on Data Science |
Database: | OpenAIRE |
External link: |
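
The subject list and the description refer to the standard group fairness criteria of independence (statistical parity), separation, and sufficiency. As a rough illustration of what comparing socio-demographic groups on such metrics can look like in practice, the sketch below computes empirical versions of these criteria for a binary classifier. This example is not taken from the paper; the data and all variable names (`group`, `y_true`, `y_pred`) are hypothetical.

```python
# Illustrative sketch (not from the paper): empirical checks of three group
# fairness criteria for a binary classifier and a binary protected attribute.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
group = rng.integers(0, 2, n)    # protected attribute A (0 or 1), hypothetical
y_true = rng.integers(0, 2, n)   # actual outcome Y, hypothetical
y_pred = rng.integers(0, 2, n)   # model decision Yhat, hypothetical

def rate(mask, values):
    """Mean of `values` restricted to `mask`; NaN if the subgroup is empty."""
    return values[mask].mean() if mask.any() else float("nan")

# Independence (statistical parity): compare P(Yhat=1 | A=a) across groups.
selection_rate = {a: rate(group == a, y_pred) for a in (0, 1)}

# Separation: compare P(Yhat=1 | Y=y, A=a) across groups for each true outcome y,
# i.e. true positive and false positive rates per group.
tpr = {a: rate((group == a) & (y_true == 1), y_pred) for a in (0, 1)}
fpr = {a: rate((group == a) & (y_true == 0), y_pred) for a in (0, 1)}

# Sufficiency: compare P(Y=1 | Yhat=1, A=a) across groups (per-group precision).
ppv = {a: rate((group == a) & (y_pred == 1), y_true) for a in (0, 1)}

print("selection rate per group:", selection_rate)
print("TPR per group:", tpr, "| FPR per group:", fpr)
print("PPV per group:", ppv)
```

In this sketch, a criterion is (approximately) satisfied when the corresponding per-group values are (approximately) equal: equal selection rates for independence, equal TPR and FPR for separation, and equal PPV for sufficiency.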