Understanding algorithmic fairness for clinical prediction in terms of subgroup net benefit and health equity
Author: Benitez-Aurioles, Jose; Joules, Alice; Brusini, Irene; Peek, Niels; Sperrin, Matthew
Year of publication: 2024
Document type: Working Paper
Description: There are concerns about the fairness of clinical prediction models. 'Fair' models are defined as those whose performance or predictions are not inappropriately influenced by protected attributes such as ethnicity, gender, or socio-economic status. Researchers have raised concerns that current algorithmic fairness paradigms enforce strict egalitarianism in healthcare, levelling down the performance of models in higher-performing subgroups instead of improving it in lower-performing ones. We propose assessing the fairness of a prediction model by expanding the concept of net benefit, using it to quantify and compare the clinical impact of a model in different subgroups. We use this to explore how a model distributes benefit across a population, its impact on health inequalities, and its role in the achievement of health equity. We show how resource constraints might introduce necessary trade-offs between health equity and other objectives of healthcare systems. We showcase our proposed approach with the development of two clinical prediction models: 1) a prognostic type 2 diabetes model used by clinicians to enrol patients into a preventive care lifestyle intervention programme, and 2) a lung cancer screening algorithm used to allocate diagnostic scans across the population. This approach helps modellers better understand whether a model upholds health equity by considering its performance in a clinical and social context. Comment: Main text is 19 pages, with 5 figures. Supplementary is 20 pages. Submitted to Epidemiology.
Database: arXiv
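Since the proposal rests on comparing net benefit across subgroups, the sketch below illustrates the general idea using the standard decision-curve definition of net benefit, NB = TP/N − (FP/N)·p_t/(1 − p_t), evaluated separately within each protected-attribute subgroup at a shared threshold probability p_t. The function names, column names, and simulated data are illustrative assumptions, not the authors' code or data.

```python
import numpy as np
import pandas as pd


def net_benefit(y_true, y_prob, threshold):
    """Decision-curve net benefit at one threshold probability:
    NB = TP/N - (FP/N) * threshold / (1 - threshold)."""
    y_true = np.asarray(y_true)
    treat = np.asarray(y_prob) >= threshold  # patients flagged for intervention
    n = len(y_true)
    tp = np.sum(treat & (y_true == 1))
    fp = np.sum(treat & (y_true == 0))
    return tp / n - fp / n * threshold / (1 - threshold)


def subgroup_net_benefit(df, group_col, outcome_col, prob_col, threshold):
    """Net benefit computed separately within each subgroup of a protected attribute."""
    return (
        df.groupby(group_col)
          .apply(lambda g: net_benefit(g[outcome_col], g[prob_col], threshold))
          .rename("net_benefit")
    )


# Illustrative usage with simulated data (hypothetical columns, not from the paper).
rng = np.random.default_rng(0)
n = 10_000
df = pd.DataFrame({
    "ethnicity": rng.choice(["A", "B"], size=n),
    "risk_score": rng.uniform(0, 1, size=n),
})
df["outcome"] = rng.binomial(1, df["risk_score"] * 0.3)

print(subgroup_net_benefit(df, "ethnicity", "outcome", "risk_score", threshold=0.10))
```

Comparing these per-subgroup values at a clinically chosen threshold is one way to see how a model's benefit is distributed across a population, rather than enforcing equality of a single performance metric.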