DermX: An end-to-end framework for explainable automated dermatological diagnosis
Author: Raluca Jalaboi, Frederik Faye, Mauricio Orbes-Arteaga, Dan Jørgensen, Ole Winther, Alfiia Galimzianova
Year of publication: 2021
Subject: FOS: Computer and information sciences; Computer Science - Machine Learning; 92B20; 92C50; 68T45; Radiological and Ultrasound Technology; Computer Vision and Pattern Recognition (cs.CV); Image and Video Processing (eess.IV); Computer Science - Computer Vision and Pattern Recognition; Health Informatics; Dermatology; Electrical Engineering and Systems Science - Image and Video Processing; Explainability; Computer Graphics and Computer-Aided Design; Machine Learning (cs.LG); FOS: Electrical engineering, electronic engineering, information engineering; Radiology, Nuclear Medicine and Imaging; Convolutional neural networks; Computer Vision and Pattern Recognition; Dataset
Source: Jalaboi, R, Faye, F, Orbes-Arteaga, M, Jørgensen, D, Winther, O & Galimzianova, A 2023, 'DermX: An end-to-end framework for explainable automated dermatological diagnosis', Medical Image Analysis, vol. 83, 102647. https://doi.org/10.1016/j.media.2022.102647
ISSN: 1361-8423
DOI: 10.1016/j.media.2022.102647
Description: Dermatological diagnosis automation is essential in addressing the high prevalence of skin diseases and the critical shortage of dermatologists. Although convolutional neural networks (ConvNets) approach expert-level diagnostic performance, their adoption in clinical practice is impeded by their limited explainability and by subjective, expensive explainability validations. We introduce DermX and DermX+, an end-to-end framework for explainable automated dermatological diagnosis. DermX is a clinically inspired, explainable dermatological diagnosis ConvNet trained on DermXDB, a 554-image dataset annotated by eight dermatologists with diagnoses, supporting explanations, and explanation attention maps. DermX+ extends DermX with guided attention training for the explanation attention maps. Both methods achieve near-expert diagnosis performance, with DermX, DermX+, and dermatologist F1 scores of 0.79, 0.79, and 0.87, respectively. We assess explanation performance in terms of identification and localization by comparing model-selected explanations with dermatologist-selected explanations, and gradient-weighted class-activation maps with dermatologist explanation maps, respectively. DermX achieved an identification F1 score of 0.77, while DermX+ achieved 0.79. The localization F1 score is 0.39 for DermX and 0.35 for DermX+. These results show that explainability does not necessarily come at the expense of predictive power: our models provide expert-inspired explanations for their diagnoses without lowering their diagnosis performance.
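The two explanation metrics reported above can be made concrete. The sketch below shows one plausible way to compute, for a single image, an identification F1 score (agreement between model-selected and dermatologist-selected explanation labels) and a localization F1 score (pixel-wise agreement between a binarized gradient-weighted class-activation map and a dermatologist attention map). The 0.5 binarization threshold, the set-based label comparison, and the explanation label names are illustrative assumptions; the exact evaluation protocol is not specified in this record.

```python
import numpy as np

def f1_score(tp, fp, fn):
    """Plain F1 from true-positive, false-positive, and false-negative counts."""
    denom = 2 * tp + fp + fn
    return 2 * tp / denom if denom > 0 else 0.0

def identification_f1(model_labels, expert_labels):
    """F1 between the explanation labels selected by the model and by a
    dermatologist for one image (label names here are hypothetical)."""
    model_labels, expert_labels = set(model_labels), set(expert_labels)
    tp = len(model_labels & expert_labels)
    fp = len(model_labels - expert_labels)
    fn = len(expert_labels - model_labels)
    return f1_score(tp, fp, fn)

def localization_f1(cam, expert_map, threshold=0.5):
    """Pixel-wise F1 between a binarized Grad-CAM map and a binary
    dermatologist attention map. The 0.5 threshold is an assumption."""
    cam_bin = cam >= threshold
    expert_bin = expert_map > 0
    tp = np.sum(cam_bin & expert_bin)
    fp = np.sum(cam_bin & ~expert_bin)
    fn = np.sum(~cam_bin & expert_bin)
    return f1_score(tp, fp, fn)

# Toy usage with illustrative data only.
cam = np.random.rand(224, 224)                    # normalized Grad-CAM for one explanation
expert_map = np.zeros((224, 224))
expert_map[50:120, 60:140] = 1                    # dermatologist-marked region
print(identification_f1(["plaque", "scale"], ["plaque", "erythema"]))
print(localization_f1(cam, expert_map))
```

Dataset-level scores would then aggregate these per-image values, for example by averaging, although the aggregation actually used by the authors is not stated in this record.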
Database: OpenAIRE
External link: