Validating racial and ethnic non-bias of artificial intelligence decision support for diagnostic breast ultrasound evaluation.
Author: Koo C, Yang A, Welch C, Jadav V, Posch L, Thoreson N, Morris D, Chouhdry F, Szabo J, Mendelson D, Margolies LR; Icahn School of Medicine at Mount Sinai, Mount Sinai Hospital, Diagnostic Molecular and Interventional Radiology, New York, New York, United States.
Language: English
Source: Journal of Medical Imaging (Bellingham, Wash.) [J Med Imaging (Bellingham)] 2023 Nov; Vol. 10 (6), pp. 061108. Date of Electronic Publication: 2023 Dec 12.
DOI: 10.1117/1.JMI.10.6.061108
Abstract: Purpose: Breast ultrasound suffers from low positive predictive value and specificity. Artificial intelligence (AI) promises to improve accuracy, reduce false negatives, reduce inter- and intra-observer variability, and decrease the rate of benign biopsies. Perpetuating racial/ethnic disparities in healthcare and patient outcomes is a potential risk of incorporating AI-based models into clinical practice; it is therefore necessary to validate that such models are unbiased before clinical use. Approach: Our retrospective review assessed whether our AI decision support (DS) system demonstrates racial/ethnic bias by evaluating its performance on 1810 biopsy-proven cases from nine breast imaging facilities within our health system from January 1, 2018 to October 28, 2021. Patient age, gender, race/ethnicity, AI DS output, and pathology results were obtained. Results: Significant differences in breast pathology incidence were seen across racial and ethnic groups. Stratified analysis showed that differences in AI DS output were attributable to underlying differences in pathology incidence in our cohort and did not reflect statistically significant bias in output among racial/ethnic groups, suggesting similar effectiveness of the AI DS system across races (p > 0.05 for all). Conclusions: Our study suggests that an AI DS system may serve as a valuable second opinion in the detection of breast cancer on diagnostic ultrasound without significant racial or ethnic bias. AI tools are not meant to replace the radiologist, but rather to aid in screening and diagnosis without perpetuating racial/ethnic disparities. (© 2023 Society of Photo-Optical Instrumentation Engineers (SPIE).)
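The stratified analysis described in the abstract can be illustrated in outline. The sketch below is not the authors' code: it assumes a hypothetical table of biopsy-proven cases with columns race_ethnicity, ai_output, and pathology, and runs a chi-square test of AI DS output against race/ethnicity within each pathology stratum, so that differences driven by pathology incidence are separated from differences attributable to the model.

```python
# Illustrative sketch only (assumed column names, not the study's actual pipeline):
# within each pathology stratum, test whether the AI DS output distribution
# differs across race/ethnicity groups using a chi-square test of independence.
import pandas as pd
from scipy.stats import chi2_contingency


def stratified_bias_check(df: pd.DataFrame, alpha: float = 0.05) -> pd.DataFrame:
    """Return per-stratum chi-square statistics and p-values for the association
    between race/ethnicity and AI DS output."""
    rows = []
    for pathology, stratum in df.groupby("pathology"):
        # Contingency table: race/ethnicity group x AI DS output category
        table = pd.crosstab(stratum["race_ethnicity"], stratum["ai_output"])
        chi2, p, dof, _ = chi2_contingency(table)
        rows.append({"pathology": pathology, "chi2": chi2, "dof": dof,
                     "p_value": p, "significant": p < alpha})
    return pd.DataFrame(rows)


# Example usage with a hypothetical cohort file:
# cohort = pd.read_csv("biopsy_proven_cases.csv")
# print(stratified_bias_check(cohort))
```

Under this setup, a non-significant result in every stratum (p > 0.05 for all) is consistent with the study's conclusion that output differences track pathology incidence rather than race/ethnicity.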
Database: MEDLINE