Addressing catastrophic forgetting for medical domain expansion

Author: Michael F. Chiang, Ashwin Vaswani, Mehak Aggarwal, John Campbell, Jimmy S. Chen, Katharina Hoebel, Praveer Singh, Jayashree Kalpathy-Cramer, Nishanth Thumbavanam Arun, Ken Chang, Vibha Agarwal, Liangqiong Qu, Christopher P. Bridge, Daniel L. Rubin, Sharut Gupta, R. V. Paul Chan, Charles Lu, Mishka Gidwani, Jay M. Patel, Shruti Raghavan
Year of publication: 2021
Subject:
Description: Model brittleness is a key concern when deploying deep learning models in real-world medical settings. A model that has high performance on one dataset may suffer a significant decline in performance when tested on a different dataset. While pooling datasets from multiple hospitals and re-training may provide a straightforward solution, it is often infeasible and may compromise patient privacy. An alternative approach is to fine-tune the model on subsequent datasets after training on the original dataset. Notably, this approach degrades model performance on the original dataset, a phenomenon known as catastrophic forgetting. In this paper, we develop an approach to address catastrophic forgetting based on elastic weight consolidation combined with modulation of batch normalization statistics under three scenarios: 1) expanding the domain from one imaging system’s data to another imaging system’s; 2) expanding the domain from a large multi-hospital dataset to another single-hospital dataset; 3) expanding the domain from a dataset from one geographic region to a dataset from another geographic region. Focusing on the clinical use cases of mammographic breast density detection and retinopathy of prematurity stage diagnosis, we show that our approach outperforms several other state-of-the-art approaches and provide theoretical justification for the efficacy of batch normalization modulation. The results of this study are generally applicable to the deployment of any clinical deep learning model which requires domain expansion.
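The abstract names two ingredients, elastic weight consolidation (EWC) and modulation of batch normalization statistics, without giving their exact formulation. A minimal sketch of the standard EWC penalty and a simple interpolation of batch-norm running statistics might look as follows; the function names, `lam`, and `alpha` are illustrative assumptions, not the paper's notation:

```python
def ewc_penalty(params, old_params, fisher_diag, lam=1.0):
    """Standard EWC regularizer: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2,
    where F_i is the diagonal Fisher information estimated on the original
    dataset and theta*_i are the parameters learned on it. Added to the
    fine-tuning loss, it anchors weights important to the original domain."""
    return 0.5 * lam * sum(
        f * (p - q) ** 2
        for p, q, f in zip(params, old_params, fisher_diag)
    )


def modulated_bn_stats(mu_old, var_old, mu_new, var_new, alpha=0.5):
    """One plausible form of batch-norm modulation: interpolate the running
    mean/variance between original and new domains; alpha=1 keeps the
    original statistics, alpha=0 uses only the new domain's."""
    mu = [alpha * a + (1 - alpha) * b for a, b in zip(mu_old, mu_new)]
    var = [alpha * a + (1 - alpha) * b for a, b in zip(var_old, var_new)]
    return mu, var
```

The penalty is zero when the fine-tuned parameters equal the original ones and grows quadratically with their deviation, weighted by how informative each parameter was for the original dataset.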
Database: OpenAIRE