A deep learning model for burn depth classification using ultrasound imaging
Author: | Suvranu De, Conner Parsey, Sangrock Lee, Basiel Makled, Rahul, Jack Norfleet, James K. Lukan, Kateryna Zelenova, Tatiana Boyko |
Year of publication: | 2022 |
Subject: |
FOS: Computer and information sciences; Swine; Computer science; Computer Vision and Pattern Recognition (cs.CV); Computer Science - Computer Vision and Pattern Recognition; Biomedical Engineering; Convolutional neural network; Biomaterials; Deep Learning; Discriminative model; Classifier (linguistics); FOS: Electrical engineering, electronic engineering, information engineering; Animals; Skin; Ultrasonography; Receiver operating characteristic; Burn depth; business.industry; Deep learning; Image and Video Processing (eess.IV); Ultrasound; Pattern recognition; Electrical Engineering and Systems Science - Image and Video Processing; Mechanics of Materials; Ultrasound imaging; Neural Networks, Computer; Artificial intelligence; Burns; business |
Source: | Journal of the Mechanical Behavior of Biomedical Materials. 125:104930 |
ISSN: | 1751-6161 |
Description: | Identifying burn depth with sufficient accuracy is a challenging problem. This paper presents a deep convolutional neural network that classifies burn depth from the altered tissue morphology of burned skin, which manifests as texture patterns in ultrasound images. The network first learns a low-dimensional manifold of unburned skin images using an encoder-decoder architecture that reconstructs them from ultrasound images of burned skin. The encoder is then re-trained to classify burn depths. The encoder-decoder network is trained on a dataset comprising B-mode ultrasound images of unburned and burned ex vivo porcine skin samples. The classifier is developed using B-mode images of burned in situ skin samples obtained from freshly euthanized postmortem pigs. Performance metrics obtained from 20-fold cross-validation show that the model identifies deep partial-thickness burns, which are the most difficult to diagnose clinically, with 99% accuracy, 98% sensitivity, and 100% specificity. The diagnostic accuracy of the classifier is further illustrated by high area-under-the-curve values of 0.99 and 0.95 for the receiver operating characteristic and precision-recall curves, respectively. A post hoc explanation indicates that the classifier activates discriminative textural features in the B-mode images when classifying burns. The proposed model has potential clinical utility in assisting the assessment of burn depth using a widely available clinical imaging device. An illustrative code sketch of the two-stage training scheme described here follows the record below. |
Database: | OpenAIRE |
External link: |
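
The description above outlines a two-stage scheme: an encoder-decoder is first trained to reconstruct B-mode ultrasound images of unburned skin (learning a low-dimensional manifold), and the encoder is then re-trained with a classification head to predict burn depth. The PyTorch sketch below only illustrates that general scheme; the layer sizes, latent dimension, 128x128 input size, number of burn-depth classes, loss functions, and data loaders are all assumptions for illustration, not the authors' published architecture or training procedure.

```python
# Illustrative two-stage training: (1) autoencoder pretraining on unburned-skin
# ultrasound images, (2) encoder re-trained with a classification head on
# burned-skin images. All hyperparameters are assumed values.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, latent_dim=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),   # 1-channel B-mode input
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, latent_dim),                              # low-dimensional representation
        )

    def forward(self, x):
        return self.features(x)

class Decoder(nn.Module):
    def __init__(self, latent_dim=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim, 64 * 16 * 16), nn.ReLU(),
            nn.Unflatten(1, (64, 16, 16)),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1), nn.Sigmoid(),  # back to 128x128
        )

    def forward(self, z):
        return self.net(z)

def pretrain_autoencoder(encoder, decoder, unburned_loader, epochs=10):
    """Stage 1: learn the unburned-skin manifold via image reconstruction (assumed MSE loss)."""
    opt = torch.optim.Adam(list(encoder.parameters()) + list(decoder.parameters()), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        for x in unburned_loader:                  # x: (B, 1, 128, 128) ultrasound images
            recon = decoder(encoder(x))
            loss = loss_fn(recon, x)
            opt.zero_grad(); loss.backward(); opt.step()

def finetune_classifier(encoder, burned_loader, num_classes=4, epochs=10):
    """Stage 2: re-train the pretrained encoder with a classification head for burn depth."""
    head = nn.Linear(64, num_classes)              # latent_dim -> assumed number of burn-depth classes
    opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=1e-4)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in burned_loader:                 # y: burn-depth labels
            logits = head(encoder(x))
            loss = loss_fn(logits, y)
            opt.zero_grad(); loss.backward(); opt.step()
    return head
```

In this sketch the encoder weights learned in stage 1 are carried over and updated at a lower learning rate during stage 2; the paper's actual retraining procedure, 20-fold cross-validation, and evaluation metrics (accuracy, sensitivity, specificity, ROC and precision-recall AUC) are not reproduced here.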