Detection of microscopic glaucoma through fundus images using deep transfer learning approach.
Author: | Akbar S; Riphah College of Computing, Riphah International University, Faisalabad Campus, Faisalabad, Pakistan., Hassan SA; Riphah College of Computing, Riphah International University, Faisalabad Campus, Faisalabad, Pakistan., Shoukat A; Riphah College of Computing, Riphah International University, Faisalabad Campus, Faisalabad, Pakistan., Alyami J; Department of Diagnostic Radiology, Faculty of Applied Medical Sciences, King Abdulaziz University, Jeddah, 21589, Saudi Arabia.; Imaging Unit, King Fahd Medical Research Center, King Abdulaziz University, Jeddah, 21589, Saudi Arabia., Bahaj SA; MIS Department, College of Business Administration, Prince Sattam Bin Abdulaziz University, Alkharj, 11942, Saudi Arabia. |
---|---|
Language: | English |
Source: | Microscopy research and technique [Microsc Res Tech] 2022 Jun; Vol. 85 (6), pp. 2259-2276. Date of Electronic Publication: 2022 Feb 15. |
DOI: | 10.1002/jemt.24083 |
Abstract: | Glaucoma can lead to blindness if it progresses to the point where it damages the optic nerve head. It is difficult to detect early because it produces no symptoms, but it can be diagnosed with tonometry, ophthalmoscopy, and perimetry. Advances in artificial intelligence have made it possible for machine learning techniques to diagnose the disease at an early stage. Numerous machine learning methods have been proposed for glaucoma diagnosis using different data sets and techniques, but they tend to be complex. Although several medical imaging instruments are used for glaucoma screening, fundus imaging is the most widely used technique for glaucoma detection. This study presents a novel combination of DenseNet and DarkNet to classify normal and glaucoma-affected fundus images. The frameworks were trained and tested on three data sets: high-resolution fundus (HRF), RIM-1, and ACRIMA, with a total of 658 healthy-eye images and 612 glaucoma-affected images used for classification. The fusion of DenseNet and DarkNet outperforms either CNN alone, achieving 99.7% accuracy, 98.9% sensitivity, and 100% specificity on the HRF database; 89.3% accuracy, 93.3% sensitivity, and 88.46% specificity on the RIM-1 database; and 99% accuracy, 100% sensitivity, and 99% specificity on the ACRIMA database. The proposed method is therefore robust and efficient, with lower computational time and complexity than methods reported in the literature. (© 2022 Wiley Periodicals LLC.) |
Database: | MEDLINE |
External link: |
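
The abstract above describes a feature-level fusion of two pretrained CNN backbones (DenseNet and DarkNet) for binary normal-vs-glaucoma classification of fundus images. The sketch below is only an illustration of that general idea, not the authors' implementation: torchvision does not ship a DarkNet model, so ResNet-50 stands in for the DarkNet branch, and the input size, fusion strategy (concatenation of pooled features), and classifier head are all assumptions.

```python
# Illustrative sketch: fusing features from two pretrained backbones for
# normal-vs-glaucoma fundus classification. ResNet-50 is a stand-in for the
# paper's DarkNet branch (assumption); all settings here are hypothetical.
import torch
import torch.nn as nn
from torchvision import models


class FusionClassifier(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Branch A: DenseNet-201 convolutional features (1920 output channels).
        densenet = models.densenet201(weights=models.DenseNet201_Weights.DEFAULT)
        self.branch_a = densenet.features
        # Branch B: ResNet-50 up to its last conv stage (2048 output channels),
        # used here only as a placeholder for a DarkNet-style backbone.
        resnet = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
        self.branch_b = nn.Sequential(*list(resnet.children())[:-2])
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Concatenate globally pooled features from both branches, then classify.
        self.classifier = nn.Linear(1920 + 2048, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        fa = self.pool(self.branch_a(x)).flatten(1)   # (N, 1920)
        fb = self.pool(self.branch_b(x)).flatten(1)   # (N, 2048)
        fused = torch.cat([fa, fb], dim=1)            # feature-level fusion
        return self.classifier(fused)


if __name__ == "__main__":
    model = FusionClassifier()
    dummy = torch.randn(2, 3, 224, 224)               # two RGB fundus-sized crops
    logits = model(dummy)
    print(logits.shape)                               # torch.Size([2, 2])
```

In a transfer-learning setup such a model would typically be fine-tuned on labeled fundus images (e.g., the HRF, RIM-1, and ACRIMA sets named in the abstract) with a standard cross-entropy loss; those training details are not specified in this record.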