Author:
Al-funjan, Amera W.; Al Abboodi, Hanaa M.; Hamza, Najlaa Abd; Abedi, Wafaa M. Salih; Abdullah, Alaa H.
Subject:
Source:
International Journal of Intelligent Engineering & Systems; 2024, Vol. 17 Issue 4, p243-262, 20p
Abstract:
Automated diagnosis of ocular diseases holds great promise for improving patient care and outcomes in ophthalmology. In this paper, we present a novel deep-learning approach for automatic disease classification from retinal images, using a Squeeze-and-Excitation Network (SENet) with a MobileNet backbone. With a lightweight transfer-learning model that has a dramatically reduced parameter count, we aim to achieve high accuracy and robust performance on binary classification tasks such as normal vs. cataract and normal vs. other disease classes. We develop and evaluate the model on a large dataset of retinal images from the Ocular Disease Intelligent Recognition (ODIR) database. Through extensive testing and validation, we achieve accuracy values exceeding 99.9% on both the training and validation sets for all classification tasks. The models show consistent performance metrics and small loss values, underscoring their effectiveness and reliability in automated disease detection. We also discuss the models' clinical applicability and their potential to assist medical professionals with early diagnosis, treatment planning, and patient management. This work represents a notable advance in AI-based ocular disease diagnosis, providing a dependable and practically applicable framework for automated disease classification from retinal images. [ABSTRACT FROM AUTHOR]
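For illustration only, the following is a minimal sketch of the kind of model the abstract describes: a Squeeze-and-Excitation block applied to features from a frozen MobileNet backbone, followed by a sigmoid head for binary classification (e.g., normal vs. cataract). It assumes a TensorFlow/Keras implementation; the input size, reduction ratio, and training settings are illustrative assumptions, not details reported in the paper.

# Hedged sketch, not the authors' code: SE block on a frozen MobileNet backbone
# for binary retinal-image classification. Hyperparameters are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

def se_block(x, reduction=16):
    # Squeeze-and-Excitation: global pooling, bottleneck MLP, channel reweighting.
    channels = x.shape[-1]
    s = layers.GlobalAveragePooling2D()(x)                      # squeeze
    s = layers.Dense(channels // reduction, activation="relu")(s)
    s = layers.Dense(channels, activation="sigmoid")(s)         # excitation
    s = layers.Reshape((1, 1, channels))(s)
    return layers.Multiply()([x, s])                            # recalibrate channels

def build_model(input_shape=(224, 224, 3)):
    # Pretrained MobileNet backbone, frozen for lightweight transfer learning.
    backbone = tf.keras.applications.MobileNet(
        include_top=False, weights="imagenet", input_shape=input_shape)
    backbone.trainable = False

    inputs = layers.Input(shape=input_shape)
    x = tf.keras.applications.mobilenet.preprocess_input(inputs)
    x = backbone(x, training=False)
    x = se_block(x)                                             # SE recalibration of backbone features
    x = layers.GlobalAveragePooling2D()(x)
    outputs = layers.Dense(1, activation="sigmoid")(x)          # binary output, e.g. normal vs. cataract
    return models.Model(inputs, outputs)

model = build_model()
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])

Freezing the backbone keeps the number of trainable parameters small, which matches the lightweight transfer-learning setup the abstract describes.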
Database:
Complementary Index
External link: