Author: |
Naohiro Motozawa, Takuya Miura, Koji Ochiai, Midori Yamamoto, Takaaki Horinouchi, Taku Tsuzuki, Genki N. Kanda, Yosuke Ozawa, Akitaka Tsujikawa, Koichi Takahashi, Masayo Takahashi, Yasuo Kurimoto, Tadao Maeda, Michiko Mandai |
Language: |
English |
Year of publication: |
2022 |
Subject: |
|
Source: |
Scientific Reports, Vol 12, Iss 1, Pp 1-11 (2022) |
Document type: |
article |
ISSN: |
2045-2322 |
DOI: |
10.1038/s41598-022-05006-3 |
Description: |
Abstract The retinal pigment epithelium (RPE) is essential for the survival and function of retinal photoreceptor cells. RPE dysfunction causes various retinal diseases, including age-related macular degeneration (AMD). Clinical studies on ES/iPS cell-derived RPE transplantation for diseases triggered by RPE dysfunction are currently underway. Quantifying the diseased RPE area is important for evaluating disease progression and the therapeutic effect of RPE transplantation, yet no standard protocols exist. To address this issue, we developed two-step software that enables objective and efficient quantification of changes in the RPE-disease area by analyzing the early-phase hyperfluorescent area in fluorescein angiography (FA) images. We extracted the abnormal region using deep learning-based discrimination and scored the binarized extracted area with an automated program. Our program's performance for the same eye across serial image captures was within 3.1 ± 7.8% error. In progressive AMD, its trend was consistent with human assessment, even when FA images from two different visits were compared. This method was applicable to quantifying RPE-disease area changes over time, to evaluating iPSC-RPE transplantation images, and to a disease other than AMD. Our program may contribute to assessing the clinical course of RPE-disease areas in routine clinics and reduce the workload of researchers. |
Database: |
Directory of Open Access Journals |
External link: |
|