Background and Objective: Visual impairment affects a significant part of the population worldwide. Glaucoma, a chronic eye disease leading to progressive vision loss, is one of its main causes. Early glaucoma screening is therefore important: it allows the progression of the pathology to be slowed and irreversible vision damage to be avoided. Since manual assessment by experts has drawbacks, exploiting the Cup-to-Disc Ratio (CDR) as a structural indicator of damage to the optic nerve head (ONH) is an efficient route to early glaucoma screening and diagnosis. Methods: In this paper, we propose a new fully automated methodology for glaucoma screening and diagnosis from retinal fundus images. To allow eye examination in remote locations with limited access to clinical facilities, we focus on a computationally efficient algorithm suited to later implementation on mobile devices. First, the method provides robust optic disc (OD) detection, combining a brightness criterion with a template matching technique to detect the OD effectively even in the presence of bright lesions associated with pathological cases. Second, an efficient optic cup (OC) and OD segmentation is performed using a texture-based and model-based approach. Finally, CDR computation leads to glaucoma screening, with a classification between healthy and glaucomatous patients. Results: The proposed approach was tested on the publicly available DRISHTI-GS1 dataset, which provides fifty retinal images labeled healthy or glaucomatous by trained specialists. The method achieves 98% accuracy on final glaucoma screening and diagnosis, with excellent rates on the evaluation metrics, outperforming state-of-the-art CDR-based approaches. Conclusions: We proposed a fully automated method for glaucoma screening and diagnosis from retinal images. Excellent performance was obtained on final screening, classifying healthy and glaucomatous subjects. As the approach effectively detects the presence of glaucoma at low computational cost, it can form part of a mobile diagnosis-support system, improving the final diagnosis by the specialist and supporting widespread visual health programs.
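For illustration, here is a minimal sketch of how a CDR feature could be derived from binary cup and disc segmentation masks and thresholded for screening. The vertical-diameter definition, the function names, and the 0.6 decision threshold are assumptions made for this example, not the exact implementation described in the abstract.

```python
import numpy as np

def vertical_diameter(mask: np.ndarray) -> int:
    """Vertical extent (in pixels) of a binary segmentation mask."""
    rows = np.where(mask.any(axis=1))[0]  # row indices containing the structure
    if rows.size == 0:
        raise ValueError("empty mask: structure not segmented")
    return int(rows[-1] - rows[0] + 1)

def cup_to_disc_ratio(cup_mask: np.ndarray, disc_mask: np.ndarray) -> float:
    """Vertical Cup-to-Disc Ratio from optic cup and optic disc masks."""
    return vertical_diameter(cup_mask) / vertical_diameter(disc_mask)

def screen(cup_mask: np.ndarray, disc_mask: np.ndarray, threshold: float = 0.6) -> str:
    """Flag a fundus image as a glaucoma suspect when the CDR exceeds a threshold."""
    cdr = cup_to_disc_ratio(cup_mask, disc_mask)
    return "glaucomatous" if cdr > threshold else "healthy"
```

A larger CDR indicates more pronounced cupping of the optic nerve head; in practice the screening threshold would be chosen on labeled data such as DRISHTI-GS1 rather than fixed a priori.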
This work aimed to give a comprehensive, detailed, and benchmarked guide to fine-tuning Convolutional Neural Networks (CNNs) for glaucoma screening. Transfer learning is a promising alternative to training CNNs from scratch, avoiding their large data and resource requirements. After a thorough study of five state-of-the-art CNN architectures, a complete and well-explained strategy for fine-tuning these networks is proposed, using hyperparameter grid search and a two-phase training approach. Excellent performance is reached on model evaluation, with a validation AUROC of 0.9772, giving rise to reliable glaucoma diagnosis-support systems. In addition, a baseline benchmark analysis is conducted, studying the models according to performance indices such as model complexity and size, AUROC density, and inference time. This in-depth analysis allows a rigorous comparison of model characteristics and gives practitioners important reference points for prospective applications and deployments.
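As a sketch of the two-phase fine-tuning strategy outlined above: the snippet below first freezes an ImageNet-pretrained backbone while a new classification head is trained, then unfreezes the backbone for end-to-end training at a lower learning rate. The ResNet50 backbone, the learning rates, and the binary sigmoid head are placeholder assumptions; the work itself compares five architectures and selects hyperparameters by grid search.

```python
import tensorflow as tf

# ImageNet-pretrained backbone (placeholder choice) plus a new binary classification head.
base = tf.keras.applications.ResNet50(include_top=False, weights="imagenet",
                                       input_shape=(224, 224, 3), pooling="avg")
head = tf.keras.layers.Dense(1, activation="sigmoid")(base.output)
model = tf.keras.Model(base.input, head)

# Phase 1: freeze the pretrained backbone and train only the new head.
base.trainable = False
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
              loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(name="auroc")])
# model.fit(train_ds, validation_data=val_ds, epochs=10)  # train_ds/val_ds: user's tf.data pipelines

# Phase 2: unfreeze the backbone and fine-tune end to end at a lower learning rate.
base.trainable = True
model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=1e-5),
              loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(name="auroc")])
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```

Recompiling after toggling `base.trainable` is what makes the second phase take effect; the lower learning rate in phase 2 protects the pretrained weights from being disrupted early in end-to-end training.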