Background/aims
Fundus fluorescein angiography (FFA) is an important technique for evaluating diabetic retinopathy (DR) and other retinal diseases. Interpreting FFA images is complex and time-consuming, and diagnostic ability varies considerably among ophthalmologists. The aim of this study was to develop a clinically usable multilevel classification deep learning model for FFA images, covering prediagnosis assessment and lesion classification.
Methods
A total of 15 599 FFA images of 1558 eyes from 845 patients diagnosed with DR were collected and annotated. Three convolutional neural network (CNN) models were trained to label image quality, location, laterality of eye, phase and five lesion types. Model performance was evaluated by accuracy, F1 score, area under the curve and human-machine comparison. Images with false positive and false negative results were analysed in detail.
Results
Compared with LeNet-5 and VGG16, ResNet18 achieved the best results, with an accuracy of 80.79%–93.34% for prediagnosis assessment and 63.67%–88.88% for lesion detection. The human-machine comparison showed that the CNN had accuracy similar to that of junior ophthalmologists. The false positive and false negative analysis indicated directions for improvement.
Conclusion
This is the first study to perform automated standardised labelling of FFA images. Our model can be applied in clinical practice and will contribute substantially to the development of intelligent diagnosis of FFA images.
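For illustration, the following is a minimal sketch of how one of the classifiers described above could be set up: a torchvision ResNet18 fine-tuned on FFA frames. The phase classes, input size and preprocessing are assumptions made for the example, not the authors' actual pipeline.

# Minimal sketch of one classification stage: a ResNet18 fine-tuned to
# predict FFA phase. The 4 phase classes and the preprocessing are
# illustrative assumptions, not the paper's reported configuration.
import torch
import torch.nn as nn
from torchvision import models, transforms

NUM_PHASES = 4  # e.g. arterial, arteriovenous, venous, late (assumed)

model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
model.fc = nn.Linear(model.fc.in_features, NUM_PHASES)  # replace 1000-way head

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.Grayscale(num_output_channels=3),  # FFA frames are monochrome
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One optimisation step on a batch of preprocessed FFA frames."""
    model.train()
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()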
Background and aim
Abnormalities of eyelid position and contour occur in various diseases, such as blepharoptosis, a common eyelid disease. Accurate assessment of eyelid morphology is important in the management of blepharoptosis. We aimed to propose a novel deep learning-based image analysis method to automatically measure eyelid morphological properties before and after blepharoptosis surgery.
Methods
This study included 135 ptotic eyes of 103 patients who underwent blepharoptosis surgery. Facial photographs were taken preoperatively and postoperatively. Margin reflex distances (MRD1 and MRD2) of the operated eyes were manually measured by a senior surgeon. Multiple eyelid morphological parameters, such as MRD1, MRD2, upper eyelid length and corneal area, were automatically measured by our deep learning-based image analysis. Agreement between manual and automated measurements, as well as between two repeated automated measurements of MRDs, was analysed. Preoperative and postoperative eyelid morphological parameters were compared. Postoperative eyelid contour symmetry was evaluated using multiple mid-pupil lid distances (MPLDs).
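To make the measurement step concrete, here is a hedged sketch of how MRD1 and MRD2 can be derived once a model has localised the corneal light reflex and the eyelid margins in a photograph. The landmark format and the 11.7 mm corneal-diameter calibration are illustrative assumptions, not the paper's method.

# Hypothetical landmark-to-millimetre conversion for MRD1/MRD2.
# Image y-coordinates increase downward, so the upper lid margin has a
# smaller y than the corneal light reflex.
from dataclasses import dataclass

CORNEAL_DIAMETER_MM = 11.7  # assumed average white-to-white diameter

@dataclass
class EyeLandmarks:
    reflex_y: float          # y pixel of corneal light reflex
    upper_margin_y: float    # y pixel of upper lid margin above the reflex
    lower_margin_y: float    # y pixel of lower lid margin below the reflex
    corneal_width_px: float  # horizontal corneal extent, used for calibration

def mrd_mm(lm: EyeLandmarks) -> tuple[float, float]:
    """Convert pixel distances to millimetres via corneal-width calibration."""
    mm_per_px = CORNEAL_DIAMETER_MM / lm.corneal_width_px
    mrd1 = (lm.reflex_y - lm.upper_margin_y) * mm_per_px
    mrd2 = (lm.lower_margin_y - lm.reflex_y) * mm_per_px
    return mrd1, mrd2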
Results
The intraclass correlation coefficients (ICCs) between manual and automated measurements of MRDs ranged from 0.934 to 0.971 (p < .001), and the bias ranged from 0.09 mm to 0.15 mm. The ICCs between two repeated automated measurements were up to 0.999 (p < .001), and the bias was no more than 0.002 mm. After surgery, MRD1 increased significantly from 0.31 ± 1.17 mm to 2.89 ± 1.06 mm, upper eyelid length from 19.94 ± 3.61 mm to 21.40 ± 2.40 mm, and corneal area from 52.72 ± 15.97 mm² to 76.31 ± 11.31 mm² (all p < .001). Postoperative binocular MPLDs at different angles (from 0° to 180°) showed no significant differences in the patients.
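The agreement analysis reported above can be reproduced in outline as follows: ICC between manual and automated MRD measurements plus the mean bias, computed here with pingouin. The column names and the choice among the ICC variants pingouin reports are assumptions; the abstract does not specify the exact model used.

# Sketch of the manual-vs-automated agreement analysis (assumed layout).
import pandas as pd
import pingouin as pg

def agreement(manual: list[float], automated: list[float]) -> None:
    n = len(manual)
    long = pd.DataFrame({
        "eye": list(range(n)) * 2,
        "rater": ["manual"] * n + ["automated"] * n,
        "mrd": manual + automated,
    })
    icc = pg.intraclass_corr(data=long, targets="eye",
                             raters="rater", ratings="mrd")
    bias = (pd.Series(automated) - pd.Series(manual)).mean()
    print(icc[["Type", "ICC", "pval"]])  # all ICC variants with p-values
    print(f"bias = {bias:.3f} mm")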
Conclusion
This technique had high accuracy and repeatability for automatically measuring eyelid morphology, allowing objective assessment of blepharoptosis surgical outcomes. Requiring only patients’ photographs, it has great potential in the diagnosis and management of other eyelid-related diseases.