Abstract: Ocular biometrics refers to the imaging and use of characteristic features of the eyes for personal identification. Traditionally, the iris has been viewed as a powerful ocular biometric cue. However, the iris is typically imaged in the near infrared (NIR) spectrum. RGB images of the iris, acquired in the visible spectrum, offer limited biometric information for dark-colored irides. In this work, we explore the possibility of performing ocular biometric recognition in the visible spectrum by utilizing the iris in conjunction with the vasculature observed in the white of the eye. We design a weighted fusion scheme to combine the information originating from these two modalities. Experiments on a dataset of 50 subjects indicate that such a fusion scheme improves the equal error rate by a margin of 4.5% over an iris-only approach.
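The weighted fusion of the two modalities can be illustrated with a minimal sketch. The abstract does not specify the fusion rule or weights, so the weighted-sum form and the weight value below are assumptions for illustration only:

```python
def fuse_scores(iris_score, vascular_score, w=0.6):
    """Weighted-sum fusion of two normalized match scores in [0, 1].

    `w` is a hypothetical weight on the iris score; the paper's
    actual weighting scheme is not given in the abstract.
    """
    assert 0.0 <= w <= 1.0
    return w * iris_score + (1.0 - w) * vascular_score
```

The fused score would then be compared against a decision threshold, just as a single-modality match score would be.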
Securing personal information on handheld devices, especially smartphones, has gained significant interest in recent years. Yet, most popular biometric modalities require additional hardware. To overcome this difficulty, the authors propose utilizing the visible-light cameras already present in mobile devices. Leveraging visible vascular patterns in the whites of the eyes, they develop a biometric authentication method suitable for smartphones. Their pipeline begins by imaging and segmenting the whites of the eyes, followed by image quality assessment. The authors' stage 1 matcher is a three-step process that entails extracting interest points [Harris-Stephens, features from accelerated segment test (FAST), and speeded up robust features (SURF)], building features (SURF and fast retina keypoint) around those points, and generating match scores using random sample consensus (RANSAC)-based registration. The stage 2 matcher uses registered Gabor phase-filtered images to generate orientation of local binary pattern features for its correlation-based match metric. Stage 1 and stage 2 match scores are fused for the final decision. On a dataset of 226 users, the authors report equal error rates as low as 0.04% for long-term verification tests. The success of their framework is further validated on the UBIRIS v1 database.
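Combining the two matcher stages can be sketched as score normalization followed by a simple fusion rule. The score ranges and the mean rule below are assumptions; the abstract does not state the paper's actual normalization or fusion function:

```python
def normalize(score, lo, hi):
    """Min-max normalize a raw match score to [0, 1]."""
    return (score - lo) / (hi - lo)

def fuse_stages(stage1_score, stage2_score,
                s1_range=(0.0, 100.0), s2_range=(-1.0, 1.0)):
    """Sketch of stage 1 + stage 2 score fusion by averaging
    min-max-normalized scores. The ranges are hypothetical: here
    stage 1 is treated as a registration-inlier count and stage 2
    as a correlation coefficient in [-1, 1].
    """
    s1 = normalize(stage1_score, *s1_range)
    s2 = normalize(stage2_score, *s2_range)
    return 0.5 * (s1 + s2)
```

The fused value would then be thresholded to accept or reject the verification claim.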
We propose directional pyramidal filter banks as feature extractors for ocular vascular biometrics. Apart from the red, green, and blue (RGB) format, we analyze the significance of using the HSV and YCbCr color spaces and the layer combinations (R+Cr)/2, (G+Cr)/2, and (B+Cr)/2. For classification, Linear Discriminant Analysis (LDA) is used. We outline the advantages of a Contourlet transform implementation for eye-vein biometrics based on the vascular patterns seen on the white of the eye. The performance of the proposed algorithm is evaluated using Receiver Operating Characteristic (ROC) curves; area under the curve (AUC), equal error rate (EER), and decidability values serve as performance metrics. The dataset consists of more than 1600 still images and video frames acquired in two separate sessions from 40 subjects. All images were captured from a distance of 5 feet using a DSLR camera with an attached white LED light source. We evaluate and discuss the results of cross-matching features extracted from still images and video recordings of conjunctival vasculature patterns. The best AUC value, 0.9999 with an EER of 0.064%, resulted from using the Cb layer of the YCbCr color space. The lowest EER, 0.032% with an AUC of 0.9998, was obtained using the green layer of the RGB images.
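The EER metric used above can be estimated from genuine and impostor score distributions by sweeping a decision threshold and finding where the false accept and false reject rates cross. This is a generic sketch of that computation, not the paper's evaluation code; it assumes higher scores indicate better matches:

```python
def equal_error_rate(genuine, impostor):
    """Estimate the EER from lists of genuine and impostor match
    scores (higher score = stronger match). Sweeps the threshold
    over all observed score values and returns the midpoint of
    FAR and FRR where their gap is smallest.
    """
    best = None
    for t in sorted(set(genuine) | set(impostor)):
        far = sum(s >= t for s in impostor) / len(impostor)  # false accepts
        frr = sum(s < t for s in genuine) / len(genuine)     # false rejects
        if best is None or abs(far - frr) < best[0]:
            best = (abs(far - frr), (far + frr) / 2)
    return best[1]
```

With well-separated score distributions, as the reported 0.032% EER suggests, the FAR/FRR crossing point sits near zero.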