Abstract—Ocular biometrics refers to the imaging and use of characteristic features of the eyes for personal identification. Traditionally, the iris has been viewed as a powerful ocular biometric cue. However, the iris is typically imaged in the near infrared (NIR) spectrum. RGB images of the iris, acquired in the visible spectrum, offer limited biometric information for dark-colored irides. In this work, we explore the possibility of performing ocular biometric recognition in the visible spectrum by utilizing the iris in conjunction with the vasculature observed in the white of the eye. We design a weighted fusion scheme to combine the information originating from these two modalities. Experiments on a dataset of 50 subjects indicate that such a fusion scheme improves the equal error rate by a margin of 4.5% over an iris-only approach.
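The weighted fusion described above can be sketched as a score-level rule. This is a minimal illustration, not the paper's exact method: the weight `w` and the min-max normalization are assumed choices, since the abstract does not specify them.

```python
import numpy as np

def min_max_normalize(scores):
    """Map raw match scores to [0, 1] via min-max normalization."""
    scores = np.asarray(scores, dtype=float)
    lo, hi = scores.min(), scores.max()
    if hi == lo:
        return np.zeros_like(scores)
    return (scores - lo) / (hi - lo)

def fuse_scores(iris_scores, vasc_scores, w=0.6):
    """Weighted-sum fusion of iris and vasculature match scores.

    w is an illustrative weight on the iris modality; the remaining
    (1 - w) weight goes to the scleral vasculature scores.
    """
    s_iris = min_max_normalize(iris_scores)
    s_vasc = min_max_normalize(vasc_scores)
    return w * s_iris + (1 - w) * s_vasc
```

A verification decision would then threshold the fused score, with the threshold (and `w`) tuned on held-out data to minimize the equal error rate.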
Securing personal information on handheld devices, especially smartphones, has gained significant interest in recent years. Yet, most popular biometric modalities require additional hardware. To overcome this difficulty, the authors propose utilizing the visible-light cameras already present in mobile devices. Leveraging the visible vascular patterns in the whites of the eyes, they develop a method for biometric authentication suitable for smartphones. Their process starts by imaging and segmenting the whites of the eyes, followed by image quality assessment. The authors' stage 1 matcher is a three-step process that entails extracting interest points [Harris-Stephens, features from accelerated segment test (FAST), and speeded up robust features (SURF)], building descriptors (SURF and fast retina keypoint) around those points, and generating match scores using random sample consensus (RANSAC)-based registration. The stage 2 matcher uses registered, Gabor phase-filtered images to generate orientation of local binary pattern features for its correlation-based match metric. A fusion of stage 1 and stage 2 match scores is calculated for the final decision. On a dataset of 226 users, the authors report equal error rates as low as 0.04% for long-term verification tests. The success of their framework is further validated on the UBIRIS v1 database.
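The stage 2 matcher builds on local binary pattern (LBP) coding of the registered images. As a rough illustration of the coding step only, the basic 8-neighbor LBP can be computed as below; this is not the authors' full orientation-of-LBP descriptor or their Gabor filtering, just the underlying texture code, implemented in plain numpy.

```python
import numpy as np

def lbp_codes(img):
    """Basic 8-neighbor LBP code for each interior pixel.

    Each neighbor that is >= the center pixel contributes one bit,
    yielding an 8-bit texture code per pixel.
    """
    img = np.asarray(img, dtype=float)
    h, w = img.shape
    center = img[1:-1, 1:-1]
    # Neighbor offsets in clockwise order starting at the top-left.
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    code = np.zeros_like(center, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neighbor = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        code |= (neighbor >= center).astype(np.uint8) << bit
    return code
```

In a matcher of this kind, histograms or orientation maps of such codes from the enrolled and probe images would then be compared with a correlation-based metric.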
Motor movements induce distinct patterns in the hemodynamics of the motor cortex, which may be captured by Near-Infrared Spectroscopy (NIRS) for Brain Computer Interfaces (BCI). We present a classification-guided (wrapper) method for time-domain NIRS feature extraction to classify left- and right-hand movements. Four different wrapper methods, based on univariate and multivariate ranking and on sequential forward and backward selection, along with three different classifiers (k-nearest neighbor, Bayes, and Support Vector Machines), were studied. Using NIRS data from two subjects, we show that a rank-based wrapper in conjunction with polynomial SVMs can achieve 100% sensitivity and specificity in separating left- and right-hand movements (5-fold cross validation). The results show the promise of wrapper methods in classifying NIRS signals for BCI applications.
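One of the wrapper variants mentioned above, sequential forward selection, can be sketched as follows. This is an illustrative toy version, assuming a leave-one-out 1-nearest-neighbor classifier as the wrapper criterion rather than the SVMs and 5-fold cross-validation used in the study.

```python
import numpy as np

def loo_1nn_accuracy(X, y):
    """Leave-one-out accuracy of a 1-nearest-neighbor classifier."""
    X, y = np.asarray(X, dtype=float), np.asarray(y)
    correct = 0
    for i in range(len(y)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                      # exclude the held-out sample
        correct += y[d.argmin()] == y[i]
    return correct / len(y)

def sequential_forward_selection(X, y, k):
    """Greedily add the feature that most improves wrapper accuracy."""
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < k:
        best_f, best_acc = None, -1.0
        for f in remaining:
            acc = loo_1nn_accuracy(X[:, selected + [f]], y)
            if acc > best_acc:
                best_f, best_acc = f, acc
        selected.append(best_f)
        remaining.remove(best_f)
    return selected
```

Because the selection criterion is the classifier's own accuracy, this qualifies as a wrapper (rather than filter) method; backward selection would instead start from all features and greedily remove the least useful one.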