This paper describes a method for quickly and robustly localizing the iris and pupil boundaries of a human eye in close-up images. Such an algorithm can be critical for iris identification, or for applications that must determine the subject's gaze direction, e.g., human-computer interaction or driver attentiveness determination. A multi-resolution coarse-to-fine search approach is used, seeking to maximize gradient strengths and uniformities measured across rays radiating from a candidate iris or pupil's central point. An empirical evaluation of 670 eye images, both with and without glasses, resulted in a 98% localization accuracy. The algorithm has also shown robustness to weak illumination and most specular reflections (e.g., at eyewear and cornea), simplifying system component requirements. Rapid execution is achieved on a 750 MHz desktop processor.
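A minimal NumPy sketch of the kind of radial-gradient search the abstract describes is given below. It is not the authors' implementation; the helper names, the single-level grid search, the scoring rule (mean minus standard deviation of the radial gradients), and the radius/step values are assumptions chosen only for illustration of the coarse stage of a coarse-to-fine search.

    import numpy as np

    def ray_gradient_score(image, cx, cy, radius, n_rays=36):
        """Score a candidate center/radius by the strength and uniformity of
        intensity gradients sampled along n_rays radial directions
        (hypothetical helper, not the paper's exact measure)."""
        h, w = image.shape
        angles = np.linspace(0.0, 2.0 * np.pi, n_rays, endpoint=False)
        grads = []
        for a in angles:
            # Sample just inside and just outside the candidate boundary.
            xi = int(round(cx + (radius - 1) * np.cos(a)))
            yi = int(round(cy + (radius - 1) * np.sin(a)))
            xo = int(round(cx + (radius + 1) * np.cos(a)))
            yo = int(round(cy + (radius + 1) * np.sin(a)))
            if 0 <= xi < w and 0 <= yi < h and 0 <= xo < w and 0 <= yo < h:
                grads.append(float(image[yo, xo]) - float(image[yi, xi]))
        if not grads:
            return -np.inf
        grads = np.array(grads)
        # Favor gradients that are both strong (high mean) and uniform (low spread).
        return grads.mean() - grads.std()

    def coarse_localize(image, radii=range(20, 80, 4), step=8):
        """Coarse grid search over centers and radii; a coarse-to-fine scheme
        would repeat this on finer grids around the best candidate."""
        h, w = image.shape
        best_score, best_params = -np.inf, None
        for cy in range(0, h, step):
            for cx in range(0, w, step):
                for r in radii:
                    s = ray_gradient_score(image, cx, cy, r)
                    if s > best_score:
                        best_score, best_params = s, (cx, cy, r)
        return best_params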
Algorithms based on the correlation of image patches can be robust in practice but are computationally intensive because of their search-based nature. Performing the search over time instead of over space makes the cost linear rather than quadratic and results in a very efficient algorithm. This, combined with implementations that are highly efficient on standard computing hardware, yields performance of 9 frames/sec on a scientific workstation. Although the resulting velocities are quantized, and therefore carry quantization error, they have been shown to be sufficiently accurate for many robotic vision tasks such as time-to-collision estimation and robotic navigation. Thus, this algorithm is highly suitable for real-time robotic vision research.
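One way to read the "search over time" idea is sketched below: each pixel is matched only against unit displacements, but over several past frames, so the estimated velocity is the quantized ratio shift/delay. This is an illustrative reconstruction under that assumption, not the paper's code; the patch size, delay range, and SAD matching cost are all choices made for the example.

    import numpy as np

    def quantized_flow(frames, patch=3, max_delay=4):
        """Correlation flow with the search taken over time: for each pixel,
        try unit shifts against frames 1..max_delay in the past and keep the
        (shift, delay) pair with the smallest sum of absolute differences.
        Requires at least max_delay + 1 frames, shape (t, h, w)."""
        frames = np.asarray(frames, dtype=np.float64)
        t, h, w = frames.shape
        r = patch // 2
        u = np.zeros((h, w))
        v = np.zeros((h, w))
        shifts = [(dx, dy) for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
        cur = frames[-1]
        for y in range(r + 1, h - r - 1):
            for x in range(r + 1, w - r - 1):
                ref = cur[y - r:y + r + 1, x - r:x + r + 1]
                best = (np.inf, 0.0, 0.0)
                for d in range(1, max_delay + 1):
                    past = frames[-1 - d]
                    for dx, dy in shifts:
                        cand = past[y - dy - r:y - dy + r + 1,
                                    x - dx - r:x - dx + r + 1]
                        err = np.abs(ref - cand).sum()
                        if err < best[0]:
                            # Velocity is quantized: a unit shift over d frames.
                            best = (err, dx / d, dy / d)
                u[y, x], v[y, x] = best[1], best[2]
        return u, v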
A real-time robot vision system is described which uses only the divergence of the optical flow field for both steering control and collision detection. The robot has wandered about the lab at 20 cm/s for as long as 26 minutes without collision. The entire system is implemented on a single ordinary UNIX workstation without the benefit of real-time operating system support. Dense optical flow data are calculated in real time across the entire wide-angle image. The divergence of this optical flow field is calculated everywhere and used to control steering and collision behavior. Divergence alone has proven sufficient for steering past objects and detecting imminent collision. The major contribution is the demonstration of a simple, robust, minimal system that uses flow-derived measures to control steering and speed to avoid collision in real time for extended periods. Such a system can be embedded in a general, multi-level perception/control system.
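The quantity driving both behaviors is the flow-field divergence, div F = du/dx + dv/dy. The sketch below shows one plausible way to compute it and to derive a toy steer/brake rule from it; the half-image comparison, the central window, and the collision threshold are assumptions for illustration, not the paper's control law.

    import numpy as np

    def divergence(u, v):
        """div F = du/dx + dv/dy, estimated with central differences."""
        du_dx = np.gradient(u, axis=1)
        dv_dy = np.gradient(v, axis=0)
        return du_dx + dv_dy

    def steer_and_brake(u, v, collision_threshold=0.5):
        """Toy control rule in the spirit of the abstract: turn away from the
        image half with larger average divergence (nearer surfaces expand
        faster), and flag imminent collision when divergence in a central
        window is high. The threshold value is an assumed placeholder."""
        div = divergence(u, v)
        h, w = div.shape
        left = div[:, : w // 2].mean()
        right = div[:, w // 2:].mean()
        steer = 1.0 if left > right else -1.0   # +1: turn right, -1: turn left
        stop = div[h // 4: 3 * h // 4, w // 4: 3 * w // 4].mean() > collision_threshold
        return steer, stop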