This paper presents a novel liveness detection method that exploits the acquisition workflow for iris biometrics on smartphones using a hybrid visible (RGB)/near infra-red (NIR) sensor. These devices are able to capture both RGB and NIR images of the eye and iris region in synchronization. This multi-spectral information is mapped into a discrete feature space. An intermediate classifier, which uses a distance metric close to the Jensen-Shannon divergence, is employed to classify the incoming image. Further, a fast, multi-frame pupil localization technique using one-dimensional processing of the eye region is proposed and evaluated. This technique is used to analyze the pupil characteristics of the images classified as 'live' in the previous stage. It is shown that such an analysis can detect presentation attacks, even those mounted with a 3-D face model made of materials whose properties are similar to those of human skin and the ocular region.
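For reference, the standard Jensen-Shannon divergence between two discrete distributions $P$ and $Q$ is given below; note that the intermediate classifier described above uses a distance metric that is only close to, not identical to, this quantity.
\[
  \mathrm{JSD}(P \,\|\, Q) = \tfrac{1}{2}\, D_{\mathrm{KL}}(P \,\|\, M) + \tfrac{1}{2}\, D_{\mathrm{KL}}(Q \,\|\, M),
  \qquad M = \tfrac{1}{2}(P + Q),
\]
\[
  D_{\mathrm{KL}}(P \,\|\, M) = \sum_{i} P(i) \log \frac{P(i)}{M(i)}.
\]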