Understanding human attention during mobile interaction is a relevant part of human-computer interaction, as gaze indicates task focus, emotion, and communication. The lack of large-scale studies that would enable statistically significant results stems from the high cost of manual annotation in eye-tracking analysis. With high-quality wearable eye-tracking cameras and devices such as Google Glass, video-based visual attention analysis will become ubiquitous and enable automated large-scale annotation. We describe, for the first time, precise markerless gaze estimation on mobile displays and their surroundings, and evaluate its performance. We demonstrate accurate point-of-regard (POR) recovery on the mobile device and enable heat mapping of visual tasks. In a benchmark test we achieve a mean accuracy of ≈1.5 mm in POR localization on the display, and the method is very robust to illumination changes. We conclude from these results that this system may open new avenues in eye-tracking research for behavior analysis in mobile applications.
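As a minimal sketch of the heat-mapping step mentioned above, the snippet below accumulates recovered POR samples (in display millimeters) into a Gaussian-smoothed 2D histogram. The display dimensions, bin resolution, and smoothing bandwidth are illustrative assumptions, not values from the paper; the bandwidth is simply matched to the reported ≈1.5 mm localization accuracy.

```python
import numpy as np

def por_heatmap(points_mm, display_mm=(62.0, 110.0), bins_per_mm=2, sigma_mm=1.5):
    """Accumulate POR samples (x, y) in display mm into a smoothed heat map.

    display_mm, bins_per_mm, and sigma_mm are hypothetical defaults chosen
    for illustration (sigma matches the ~1.5 mm reported accuracy).
    """
    w, h = display_mm
    nx, ny = int(w * bins_per_mm), int(h * bins_per_mm)
    xs = [p[0] for p in points_mm]
    ys = [p[1] for p in points_mm]
    hist, _, _ = np.histogram2d(xs, ys, bins=(nx, ny), range=[[0, w], [0, h]])

    # Separable Gaussian smoothing: build a 1D kernel, apply along both axes.
    sigma = sigma_mm * bins_per_mm
    radius = int(3 * sigma)
    ax = np.arange(-radius, radius + 1)
    kernel = np.exp(-0.5 * (ax / sigma) ** 2)
    kernel /= kernel.sum()
    for axis in (0, 1):
        hist = np.apply_along_axis(
            lambda m: np.convolve(m, kernel, mode="same"), axis, hist)
    return hist
```

Feeding the per-frame POR estimates of a visual task into such an accumulator yields the kind of attention heat map the abstract refers to.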