For the past 20 years, researchers have investigated the use of eye tracking in security applications. We present a holistic view of gaze-based security applications. In particular, we canvass the literature and classify the utility of gaze in security applications into a) authentication, b) privacy protection, and c) gaze monitoring during security-critical tasks. This allows us to chart several research directions, most importantly 1) conducting field studies of implicit and explicit gaze-based authentication, enabled by recent advances in eye tracking, 2) research on gaze-based privacy protection and gaze monitoring in security-critical tasks, which are under-investigated yet very promising areas, and 3) understanding the privacy implications of pervasive eye tracking. We discuss the most promising opportunities and the most pressing challenges of eye tracking for security that will shape research in gaze-based security applications over the next decade.
Figure 1: Thermal images of graphical passwords entered on a smartphone's touchscreen (1 and 2) and a laptop's touchpad (3 and 4) were visually inspected by participants, who recovered 60.65% of touch gestures (2 and 4) and 23.61% of touch taps (1 and 3). Attacks against touchscreens were more accurate than those against touchpads (87.04% vs. 56.02%). The red circles/arrows illustrate the user's input.
Eye gaze and mid-air gestures are promising for resisting various types of side-channel attacks during authentication. However, to date, a comparison of these authentication modalities is missing. We investigate multiple authentication mechanisms that leverage gestures, eye gaze, and a multimodal combination of the two, and study their resilience to shoulder surfing. To this end, we report on our implementation of three schemes and on results from usability and security evaluations in which we also experimented with fixed and randomized layouts. We found that the gaze-based approach outperforms the other schemes in input time, error rate, perceived workload, and resistance to observation attacks, and that randomizing the layout does not improve observation resistance enough to warrant the reduced usability. Our work further underlines the importance of replicating earlier eye-tracking studies with today's sensors, as we show significant improvements over similar, previously introduced gaze-based authentication systems.
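The abstract above contrasts fixed and randomized layouts. As a minimal sketch (not the authors' implementation; all names here are illustrative), a randomized layout can be generated by reshuffling the symbol grid before each authentication attempt, so that an observer cannot map observed gaze or hand positions to symbols learned from a previous attempt:

```python
import random

def randomized_layout(symbols="0123456789", rows=2, cols=5, seed=None):
    """Return a rows x cols grid with the symbols placed in random cells.

    A fresh shuffle per authentication attempt means a shoulder surfer
    who memorized positions from an earlier attempt learns nothing;
    the cost is a slower visual search for the legitimate user.
    """
    rng = random.Random(seed)  # seed only for reproducible demos/tests
    cells = list(symbols)
    rng.shuffle(cells)
    return [cells[r * cols:(r + 1) * cols] for r in range(rows)]
```

This trade-off is exactly what the study measured: per-attempt shuffling raises observation resistance but hurts input time and workload, which the authors found was not worth it.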
We investigate the effectiveness of thermal attacks against the input of text with different characteristics, studying text entry on a smartphone touchscreen and a laptop keyboard. First, we ran a study (N=25) to collect a dataset of thermal images of short words, websites, complex strings (special characters, numbers, and letters), passphrases, and words with duplicate characters. Afterwards, 20 different participants visually inspected the thermal images and attempted to identify the entered text. We found that long and complex strings are less vulnerable to thermal attacks; that visual inspection of thermal images reveals parts of the entered text (36% on average and up to 82%) even when the attack is not fully successful; and that entering text on laptops is more vulnerable to thermal attacks than on smartphones. We conclude with three lessons learned and recommendations for resisting thermal attacks.
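Thermal attacks like those described above exploit residual heat: keys pressed more recently have had less time to cool, so they appear warmer in a thermal image. A minimal sketch of the ordering step, using hypothetical per-key temperature readings (the function and values are illustrative, not from the study):

```python
def infer_press_order(heat_readings):
    """Guess keystroke order from hypothetical per-key residual
    temperatures (in degrees C) extracted from a thermal image.

    Keys pressed later are warmer, so sorting by temperature
    ascending yields the estimated entry order, first to last.
    """
    return [key for key, temp in sorted(heat_readings.items(),
                                        key=lambda kv: kv[1])]

# Illustrative readings: 'p' pressed first (coolest), '4' last (warmest)
order = infer_press_order({"p": 27.1, "a": 27.9, "s": 28.6, "4": 29.3})
```

Note that duplicate characters break this heuristic, since two presses of the same key leave a single heat trace; this is consistent with the abstract's inclusion of words with duplicate characters among the harder-to-attack inputs.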