Stair walking is a hazardous activity and a common cause of fatal and non-fatal falls. Previous studies have assessed the role of eye movements in stair walking by asking people to repeatedly go up and down stairs in quiet and controlled conditions, while the role of peripheral vision was examined by giving participants specific fixation instructions or working memory tasks. Here, we extend this research to stair walking in a natural environment, with other people present on the stairs and a now-common secondary task: using one's mobile phone. Results show that using the mobile phone strongly draws attention away from the stairs, but that the distribution of gaze locations away from the phone is little affected by phone use. Phone use also increased the time needed to walk the stairs, but handrail use remained low. These results indicate that limited foveal vision suffices for adequate stair walking in normal environments, but that mobile phone use has a strong influence on attention, which may pose problems when unexpected obstacles are encountered.
Eye tracking studies have suggested that, when viewing images centrally presented on a computer screen, observers tend to fixate the middle of the image. This so-called 'central bias' was later also observed in mobile eye tracking during outdoor navigation, where observers were found to fixate the middle of the head-centered video image. It is unclear, however, whether the extension of the central bias to mobile eye tracking in outdoor navigation may have been due to the relatively long viewing distances towards objects in this task and the constant turning of the body in the direction of motion, both of which may have reduced the need for large-amplitude eye movements. To examine whether the central bias in day-to-day viewing is related to the viewing distances involved, we here compare eye movements in three tasks (indoor navigation, tea making, and card sorting), each associated with interactions with objects at different viewing distances. Analysis of gaze positions showed a central bias for all three tasks that was independent of the task performed. These results confirm earlier observations of the central bias in mobile eye tracking data and suggest that differences in the typical viewing distance across tasks have little effect on the bias. The results could have interesting technological applications in which the bias is used to estimate the direction of gaze from head-centered video images, such as those obtained from wearable technology.
The study of lithic technology can provide information on human cultural evolution. This article aims to analyse the visual behaviour associated with the exploration of ancient stone artefacts and how this relates to perceptual mechanisms in humans. In Experiment 1, we used eye tracking to record patterns of eye fixations while participants viewed images of stone tools, including examples of worked pebbles and handaxes. The results showed that gaze was directed more towards the upper regions of worked pebbles and towards the basal areas of handaxes. Knapped surfaces also attracted more fixations than natural cortex for both tool types. The fixation distribution differed from that predicted by models of visual salience. Experiment 2 was an online study using a mouse-click attention tracking technique and included images of unworked pebbles and 'mixed' images combining the handaxe's outline with the pebble's unworked texture. The pattern of clicks corresponded to that revealed by eye tracking, and there were differences between tools and other images. Overall, the findings suggest that visual exploration is directed towards functional aspects of tools. Studies of visual attention and exploration can supply useful information to inform understanding of human cognitive evolution and tool use.
In everyday conversation, we often use indirect replies to save our interlocutor's face (e.g., "Your paper does have room for improvement"). Six experiments were conducted to examine the role of verbal and nonverbal behaviors in the production and comprehension of indirect replies. In Experiments 1a and 1b, participants engaged in question-answer exchanges designed to elicit four types of replies (direct, indirect, lie, and neutral). Results showed that uncertainty terms, discourse markers, and head tilt were most uniquely associated with the production of indirect replies. In Experiments 2a, 2b, 3a, and 3b, participants categorized the types of replies in video clips of participants from Experiments 1a and 1b. Results showed that nonverbal behaviors enhanced performance and boosted confidence in the identification of indirect replies. Furthermore, uncertainty terms, discourse markers, and head tilt were also the most reliable cues for identifying indirect replies. Finally, the extent to which people relied on verbal and nonverbal cues to identify an indirect reply was context dependent: the more informative the verbal/nonverbal information was, the fewer nonverbal/verbal cues contributed to the identification of indirect replies. Our results demonstrate that people integrate verbal and nonverbal information to enhance their understanding of the intended meaning of indirect replies. Findings from the current research provide an initial step toward developing a comprehensive and unified model of the production and comprehension of indirect replies that takes both verbal and nonverbal behaviors into account.