When attempting to understand where people look during scene perception, researchers typically focus on the relative contributions of low- and high-level cues. Computational models of the contribution of low-level features to fixation selection, with modifications to incorporate top-down sources of information, have been abundant in recent research. However, we are still some way from a model that can explain many of the complexities of eye movement behaviour. Here we show that understanding biases in how we move the eyes can provide powerful new insights into the decision about where to look in complex scenes. A model based solely on these biases, and therefore blind to current visual information, outperformed popular salience-based approaches. Our data show that incorporating an understanding of oculomotor behavioural biases into models of eye guidance is likely to significantly improve our understanding of where we choose to fixate in natural scenes.

Successfully completing many forms of behaviour requires that humans look in the right place at the right time. Ballard and colleagues described this as a "do-it-where-I'm-looking" visual strategy for completing complex tasks (Ballard et al., 1992); a finding that has been replicated across a range of studies of natural behaviour (e.g.
We recorded over 90,000 saccades while observers viewed a diverse collection of natural images and measured low-level visual features at fixation. The features that discriminated between where observers fixated and where they did not varied considerably with task and with the length of the preceding saccade. Short saccades (<8 degrees) are image-feature dependent; long saccades are less so. For free viewing, short saccades target high-frequency information, while long saccades are scale-invariant. When searching for luminance targets, saccades of all lengths are scale-invariant. We argue that models of saccade behaviour must account not only for task but also for saccade length, and that long and short saccades are targeted differently.
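The short/long split described above can be illustrated with a minimal analysis sketch: divide saccades at the 8-degree threshold from the abstract and compare a feature value at fixated versus control locations within each group. This is a toy illustration with simulated data and an assumed effect-size measure (Cohen's d), not the paper's exact method.

```python
import numpy as np

def feature_discriminability_by_amplitude(amplitudes, feat_fix, feat_ctrl,
                                          threshold=8.0):
    """Split saccades into short (< threshold, degrees) and long, then
    compute a simple effect size (Cohen's d) for a low-level image
    feature at fixated vs. control locations within each group."""
    short = amplitudes < threshold
    effect = {}
    for name, mask in [("short", short), ("long", ~short)]:
        a, b = feat_fix[mask], feat_ctrl[mask]
        pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2.0)
        effect[name] = (a.mean() - b.mean()) / pooled_sd
    return effect

# Toy data: short-saccade targets carry an elevated feature value,
# long-saccade targets barely differ from control locations.
rng = np.random.default_rng(1)
amplitudes = rng.uniform(0.0, 20.0, 5000)
feat_ctrl = rng.normal(0.0, 1.0, 5000)
feat_fix = rng.normal(0.0, 1.0, 5000) + np.where(amplitudes < 8.0, 1.0, 0.1)
effect = feature_discriminability_by_amplitude(amplitudes, feat_fix, feat_ctrl)
```

In this simulated setup the feature discriminates fixated from control locations far more strongly for short saccades than for long ones, mirroring the pattern reported in the abstract.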
While many current models of scene perception debate the relative roles of low- and high-level factors in eye guidance, systematic tendencies in how the eyes move may be informative. We consider how each saccade and fixation is influenced by that which preceded or followed it during free inspection of images of natural scenes. We find evidence to suggest periods of localized scanning separated by ‘global’ relocations to new regions of the scene. We also find evidence to support the existence of small-amplitude ‘corrective’ saccades in natural image viewing. Our data reveal statistical dependencies between successive eye movements, which may be informative in furthering our understanding of eye guidance.
A state-of-the-art data analysis procedure is presented to conduct hierarchical Bayesian inference and hypothesis testing on delay discounting data. The delay discounting task is a key experimental paradigm used across a wide range of disciplines, from economics and cognitive science to neuroscience, all of which seek to understand how humans or animals trade off the immediacy versus the magnitude of a reward. Bayesian estimation allows rich inferences to be drawn, along with measures of confidence, based upon limited and noisy behavioural data. Hierarchical modelling allows more precise inferences to be made, thus using sometimes expensive or difficult-to-obtain data in the most efficient way. The proposed probabilistic generative model describes how participants compare the present subjective value of reward choices on a trial-to-trial basis, and estimates participant- and group-level parameters. We infer discount rate as a function of reward size, allowing the magnitude effect to be measured. Demonstrations are provided to show how this analysis approach can aid hypothesis testing. The analysis is demonstrated on data from the popular 27-item monetary choice questionnaire (Kirby, 2009), but will accept data from a range of protocols, including adaptive procedures. The software is made freely available to researchers.

Keywords: Decision making · Delay discounting · Intertemporal choice · Magnitude effect · Time preference · Bayesian estimation · MCMC · Financial psychophysics

The analysis code is freely downloadable from https://github.com/drbenvincent/delay-discounting-analysis.
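The trial-level comparison described above can be sketched with the standard hyperbolic discounting model, V = A / (1 + kD), a logistic choice rule, and a magnitude effect in which log(k) varies linearly with log reward size. This is a minimal illustrative sketch, not the paper's exact generative model; the parameter names (m, c, alpha, epsilon) and their values are assumptions.

```python
import numpy as np

def subjective_value(amount, delay, k):
    """Hyperbolic discounting: V = A / (1 + k*D)."""
    return amount / (1.0 + k * delay)

def discount_rate(amount, m=-0.5, c=-3.0):
    """Magnitude effect: log(k) is linear in log(reward size).
    m and c are illustrative parameters, not values from the paper."""
    return np.exp(m * np.log(amount) + c)

def p_choose_delayed(a_now, a_later, delay, alpha=2.0, epsilon=0.01):
    """Probability of choosing the delayed reward on one trial:
    a logistic choice rule on the subjective-value difference,
    softened by a small lapse rate epsilon."""
    k = discount_rate(a_later)
    dv = subjective_value(a_later, delay, k) - a_now
    p = 1.0 / (1.0 + np.exp(-alpha * dv))
    return epsilon / 2.0 + (1.0 - epsilon) * p

# Example trial: $50 now vs. $100 in 30 days
p = p_choose_delayed(50.0, 100.0, 30.0)
```

In a hierarchical Bayesian treatment, parameters like m and c would get participant-level values drawn from group-level distributions and be inferred by MCMC rather than fixed as here.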
The allocation of overt visual attention while viewing photographs of natural scenes is commonly thought to involve both bottom-up feature cues, such as luminance contrast, and top-down factors such as behavioural relevance and scene understanding. Profiting from the fact that light sources are highly visible but uninformative in visual scenes, we develop a mixture model approach that estimates the relative contribution of various low- and high-level factors to patterns of eye movements whilst viewing natural scenes containing light sources. Low-level salience accounts predicted fixations at luminance contrast and at lights, whereas these factors played only a minor role in the observed human fixations. Conversely, human data were mostly explicable in terms of a central bias and a foreground preference. Moreover, observers were more likely to look near lights rather than directly at them, an effect that cannot be explained by low-level stimulus factors such as luminance or contrast. These and other results support the idea that the visual system neglects highly visible cues in favour of less visible object information. Mixture modelling might be a good way forward in understanding visual scene exploration, since it makes it possible to measure the extent to which low-level or high-level cues act as drivers of eye movements.
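The mixture idea above can be sketched as follows: treat the fixation density as a weighted combination of component maps (e.g. salience, central bias, foreground preference) and estimate the weights by expectation-maximization. This is a hedged toy sketch of the general technique, not the paper's specific model; the component names and the EM formulation are assumptions.

```python
import numpy as np

def em_mixture_weights(component_probs, n_iter=200):
    """Estimate mixture weights for fixation data by EM.

    component_probs: (n_fixations, n_components) array where entry
    [i, j] is the likelihood of fixation i under component map j
    (e.g. salience, central bias, foreground preference).
    Returns maximum-likelihood mixture weights (sum to 1)."""
    n, k = component_probs.shape
    w = np.full(k, 1.0 / k)                 # start from uniform weights
    for _ in range(n_iter):
        # E-step: responsibility of each component for each fixation
        r = w * component_probs
        r /= r.sum(axis=1, keepdims=True)
        # M-step: new weights are the mean responsibilities
        w = r.mean(axis=0)
    return w

# Toy example: 3 components; fixations are best explained by component 1
rng = np.random.default_rng(0)
probs = rng.random((500, 3))
probs[:, 1] *= 5.0                          # inflate likelihoods for component 1
weights = em_mixture_weights(probs)
```

The estimated weights then quantify the relative contribution of each cue, which is exactly the kind of measurement the abstract argues mixture modelling makes possible.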