Eye tracking is a widely used tool for behavioral research in psychology. Technological advances have produced specialized eye-tracking devices that offer high sampling rates, up to 2000 Hz, and measure eye movements with high accuracy. They also offer high spatial resolution, enabling the recording of very small movements such as drifts and microsaccades. Because most eye trackers identify only basic events, such as blinks, fixations, and saccades, the features and parameters of interest that characterize eye movements must be extracted algorithmically from the raw data. Eye-tracking experiments may investigate eye movement behavior across different groups of participants and varying stimulus conditions. The analysis stage of such experiments therefore typically involves two phases: (i) extraction of parameters of interest and (ii) statistical analysis across participants or stimulus conditions using these parameters. Furthermore, the datasets collected in these experiments are usually very large, owing to the high temporal resolution of the eye trackers, and hence benefit from an automated analysis toolkit. In this work, we present PyTrack, an end-to-end open-source solution for the analysis and visualization of eye-tracking data. It can be used to extract parameters of interest, generate and visualize a variety of gaze plots from raw eye-tracking data, and conduct statistical analysis between stimulus conditions and subject groups. Electronic supplementary material: The online version of this article (10.3758/s13428-020-01392-6) contains supplementary material, which is available to authorized users.
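To illustrate the kind of algorithmic event extraction described above, here is a minimal sketch of dispersion-based fixation detection (the classic I-DT approach) on raw gaze samples. This is an illustrative example of the general technique, not PyTrack's actual implementation; the function name and threshold values are assumptions chosen for the example.

```python
def detect_fixations(samples, max_dispersion=25.0, min_duration=0.1):
    """I-DT fixation detection sketch.

    samples: list of (timestamp_s, x, y) gaze samples, time-ordered.
    Returns a list of (start_s, end_s, centroid_x, centroid_y) fixations.
    """
    fixations = []
    i, n = 0, len(samples)
    while i < n:
        # Grow an initial window spanning at least min_duration seconds.
        j = i
        while j < n and samples[j][0] - samples[i][0] < min_duration:
            j += 1
        if j >= n:
            break
        window = samples[i:j + 1]
        xs = [p[1] for p in window]
        ys = [p[2] for p in window]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) <= max_dispersion:
            # Extend the window while dispersion stays under the threshold.
            while j + 1 < n:
                xs.append(samples[j + 1][1])
                ys.append(samples[j + 1][2])
                if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                    xs.pop()
                    ys.pop()
                    break
                j += 1
            cx = sum(xs) / len(xs)
            cy = sum(ys) / len(ys)
            fixations.append((samples[i][0], samples[j][0], cx, cy))
            i = j + 1
        else:
            i += 1
    return fixations

# Simulated 1000 Hz recording: 0.3 s of gaze at (100, 100), then 0.3 s at (400, 400).
samples = [(k / 1000.0, 100.0, 100.0) for k in range(300)]
samples += [(k / 1000.0, 400.0, 400.0) for k in range(300, 600)]
fixations = detect_fixations(samples)
```

On this synthetic recording the sketch recovers two fixations, one per gaze cluster; real toolkits additionally handle blinks, noise filtering, and saccade/microsaccade detection.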
Auditory stimuli have been shown to alter visual temporal perception. For example, illusory temporal order is perceived when an auditory tone cues one side of space prior to the onset of simultaneously presented visual stimuli. Competing accounts attempt to explain such effects. The spatial gradient account of attention suggests speeded processing of visual stimuli in the cued space, whereas the impletion account suggests a Gestalt-like process in which an attempt is made to arrive at a "realistic" representation of an event given ambiguous conditions. Temporal ventriloquism, in which visual temporal order judgment performance is enhanced when a spatially uninformative tone is presented before and after visual stimulus onset, suggests that the temporal relationship of the auditory stimuli to the visual stimuli, as well as the number of auditory stimuli equaling the number of visual stimuli, drives the mechanisms underlying these and related effects. Results from a series of experiments highlight putative inconsistencies in both the spatial gradient account of attention and the classical temporal ventriloquism account. We present novel behavioral effects (illusory temporal order via spatially uninformative tones, and illusory simultaneity via a single tone prior to visual stimulus onset) that can be accounted for by an expanded version of the impletion account. Public Significance Statement: The present study demonstrates novel audio-induced visual-temporal-order effects using spatially neutral tones, while replicating related classic audio-visual effects. We interpret these findings as evidence that audio-visual integration draws evidence from various processes, assigning different weightings to each process depending upon relative spatial locations, temporal characteristics, the relative number of stimuli, and featural characteristics. With this interpretation in mind, we propose a unifying account of the observed effects.
In addition, we suggest that the paradigms in this article (and the associated effects) be considered as part of sensory testing when measuring typical audio-visual integration, such as in cases of cochlear implantation.
Visual illusions are fascinating phenomena that have been used and studied by artists and scientists for centuries, leading to important discoveries about the neurocognitive underpinnings of perception, consciousness, and neuropsychiatric disorders such as schizophrenia and autism. Surprisingly, despite their historical and theoretical importance as psychological stimuli, there is no dedicated software, nor consistent approach, for generating illusions in a systematic fashion. Instead, scientists have to craft them by hand in an idiosyncratic way, or use pre-made images not tailored to the specific needs of their studies. This, in turn, hinders the reproducibility of illusion-based research, narrowing possibilities for scientific breakthroughs and their applications. To address this gap, Pyllusion is an open-source Python package (freely available at https://github.com/RealityBending/Pyllusion ) that offers a framework to manipulate and generate illusions in a systematic way, compatible with different output formats such as image files (.png, .jpg, .tiff, etc.) or experimental software (such as PsychoPy).
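The core idea of parametric illusion generation can be sketched in a few lines: rather than hand-crafting an image, the stimulus geometry is computed from a small set of continuous parameters. The following is an illustrative re-implementation of that idea for a Müller-Lyer arrow, not Pyllusion's actual code; the function and parameter names are assumptions chosen for the example.

```python
import math

def muller_lyer_segments(line_length=200.0, fin_length=40.0,
                         fin_angle_deg=30.0, inward=True):
    """Compute the five line segments (shaft + 4 fins) of one
    Mueller-Lyer arrow, parameterized by fin length/angle and
    fin direction. Returns a list of ((x1, y1), (x2, y2)) pairs
    that a renderer (e.g., PIL or PsychoPy) could draw."""
    a = math.radians(fin_angle_deg)
    dx = fin_length * math.cos(a)
    dy = fin_length * math.sin(a)
    sign = 1.0 if inward else -1.0   # inward fins point along the shaft
    L = line_length
    segments = [((0.0, 0.0), (L, 0.0))]  # the shaft
    # Two fins at each end, mirrored above/below the shaft.
    for end_x, direction in ((0.0, sign), (L, -sign)):
        for vertical in (dy, -dy):
            segments.append(((end_x, 0.0),
                             (end_x + direction * dx, vertical)))
    return segments

segments = muller_lyer_segments(fin_angle_deg=30.0, inward=True)
```

Because every property of the stimulus is a function argument, illusion strength can be varied continuously and reproduced exactly across studies, which is the systematic approach the package advocates.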