Significance
To reduce the complexity of our sensory environment, the perceptual system discretizes information in different ways. In the time domain, this is evident when stimuli presented very close in time are sometimes faithfully perceived as distinct entities, whereas at other times they are integrated into a single event. Using multivariate decoding of electroencephalography data, we show that integration and segregation of stimuli over different time scales (a few tens vs. a few hundreds of milliseconds) do not rely on a single sampling rhythm; instead, they depend on the phase of prestimulus oscillations in different frequency bands over right posterior-parietal channels. These findings suggest the existence of a specific mapping between oscillations and temporal windows in perception.
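Prestimulus-phase analyses of this kind are typically implemented by band-pass filtering the EEG in a frequency band of interest, extracting instantaneous phase with a Hilbert transform, and testing whether perceptual reports vary with the phase just before stimulus onset. The Python sketch below illustrates that generic pipeline only; the variable names (eeg, reports, fs, onset_idx), the chosen band, and the binary coding of reports are illustrative assumptions, not the authors' actual analysis code.

    # Minimal sketch of a prestimulus-phase analysis, assuming single-channel
    # EEG of shape (n_trials, n_samples) and binary perceptual reports.
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def phase_binned_reports(eeg, reports, fs, onset_idx, band=(8.0, 12.0), n_bins=6):
        """Bin trials by prestimulus phase and average perceptual reports per bin.

        eeg       : (n_trials, n_samples) array, single channel
        reports   : (n_trials,) array; 1 = "segregated", 0 = "integrated" (assumed coding)
        fs        : sampling rate in Hz
        onset_idx : sample index of stimulus onset; phase is read one sample earlier
        band      : frequency band of interest in Hz
        """
        b, a = butter(4, band, btype="bandpass", fs=fs)
        filtered = filtfilt(b, a, eeg, axis=1)          # zero-phase band-pass filter
        phase = np.angle(hilbert(filtered, axis=1))     # instantaneous phase per sample
        pre_phase = phase[:, onset_idx - 1]             # prestimulus phase, one per trial

        edges = np.linspace(-np.pi, np.pi, n_bins + 1)
        bins = np.digitize(pre_phase, edges[1:-1])      # bin index (0..n_bins-1) per trial
        # Mean report per phase bin (assumes every bin contains at least one trial);
        # a dependence of this curve on phase is the signature of interest.
        return np.array([reports[bins == k].mean() for k in range(n_bins)])

If segregation reports cluster at particular phases, the returned curve is modulated rather than flat; in practice such modulation is assessed with circular statistics and permutation tests rather than by eye.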
Gaze and arrow cues cause covert attention shifts even when they are uninformative. Nonetheless, it is unclear to what extent oculomotor behavior influences manual responses to social and nonsocial stimuli. In two experiments, we tracked the gaze of participants during a cueing task with nonpredictive gaze and arrow cues. In Experiment 1, the discrimination task was easy and eye movements were not necessary, whereas in Experiment 2 they were instrumental in identifying the target. Validity effects on manual response time (RT) were similar for the two cues in both experiments, though in the presence of eye movements observers were overall slower to respond to the arrow cue than to the gaze cue. Cue direction affected saccadic performance before the discrimination target was presented and throughout the duration of the trial. Furthermore, we found evidence that the type of cue had a distinct impact on different oculomotor components: saccade latencies were affected by the type of cue, both before and after target onset, whereas saccade landing positions were not. Critically, the manual validity effect was predicted by the landing position of the initial eye movement. This work suggests that the relationship between eye movements and attention is not straightforward. In the presence of overt selection, saccade latency was related to the overall speed of the manual response, whereas saccade landing position was closely related to manual performance in response to the different cues.

Keywords: Attention; Eye movements and visual attention; Face perception

Orienting spatial attention in response to head turns and eye movements is part and parcel of living in a society. Humans are sensitive to the processing of eye gaze, a preference that is thought to be innate and to find its roots in evolution (see, e.g., Hood, Willen, & Driver, 1998). In the course of our lives, we learn to orient attention to more abstract cues such as arrows, given that they also convey useful spatial information: we are able to shift our attention according to directions conveyed through signs (e.g., when looking for the right exit from the motorway). Eye gaze and arrows are known as central cues, that is, stimuli presented at the center of the visual field that enable orienting of attention to another location in space. They differ from peripheral cues, which can capture attention to their location because of an abrupt onset or change in illumination (Posner, 1980). Many studies have been conducted to understand the characteristics of attention orienting to central cues, mainly focusing on its covert component (i.e., the orienting of visuospatial attention without observable eye and body movements;
Previous research on covert orienting to the periphery suggested that early profound deaf adults were less susceptible to uninformative gaze cues, though they were equally or more affected by nonsocial arrow cues. The aim of the present work was to investigate whether spontaneous eye movement behaviour helps explain the reduced impact of the social cue in deaf adults. We tracked the gaze of 25 early profound deaf and 25 age-matched hearing observers performing a peripheral discrimination task with uninformative central cues (gaze vs. arrow), stimulus-onset asynchrony (250 vs. 750 ms), and cue validity (valid vs. invalid) as within-subject factors. In both groups, the cueing effect on RT was comparable for the two cues, although deaf observers responded significantly more slowly than hearing controls. While deaf and hearing observers' eye movement patterns looked similar when the cue was presented in isolation, deaf participants made significantly more eye movements than hearing controls once the discrimination target appeared. Notably, further analysis of eye movements in the deaf group revealed that, independent of cue type, cue validity affected saccade landing position, whereas saccade latency was not modulated by these factors. Saccade landing position was also strongly related to the magnitude of the validity effect on RT: the greater the difference in saccade landing position between invalid and valid trials, the greater the difference in manual RT between invalid and valid trials. This work suggests that the contribution of overt selection in central cueing of attention is more prominent in deaf adults and helps determine manual performance, irrespective of cue type.
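The relationship reported here between landing position and the validity effect amounts to computing a per-subject difference (invalid minus valid) for each measure and correlating the two differences across subjects. Below is a minimal pandas sketch of that kind of analysis; the data layout and column names (subject, validity, rt, landing) are assumptions for illustration, not the authors' actual data format.

    # Minimal sketch of a validity-effect correlation, assuming a per-trial
    # table with subject ID, cue validity, manual RT, and saccade landing position.
    import pandas as pd
    from scipy.stats import pearsonr

    def validity_effect_correlation(trials: pd.DataFrame):
        """Correlate per-subject validity effects on manual RT and saccade landing.

        Expected columns (assumed names):
          subject  : participant identifier
          validity : "valid" or "invalid"
          rt       : manual response time (ms)
          landing  : saccade landing position relative to the target (deg)
        """
        # Per-subject mean RT and landing position, separately for valid/invalid trials
        means = trials.groupby(["subject", "validity"])[["rt", "landing"]].mean()
        invalid = means.xs("invalid", level="validity")
        valid = means.xs("valid", level="validity")

        rt_effect = invalid["rt"] - valid["rt"]                  # validity effect on RT
        landing_effect = invalid["landing"] - valid["landing"]   # effect on landing
        r, p = pearsonr(landing_effect, rt_effect)               # across-subject correlation
        return r, p

A positive correlation under this scheme would mirror the reported pattern: observers whose saccades land differently on invalid versus valid trials also show larger RT costs.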
Multisensory interactions in deaf cognition are largely unexplored. Unisensory studies suggest that behavioral/neural changes may be more prominent for visual than for tactile processing in early deaf adults. Here we test whether such an asymmetry results in increased saliency of vision over touch during visuo-tactile interactions. Twenty-three early deaf and 25 hearing adults performed two consecutive visuo-tactile spatial interference tasks. Participants responded either to the elevation of a tactile target while ignoring a concurrent visual distractor at central or peripheral locations (respond to touch/ignore vision), or they performed the opposite task (respond to vision/ignore touch). Multisensory spatial interference emerged in both tasks for both groups. Crucially, deaf participants showed increased interference compared with hearing adults when they attempted to respond to tactile targets and ignore visual distractors, with enhanced difficulty for ipsilateral visual distractors. Analyses of task order revealed that in deaf adults, interference of visual distractors on tactile targets was much stronger when this task followed the one in which vision was behaviorally relevant (respond to vision/ignore touch). These novel results suggest that behavioral/neural changes related to early deafness lead to enhanced visual dominance during visuo-tactile multisensory conflict.