Recent evidence shows a novel role for eye vergence in orienting attention in adult subjects. Here we investigated whether such attentional modulation of eye vergence is present in children and whether it is altered in children with ADHD compared to control subjects. We therefore measured the angle of eye vergence in children previously diagnosed with ADHD while they performed a cue task and compared the results to those from age-matched controls. We observed a strong modulation of the vergence angle in the control group and a weak modulation in the ADHD group. In addition, in the control group the modulation of eye vergence differed between the informative-cue and uninformative-cue conditions; this difference was less pronounced in the ADHD group. Our study supports the observation of deficient binocular vision in children with ADHD. We argue that the observed disruption of vergence modulation in children with ADHD is a manifestation of altered cognitive processing of sensory information. Our work may provide new insights into attention disorders such as ADHD.
Orienting visual attention is closely linked to the oculomotor system. For example, a shift of attention is usually followed by a saccadic eye movement and can be revealed by microsaccades. Recently we reported a novel role of another type of eye movement, namely eye vergence, in orienting visual attention. Shifts in visuospatial attention are characterized by modulation of the response to a selected target. However, unlike (micro)saccades, eye vergence movements do not carry spatial information (except for depth) and are thus not specific to a particular visual location. To further understand the role of eye vergence in visual attention, we tested subjects with different perceptual styles. Perceptual style refers to the characteristic way in which individuals perceive environmental stimuli and is characterized by a spatial difference (local vs. global) in perceptual processing. We tested field-independent (local; FI) and field-dependent (global; FD) observers in a cue/no-cue task and a matching task. We found that FI observers responded faster and showed stronger modulation of eye vergence in both tasks than FD observers. These results suggest that eye vergence modulation may relate to the trade-off between the size of the spatial region covered by attention and the efficiency with which sensory information is processed. Alternatively, vergence modulation may play a role in switching cortical state to prepare the visual system for new incoming sensory information. In conclusion, vergence eye movements may be added to the growing list of fixational eye movements with a function in visual perception. However, further studies are needed to elucidate their role.
Neural mechanisms of attention allow selective processing of sensory information. Top-down deployment of visual-spatial attention is conveyed by cortical feedback connections from frontal regions to lower sensory areas, modulating late stimulus responses. A recent study reported small eye vergence movements during top-down orienting of attention. Here we assessed a possible link between vergence and attention by comparing the visual event-related potentials (vERPs) evoked by a cue stimulus, which induced a shift of attention towards the target location, with the vERPs evoked by a no-cue stimulus, which did not trigger orienting of attention. The results replicate the finding of eye vergence responses during orienting of attention and show that the timing and strength of eye vergence coincide with the onset and strength of the vERPs when subjects oriented attention. Our findings therefore support the idea that eye vergence relates to, and possibly plays a role in, attentional selection.
Although the perceptual association between verticality and pitch has been widely studied, the link between loudness and verticality is not yet fully understood. While loud and quiet sounds are assumed to be equally associated crossmodally with spatial elevation, perceptual differences between the two types of sounds may suggest the contrary. For example, loud sounds tend to generate greater activity, both behaviourally and neurally, than quiet sounds. Here we investigated whether this difference percolates into the crossmodal correspondence between loudness and verticality. In an initial phase, participants learned one-to-one arbitrary associations between two tones differing in loudness (82 dB vs. 56 dB) and two coloured rectangles (blue vs. yellow). During the experimental phase, they were presented with the two coloured stimuli (each located above or below a central “departure” point) together with one of the two tones. Participants had to indicate which of the two coloured rectangles corresponded to the previously associated tone by moving a mouse cursor from the departure point towards the target. The results revealed that participants were significantly faster at responding to the loud tone when the visual target was located above the departure point (congruent condition) than when it was below (incongruent condition). For quiet tones, no difference was found between the congruent (quiet-down) and incongruent (quiet-up) conditions. Overall, this pattern of results suggests that possible differences in the neural activity generated by loud and quiet sounds influence the extent to which loudness and spatial elevation share representational content.