To investigate self-prioritization independently of stimulus familiarity, Sui et al. (J Exp Psychol Hum Percept Perform 38:1105-1117, 2012. doi:10.1037/a0029792) introduced a new paradigm in which different geometric shapes are arbitrarily associated with self-relevant (e.g., "I") and neutral labels (e.g., "stranger"). It has now been repeatedly demonstrated that, in a subsequently presented matching task, this association leads to faster and more accurate verification of self-relevant shape-label pairings than of neutral shape-label pairings. In order to assess whether this self-prioritization effect represents a general selection mechanism in human information processing, we examined whether it is limited to the visual modality. Therefore, besides visual stimuli, auditory and vibrotactile stimuli were also associated with either self-relevant or neutral labels. The findings demonstrate that self-prioritization represents a general tendency influencing human information processing, one that operates across the senses. Our results also highlight a top-down component to self-prioritization.
The human brain is adapted to integrate information from multiple sensory modalities into coherent, robust representations of the objects and events in the external world. A large body of empirical research has demonstrated the ubiquitous nature of the interactions that take place between vision and touch, with the former typically dominating over the latter. Many studies have investigated the influence of visual stimuli on the processing of tactile stimuli (and vice versa). Others have investigated the effect of directing a participant's gaze either toward or away from the body part receiving the target tactile stimulation, while still others have compared performance in conditions in which the participant's eyes were open versus closed. We start by reviewing the research published to date demonstrating the influence of vision on the processing of tactile targets, that is, those stimuli that have to be attended or responded to. We suggest that many – but not all – of the visuotactile interactions observed to date may be attributable to the direction of spatial attention. We then move on to focus on the crossmodal influence of vision, as well as of the direction of gaze, on the processing of tactile distractors. We highlight the results of those studies demonstrating the influence of vision, rather than gaze direction (i.e., the direction of overt spatial attention), on tactile distractor processing (e.g., tactile variants of the negative-priming or flanker task). The conclusion is that, no matter how vision of a tactile distractor is engaged, the result appears to be the same: tactile distractors are processed more thoroughly.
Research on the nature of crossmodal interactions between vision and touch has shown that even task-irrelevant visual information can support the processing of tactile targets. In the present study, we implemented a tactile variant of the Eriksen flanker task to investigate the influence of vision on the processing of tactile distractors. In particular, we analyzed whether the size of the flanker effect at the level of perceptual congruency and at the level of response compatibility would differ as a function of the availability of vision (Experiments 1 and 2). Tactile distractors were processed up to the level of response selection only if visual information was provided (i.e., no flanker effects were observed at the level of response compatibility for blindfolded participants). In Experiment 3, we manipulated whether the part of the body receiving the tactile target or the tactile distractor was visible, while the other body part was occluded from view. Flanker effects at the level of response compatibility were observed in both conditions, indicating that vision of either the body part receiving the tactile target or the body part receiving the tactile distractor was sufficient to advance the processing of tactile distractors from the level of perceptual congruency to the level of response selection. Taken together, these results suggest that vision modulates tactile distractor processing by promoting the processing of tactile distractors up to the level of response selection.