An increasing number of studies have shown that cross-modal interactions can occur in early sensory cortices. Yet how neurons in sensory cortices integrate multisensory cues during perceptual tasks, and to what extent this influences behavior, remains largely unclear. To investigate, we examined visual modulation of auditory responses in the primary auditory cortex (A1) in a two-alternative forced-choice task. During the task, male rats were required to make a behavioral choice based on the pure-tone frequency (low vs. high) of a self-triggered stimulus to obtain a water reward. The results showed that the presence of a noninformative visual cue did not uniformly influence auditory responses; instead, it frequently enhanced the response to just one of the two tones. Closely correlated with behavioral choice, the visual cue mainly enhanced responsiveness to the auditory cue indicating a movement direction contralateral to the recorded A1. Operating in this fashion gave A1 neurons a superior capability to discriminate sounds during multisensory trials. Concomitantly, behavioral data and decoding analysis revealed that the presence of the visual cue could speed the process of sound discrimination. We also observed this differential multisensory integration effect in well-trained rats tested with passive stimulation and under anesthesia, albeit to a much lesser extent. We did not see this differentially integrative effect when recording in A1 of a similar group of rats performing a free-choice task. These data suggest that the auditory cortex can engage in meaningful audiovisual processing and that perceptual learning can modify its multisensory integration mechanism to meet task requirements.

SIGNIFICANCE STATEMENT In the natural environment, visual stimuli are frequently accompanied by auditory cues. Although multisensory integration has traditionally been seen as a feature of association cortices, recent studies have shown that cross-modal inputs can also influence neuronal activity in primary sensory cortices.
However, exactly how neurons in sensory cortices integrate multisensory cues to guide behavioral choice remains unclear. Here, we describe a novel mode of multisensory integration by which A1 neurons shape auditory representations when rats perform a cue-guided task. We found that a task-irrelevant visual cue could specifically enhance neuronal responses to the sound guiding the contralateral choice. This differentially integrative mode facilitated sound discrimination and behavioral choice. These results indicate that task engagement can modulate multisensory integration.
Cross-modal interaction (CMI) can significantly influence perceptual and decision-making processes in many circumstances. However, it remains poorly understood what integrative strategies the brain employs to deal with different task contexts. To explore this, we examined neural activity in the medial prefrontal cortex (mPFC) of rats performing cue-guided two-alternative forced-choice tasks. In a task requiring rats to discriminate stimuli based on an auditory cue, the simultaneous presentation of an uninformative visual cue substantially strengthened mPFC neurons' capability for auditory discrimination, mainly by enhancing the response to the preferred cue. It also increased the number of neurons exhibiting a cue preference. If the task was changed slightly so that a visual cue, like the auditory one, denoted a specific behavioral direction, mPFC neurons frequently showed a different CMI pattern, with cross-modal enhancement best evoked in information-congruent multisensory trials. In a free-choice task, however, the majority of neurons failed to show a cross-modal enhancement effect or a cue preference. These results indicate that CMI at the neuronal level is context-dependent in a way that differs from what has been shown in previous studies.