The integration of visual and auditory events is thought to require a joint representation of visual and auditory space in a common reference frame. We investigated the coding of visual and auditory space in the lateral and medial intraparietal areas (LIP, MIP) as a candidate for such a representation. We recorded the activity of 275 neurons in LIP and MIP of two monkeys while they performed saccades to a row of visual and auditory targets from three different eye positions. We found 45% of these neurons to be modulated by the locations of visual targets, 19% by auditory targets, and 9% by both visual and auditory targets. The reference frames of both visual and auditory receptive fields ranged along a continuum between eye- and head-centered coordinates: approximately 10% of auditory and 33% of visual neurons had receptive fields more consistent with an eye- than a head-centered frame of reference, and 23% and 18%, respectively, had receptive fields more consistent with a head- than an eye-centered frame, leaving a large fraction of both visual and auditory response patterns inconsistent with either reference frame. These results resemble the reference frame we have previously found for auditory stimuli in the inferior colliculus and core auditory cortex. The correspondence between the visual and auditory receptive fields of individual neurons was weak. Nevertheless, the visual and auditory responses were sufficiently well correlated that a simple one-layer network constructed to calculate target location from the activity of the neurons in our sample performed successfully for auditory targets even though its weights were fit based only on the visual responses.
We interpret these results as suggesting that although the representations of space in areas LIP and MIP are not easily described within the conventional conceptual framework of reference frames, they nevertheless process visual and auditory spatial information in a similar fashion.
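The one-layer read-out described above can be illustrated with a minimal sketch: a single linear layer is fit on visual responses only, and the same weights are then applied, unchanged, to auditory responses. All numbers below (population size, tuning slopes, the 0.8 auditory scaling) are hypothetical stand-ins, not data from the study.

```python
import numpy as np

n_neurons = 40
locations = np.linspace(-24.0, 24.0, 7)         # target azimuths (deg), hypothetical

rng = np.random.default_rng(0)
gain = rng.normal(0.0, 1.0, n_neurons)          # per-neuron spatial slope
offset = rng.normal(10.0, 2.0, n_neurons)       # per-neuron baseline rate

# Idealised (noise-free) responses: auditory tuning shares the visual
# tuning but with a weaker slope, mimicking imperfect correspondence.
visual = offset[:, None] + gain[:, None] * locations[None, :]
auditory = offset[:, None] + 0.8 * gain[:, None] * locations[None, :]

# Fit the one-layer read-out on the VISUAL responses alone
# (least-squares linear map from population activity to location).
X = np.vstack([visual, np.ones_like(locations)]).T   # (targets, units + bias)
w, *_ = np.linalg.lstsq(X, locations, rcond=None)

# Apply the visually fit weights to the AUDITORY responses.
Xa = np.vstack([auditory, np.ones_like(locations)]).T
predicted = Xa @ w      # recovers target location up to the shared scaling
```

Because the auditory tuning in this toy model is a scaled copy of the visual tuning, the visually trained weights still recover an ordered, linearly related estimate of auditory target location, which is the sense in which such a read-out can "perform successfully" across modalities.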
We examined the frame of reference of auditory responses in the inferior colliculus (IC) of monkeys fixating visual stimuli at different locations. Eye position modulated the level of auditory responses in 33% of the neurons we encountered, but it did not appear to shift their spatial tuning. The effect of eye position on auditory responses was substantial, comparable in magnitude to that of sound location. The eye position signal appeared to interact with the auditory responses in an at least partly multiplicative fashion. We conclude that the representation of sound location in the primate IC is distributed and that its frame of reference is intermediate between head- and eye-centered coordinates. The information contained in these neurons appears to be sufficient for later neural stages to calculate the positions of sounds with respect to the eyes.
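A "partly multiplicative" interaction of this kind is often sketched as a gain field: eye position scales the amplitude of the neuron's spatial tuning without shifting its preferred location. The Gaussian tuning shape, best azimuth, and gain slope below are all hypothetical choices for illustration.

```python
import numpy as np

def response(sound_az, eye_az, best_az=10.0, width=20.0, gain_slope=0.02):
    """Toy gain-field model: Gaussian tuning to sound azimuth,
    multiplicatively scaled by a linear function of eye position."""
    tuning = np.exp(-0.5 * ((sound_az - best_az) / width) ** 2)
    gain = 1.0 + gain_slope * eye_az     # eye position sets the gain
    return tuning * gain

az = np.linspace(-40.0, 40.0, 81)        # sound azimuths to probe (deg)
left = response(az, eye_az=-12.0)        # fixation 12 deg left
right = response(az, eye_az=+12.0)       # fixation 12 deg right
```

In this model the peak of the tuning curve stays at the same sound azimuth for both eye positions; only the overall response level changes, matching modulation without a shift in spatial tuning.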
To generate behavioral responses based on sensory input, motor areas of the brain must interpret, or "read out," signals from sensory maps. Our experiments tested several algorithms for how the motor systems for smooth pursuit and saccadic eye movements might extract a usable signal of target velocity from the distributed representation of velocity in the middle temporal visual area (MT or V5). Using microstimulation, we attempted to manipulate the velocity information within MT while monkeys tracked a moving visual stimulus. We examined the effects of the microstimulation on smooth pursuit and on the compensation for target velocity shown by saccadic eye movements. Microstimulation could alter both the speed and direction of the motion estimates underlying both types of eye movements and could also cause monkeys to generate pursuit even when the visual target was actually stationary. The pattern of alterations suggests that microstimulation can introduce an additional velocity signal into MT and that the pursuit and saccadic systems usually compute a vector average of the visually evoked and microstimulation-induced velocity signals (pursuit, 55 of 122 experiments; saccades, 70 of 122). Microstimulation effects in a few experiments were consistent with vector summation of these two signals (pursuit, 6 of 122; saccades, 2 of 122). In the remainder of the experiments, microstimulation either caused an apparent impairment in motion processing (pursuit, 47 of 122; saccades, 41 of 122) or had no effect (pursuit, 14 of 122; saccades, 9 of 122). Within individual experiments, the effects on pursuit and saccades were usually similar, but the occasional striking exception suggests that the two eye movement systems may perform motion computations somewhat independently.
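The two candidate read-out rules tested against the data, vector averaging and vector summation, differ only in whether the combined signal is divided by the number of contributing signals. A minimal sketch with made-up velocity vectors (2-D, deg/s; not values from the experiments):

```python
import numpy as np

def vector_average(v_visual, v_stim):
    """Average of the visually evoked and stimulation-induced velocities."""
    return (np.asarray(v_visual) + np.asarray(v_stim)) / 2.0

def vector_sum(v_visual, v_stim):
    """Sum of the two velocity signals."""
    return np.asarray(v_visual) + np.asarray(v_stim)

# A stationary target (zero visual velocity) plus a rightward
# microstimulation-induced signal predicts pursuit under either rule,
# consistent with pursuit of a target that is not actually moving.
v_visual = np.array([0.0, 0.0])      # stationary target
v_stim = np.array([10.0, 0.0])       # hypothetical injected signal

avg = vector_average(v_visual, v_stim)   # half the injected speed
tot = vector_sum(v_visual, v_stim)       # the full injected speed
```

The distinguishing prediction is exactly this factor of two: with one signal at zero, averaging yields half the injected velocity while summation yields all of it, which is what lets the two rules be separated behaviorally.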
Taken together with emerging results in other visual and auditory areas, these findings suggest that neurons whose responses reflect complex interactions between stimulus position and eye position set the stage for the eventual convergence of auditory and visual information.
Advances in drug potency and tailored therapeutics are promoting pharmaceutical manufacturing to transition from a traditional batch paradigm to more flexible continuous processing. Here we report the development of a multistep continuous-flow CGMP (current good manufacturing practices) process that produced 24 kilograms of prexasertib monolactate monohydrate suitable for use in human clinical trials. Eight continuous unit operations were conducted to produce the target at roughly 3 kilograms per day using small continuous reactors, extractors, evaporators, crystallizers, and filters in laboratory fume hoods. Success was enabled by advances in chemistry, engineering, analytical science, process modeling, and equipment design. Substantial technical and business drivers were identified, which merited the continuous process. The continuous process afforded improved performance and safety relative to batch processes and also improved containment of a highly potent compound.