Summary

Change detection is a popular task used to study visual short-term memory (STM) in humans [1–4]. Much of this work suggests that STM has a fixed capacity of 4 ± 1 items [1–6]. Here we report the first comparison of change-detection memory between humans and a closely related species, the rhesus monkey. Monkeys and humans were tested in nearly identical procedures with overlapping display sizes. Although the monkeys' STM was well fit by a 1-item fixed-capacity memory model, other monkey memory tests with 4-item lists have shown performance impossible to obtain with a 1-item capacity [7]. We suggest that this contradiction can be resolved using a continuous-resource approach more closely tied to the neural basis of memory [8,9]. In this view, items have a noisy memory representation whose noise level depends on display size because a continuous resource is distributed across the items. In accord with this theory, we show that performance depends on the perceptual distance between items before and after the change, and that d′ depends on display size in an approximately power-law fashion. Our results open the door to combining the power of psychophysics, computation, and physiology to better understand the neural basis of STM.
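The two model classes contrasted in this abstract make different quantitative predictions, which can be sketched numerically. This is an illustrative sketch only, not the authors' analysis code; the parameter values (capacity, guess rate, d′ at display size 1, power-law exponent) are assumptions chosen for demonstration.

```python
# Illustrative sketch of a k-item fixed-capacity model versus a
# continuous-resource (power-law d') account of change detection.
# All parameter values are assumptions, not fitted values from the paper.

def fixed_capacity_hit_rate(n_items, capacity=1, guess_rate=0.5):
    """Probability of detecting a change when memory holds at most
    `capacity` items and unremembered items are guessed at `guess_rate`."""
    p_in_memory = min(capacity / n_items, 1.0)
    return p_in_memory + (1.0 - p_in_memory) * guess_rate

def power_law_dprime(n_items, dprime_at_1=3.0, alpha=0.7):
    """Continuous-resource account: sensitivity declines smoothly with
    display size, approximately as a power law d'(N) = d'(1) * N**(-alpha)."""
    return dprime_at_1 * n_items ** (-alpha)

for n in (1, 2, 4, 8):
    print(n, fixed_capacity_hit_rate(n), power_law_dprime(n))
```

Note the qualitative difference: the fixed-capacity model predicts chance-level guessing for all items beyond capacity, while the power-law account predicts graded sensitivity at every display size.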
Certain sounds, such as fingernails screeching down a chalkboard, have a strong association with somatosensory percepts. To assess the influence of audition on somatosensory perception, three experiments measured how task-irrelevant auditory stimuli alter detection rates for near-threshold somatosensory stimuli. In Experiment 1, we showed that a simultaneous auditory stimulus increases sensitivity, but not response bias, in the detection of an electrical cutaneous stimulus delivered to the hand. Experiment 2 demonstrated that this enhancement of somatosensory perception is spatially specific: only monaural sounds on the same side increased detection. Experiment 3 revealed that the effects of audition on touch are also frequency dependent: only sounds with the same frequency as the vibrotactile stimulus enhanced tactile detection. These results indicate that auditory information influences touch perception in highly systematic ways and suggest that similar coding mechanisms may underlie the processing of information from these different sensory modalities.
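The separation of sensitivity from response bias in Experiment 1 rests on standard signal detection theory. As a minimal sketch (the hit and false-alarm rates below are invented for illustration, not the study's data):

```python
# Signal-detection sketch: sensitivity (d') and criterion (c) computed
# from hit and false-alarm rates via the inverse standard normal CDF.
from statistics import NormalDist

def dprime_and_criterion(hit_rate, fa_rate):
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(fa_rate)
    criterion = -0.5 * (z(hit_rate) + z(fa_rate))
    return d_prime, criterion

# Hypothetical rates: a concurrent sound that raises hits without raising
# false alarms shows up as an increase in d', i.e., genuine sensitivity.
d_no_sound, c_no_sound = dprime_and_criterion(0.60, 0.20)
d_sound, c_sound = dprime_and_criterion(0.75, 0.20)
```

A pure shift in response bias, by contrast, would move hits and false alarms together, changing c while leaving d′ roughly constant.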
Creating three-dimensional (3D) representations of the world from two-dimensional retinal images is fundamental to many visually guided behaviors, including reaching and grasping. A critical component of this process is determining the 3D orientation of objects. Previous studies have shown that neurons in the caudal intraparietal area (CIP) of the macaque monkey represent 3D planar surface orientation (i.e., slant and tilt). Here we compare the responses of neurons in areas V3A (which is implicated in 3D visual processing and which precedes CIP in the visual hierarchy) and CIP to 3D oriented planar surfaces. We then examine whether activity in these areas correlates with perception during a fine slant discrimination task in which monkeys report whether the top of a surface is slanted towards or away from them. Although we find that V3A and CIP neurons show similar sensitivity to planar surface orientation, significant choice-related activity during the slant discrimination task is rare in V3A but prominent in CIP. These results implicate both V3A and CIP in the representation of 3D surface orientation, and suggest a functional dissociation between the areas based on slant-related decision signals.

Significance Statement

Surface orientation perception is fundamental to visually guided behaviors such as reaching, grasping, and navigation. Previous studies implicate the caudal intraparietal area (CIP) in the representation of 3D surface orientation. Here we show that responses to 3D oriented planar surfaces are similar in CIP and V3A, which precedes CIP in the cortical hierarchy. However, we also find a qualitative distinction between the two areas: only CIP neurons show robust choice-related activity during a fine visual orientation discrimination task.
Six pigeons were trained in a change detection task with four colors. They were shown two colored circles on a sample array, followed by a test array with the color of one circle changed. The pigeons learned to choose the changed color and transferred their performance to four unfamiliar colors, suggesting that they had learned a generalized concept of color change. They also transferred performance to test delays several times their 50-msec training delay without prior delay training. The accurate delay performance of several seconds suggests that their change detection was memory based, as opposed to a perceptual attentional capture process. These experiments are the first to show that an animal species (pigeons, in this case) can learn a change detection task identical to ones used to test human memory, thereby providing the possibility of directly comparing short-term memory processing across species.
Two monkeys learned a color change-detection task in which two colored circles (selected from a 4-color set) were presented on a 4×4 invisible matrix. Following a delay, the correct response was to touch the circle whose color had changed. The monkeys' learning, color transfer, and delay transfer were compared to a similar experiment with pigeons. Monkeys, like pigeons, showed full transfer to four novel colors and to delays as long as 6.4 s, suggesting that they remembered the colors rather than relying on a perception-based attentional-capture process that may operate only at very short delays. The monkeys and pigeons were further tested to compare transfer to other change dimensions. Monkeys transferred to shape and location changes, unlike the pigeons, but neither species transferred to size changes. Thus, monkeys were less restricted than pigeons in the domains over which they detected change, but both species learned the basic task and appear suitable for comparative studies of visual short-term memory.