The sleeping brain retains some residual information-processing capacity. Although direct evidence is scarce, a substantial literature suggests that the phase of slow oscillations during deep sleep is an important determinant of stimulus processing. Here, we introduce an algorithm for predicting slow oscillations in real time. Using this approach to present stimuli targeted at oscillatory up and down states, we show that neural stimulus processing depends strongly on the slow-oscillation phase. During ensuing wakefulness, however, we did not observe differential brain or behavioral responses to these stimulus categories, suggesting that no enduring memories were formed. We speculate that while simpler forms of learning may occur during sleep, neocortically based memories are not readily established during deep sleep.
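The abstract does not describe the prediction algorithm itself. As a rough illustration of the general closed-loop idea used in this line of work (detect a putative down state when the filtered EEG drops below a negative amplitude threshold, locate its trough, then predict the following up state a fixed half-period later), here is a minimal sketch on a toy signal. The function name, sampling rate, threshold, and assumed slow-oscillation frequency are all illustrative assumptions, not the authors' implementation, and a real pipeline would first band-pass filter the raw EEG.

```python
import math

FS = 100          # sampling rate in Hz (assumed)
SO_FREQ = 1.0     # assumed slow-oscillation frequency in Hz
THRESHOLD = -0.8  # down-state amplitude threshold, arbitrary units (assumed)

def predict_up_states(signal, fs=FS, threshold=THRESHOLD, so_freq=SO_FREQ):
    """Stream through `signal`; for each excursion below `threshold`
    (putative down state), find its trough and predict the next up state
    half an oscillation period after the trough. Returns predicted
    sample indices (hypothetical helper, for illustration only)."""
    half_period = int(fs / (2 * so_freq))
    predictions = []
    in_down = False
    trough_i, trough_v = None, None
    for i, x in enumerate(signal):
        if x < threshold:
            if not in_down or x < trough_v:
                trough_i, trough_v = i, x  # track the deepest point so far
            in_down = True
        elif in_down:
            # down state just ended: schedule a stimulus at the predicted up state
            predictions.append(trough_i + half_period)
            in_down = False
    return predictions

# toy "EEG": a pure 1 Hz slow oscillation, 3 s at 100 Hz
sig = [math.sin(2 * math.pi * SO_FREQ * t / FS) for t in range(3 * FS)]
ups = predict_up_states(sig)
```

On this noiseless sine, each predicted index lands on an oscillation peak, i.e. the up state; with real EEG the period would have to be estimated from recent data rather than fixed in advance.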
Visual perception starts with localized filters that subdivide the image into fragments that undergo separate analyses. The visual system must then reconstruct objects by grouping image fragments that belong to the same object. A widely held view is that perceptual grouping occurs in parallel across the visual scene and without attention. To test this idea, we measured the speed of grouping in pictures of animals and vehicles. In a classification task, these pictures were categorized efficiently. In an image-parsing task, participants reported whether two cues fell on the same or different objects, and we measured reaction times. Despite the participants' fast object classification, perceptual grouping required more time when the distance between cues was larger, and we observed an additional delay when the cues fell on different parts of a single object. Parsing was also slower for inverted than for upright objects. These results imply that perception starts with rapid object classification, followed by a serial perceptual grouping phase that is more efficient for objects in a familiar orientation than for objects in an unfamiliar one.
Are locations or colors more effective cues for biasing attention? We addressed this question with a visual search task that featured an associative priming manipulation. Observers indicated which target appeared in a search array. Unknown to them, one target appeared at the same location more often and a second target appeared in the same color more often. Both location and color biases facilitated performance, but location biases benefited the selection of all targets, whereas color biases benefited only the associated target. The generalized benefit of location biases suggests that locations are the more effective cues to attention.