People perceive and conceive of activity in terms of discrete events. Here we propose a theory according to which the perception of boundaries between events arises from ongoing perceptual processing and regulates attention and memory. Perceptual systems continuously make predictions about what will happen next. When transient errors in predictions arise, an event boundary is perceived. According to the theory, the perception of events depends on both sensory cues and knowledge structures that represent previously learned information about event parts and inferences about actors' goals and plans. Neurological and neurophysiological data suggest that representations of events may be implemented by structures in the lateral prefrontal cortex and that perceptual prediction error is calculated and evaluated by a processing pathway including the anterior cingulate cortex and subcortical neuromodulatory systems.
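The mechanism sketched in this abstract, continuous prediction with transient spikes in prediction error marking event boundaries, can be illustrated with a toy computational sketch. This is a minimal illustration only, not the authors' model: the persistence predictor, the spike-detection rule, and the threshold value are all hypothetical choices made for the example.

```python
import numpy as np

def segment_events(observations, predict, threshold=2.0):
    """Toy prediction-error-based event segmentation.

    `predict` is a hypothetical one-step forecaster. A boundary is marked
    whenever the current prediction error rises transiently above recent
    errors (mean plus `threshold` standard deviations), loosely mirroring
    the idea that transient prediction errors signal event boundaries.
    """
    errors, boundaries = [], []
    for t in range(1, len(observations)):
        predicted = predict(observations[:t])            # forecast next input
        error = abs(observations[t] - predicted)         # prediction error
        errors.append(error)
        if len(errors) > 5:                              # need a short history
            recent = np.array(errors[-6:-1])             # previous 5 errors
            if error > recent.mean() + threshold * (recent.std() + 1e-6):
                boundaries.append(t)                     # transient spike -> boundary
    return boundaries

# Example: a simple "persistence" predictor (next input = current input)
# applied to a signal with one abrupt change around t = 50.
rng = np.random.default_rng(0)
activity = np.concatenate([np.full(50, 0.0), np.full(50, 3.0)]) + rng.normal(0, 0.1, 100)
print(segment_events(activity, lambda history: history[-1]))  # ~[50]
```

The persistence predictor stands in for whatever perceptual forecasting system is assumed; any predictor that tracks the ongoing activity would produce the same qualitative pattern of a transient error spike at the change point.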
Recent work on event perception suggests that perceptual processing increases when events change. An important question is how such changes influence the way other information is processed, particularly during dual-task performance. In this study, participants monitored a long series of distractor items for an occasional target as they simultaneously encoded unrelated background scenes. The appearance of an occasional target could have two opposite effects on the secondary task: it could draw attention away from the secondary task, or, as a change in the ongoing event, it could improve secondary task performance. Results were consistent with the second possibility. Memory for scenes presented simultaneously with the targets was better than memory for scenes that preceded or followed the targets. This effect was observed when the primary detection task involved visual feature oddball detection, auditory oddball detection, and visual color-shape conjunction detection. It was eliminated when the detection task was omitted and when the task required an arbitrary response mapping. The appearance of occasional, task-relevant events appears to trigger a temporal orienting response that facilitates processing of concurrently attended information (the attentional boost effect).
Memory for naturalistic events over short delays is important for visual scene processing, reading comprehension, and social interaction. The research presented here examined relations between how an ongoing activity is perceptually segmented into events and how those events are remembered a few seconds later. In several studies participants watched movie clips that presented objects in the context of goal-directed activities. Five seconds after an object was presented, the clip paused for a recognition test. Performance on the recognition test depended on the occurrence of perceptual event boundaries. Objects that were present when an event boundary occurred were better recognized than other objects, suggesting that event boundaries structure the contents of memory. This effect was strongest when an object's type was tested, but was also observed for objects' perceptual features. Memory also depended on whether an event boundary occurred between presentation and test; this variable produced complex interactive effects that suggested that the contents of memory are updated at event boundaries. These data indicate that perceptual event boundaries have immediate consequences for what, when, and how easily information can be remembered.
One way to understand something is to break it up into parts. New research indicates that segmenting ongoing activity into meaningful events is a core component of perception, with consequences for memory and learning. Behavioral and neuroimaging data suggest that event segmentation is automatic and that people spontaneously segment activity into hierarchically organized parts and subparts. This segmentation depends on the bottom-up processing of sensory features such as movement, and on the top-down processing of conceptual features such as actors' goals. How people segment activity affects what they remember later; as a result, those who identify appropriate event boundaries during perception tend to remember more and learn more proficiently.
Substantial research has focused on the allocation of spatial attention based on goals or perceptual salience. In everyday life, however, people also direct attention using their previous experience. Here we investigate the pace at which people incidentally learn to prioritize specific locations. Participants searched for a T among Ls in a visual search task. Unbeknownst to them, the target was more often located in one region of the screen than in other regions. An attentional bias toward the rich region developed over dozens of trials. However, the bias did not rapidly readjust to new contexts. It persisted for at least a week and for hundreds of trials after the target’s position became evenly distributed. The persistence of the bias did not reflect a long window over which visual statistics were calculated. Long-term persistence differentiates incidentally learned attentional biases from the more flexible goal-driven attention.