Traditional approaches to human information processing tend to deal with perception and action planning in isolation, so that an adequate account of the perception-action interface is still missing. On the perceptual side, the dominant cognitive view largely underestimates, and thus fails to account for, the impact of action-related processes on both the processing of perceptual information and perceptual learning. On the action side, most approaches conceive of action planning as a mere continuation of stimulus processing, thus failing to account for the goal-directedness of even the simplest reaction in an experimental task. We propose a new framework for a more adequate theoretical treatment of perception and action planning, in which perceptual contents and action plans are coded in a common representational medium by feature codes with distal reference. Perceived events (perceptions) and to-be-produced events (actions) are equally represented by integrated, task-tuned networks of feature codes – cognitive structures we call event codes. We give an overview of evidence from a wide variety of empirical domains, such as spatial stimulus-response compatibility, sensorimotor synchronization, and ideomotor action, showing that our main assumptions are well supported by the data.
This contribution is devoted to the question of whether action-control processes may be demonstrated to influence perception. This influence is predicted from a framework in which stimulus processing and action control are assumed to share common codes, thus possibly interfering with each other. In 5 experiments, a paradigm was used that required a motor action during the presentation of a stimulus. The participants were presented with masked right- or left-pointing arrows shortly before executing an already prepared left or right keypress response. We found that the identification probability of the arrow was reduced when the to-be-executed reaction was compatible with the presented arrow. For example, the perception of a right-pointing arrow was impaired when presented during the execution of a right response as compared with that of a left response. The theoretical implications of this finding as well as its relation to other, seemingly similar phenomena (repetition blindness, inhibition of return, psychological refractory period) are discussed.
When subjects are asked to determine where a fast-moving stimulus enters a window, they typically do not localize the stimulus at the edge, but at some later position within that window (Fröhlich effect). We report five experiments that explored this illusion. An attentional account is tested, assuming that the entrance of the stimulus into the window initiates a focus shift toward it. While this shift is under way, the stimulus moves into the window. Because the first phenomenal (i.e., explicitly reportable) representation of the stimulus will not be available before the end of the focus shift, the stimulus is perceived at some later position. In Experiment 1, we established the Fröhlich effect and showed that its size depends on stimulus parameters such as movement speed and movement direction. In Experiments 2 and 3, we examined the influence of eye movements and tested whether the effect changed when the stimuli were presented within a structured background or when they started from different eccentricities. In Experiments 4 and 5, specific predictions from the attentional model were tested: In Experiment 4 we showed that the processing of the moving stimulus benefits from a preceding peripheral cue indicating the starting position of the subsequent movement, which induces a preliminary focus shift to the position where the moving stimulus would appear. As a consequence, the Fröhlich effect was reduced. Using a detection task in Experiment 5, we showed that feature information about the moving stimulus is lost when it falls into the critical interval of the attention shift. In conclusion, the present attentional account shows that selection mechanisms are not exclusively space based; rather, they can establish a spatial representation that is also used for perceptual judgment -- that is, selection mechanisms can be space establishing as well.
The judged final position of a moving stimulus has been suggested to be shifted in the direction of motion because of mental extrapolation (representational momentum). However, a perceptual explanation is possible: The eyes overshoot the final position of the target, and because of a foveal bias, the judged position is shifted in the direction of motion. To test this hypothesis, the authors replicated previous studies, but instead of having participants indicate where the target vanished, the authors probed participants' perceptual focus by presenting probe stimuli close to the vanishing point. Identification of probes in the direction of target motion was more accurate immediately after target offset than it was with a delay. Another experiment demonstrated that judgments of the final position of a moving target are affected by whether the eyes maintain fixation or follow the target. The results are more consistent with a perceptual explanation than with a memory account.