It is well established that the neural activity engaged during memory retrieval varies with the kind of information that is recovered. Less well established is whether this activity reflects the online recovery of information or processes operating downstream of successful recovery. We used event-related potentials (ERPs) to adjudicate between these alternatives, reasoning that an online recovery account would be supported if material-specific indices of successful retrieval occurred no later than a material-independent index of recollection, the left-parietal ERP old/new effect. A contrast between the ERP correlates of successful memory retrieval for words and for faces revealed material-specific neural activity with an onset as early as that of the left-parietal old/new effect. These findings suggest that material-specific neural activity indexes the online recovery of encoded information.
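The timing logic behind this argument (material-specific activity onsetting no later than the left-parietal effect) can be illustrated with a simple onset-estimation sketch. The criterion rule, sampling rate, and simulated difference waves below are illustrative assumptions, not the authors' analysis pipeline.

```python
import numpy as np

# Illustrative sketch: estimate the onset of an ERP difference wave as the
# first time point at which it exceeds a criterion for N consecutive samples.
# Sampling rate, criterion, and run length are assumptions for illustration.
SFREQ = 250.0            # Hz
EPOCH_START_MS = -200.0  # epoch begins 200 ms before stimulus onset

def onset_ms(difference_wave, criterion, n_consecutive=13):
    """Return the latency (ms) of the first run of `n_consecutive` samples
    exceeding `criterion`, or None if no such run exists."""
    above = difference_wave > criterion
    run = 0
    for i, flag in enumerate(above):
        run = run + 1 if flag else 0
        if run == n_consecutive:
            start = i - n_consecutive + 1
            return EPOCH_START_MS + 1000.0 * start / SFREQ
    return None

# Hypothetical old-minus-new difference waves at two scalp sites.
rng = np.random.default_rng(1)
material_specific = rng.normal(scale=0.2, size=300)
left_parietal = rng.normal(scale=0.2, size=300)
material_specific[150:] += 1.0   # simulated effect emerging around 400 ms
left_parietal[175:] += 1.0       # simulated effect emerging around 500 ms

# The online-recovery account predicts the first onset is no later than the second.
print(onset_ms(material_specific, criterion=0.5),
      onset_ms(left_parietal, criterion=0.5))
```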
The retrieval processes supporting recognition memory for faces were investigated using event-related potentials (ERPs). Analyses focused on ERP old/new effects: differences between the neural activity elicited by correct judgments to old (studied) and to new (unstudied) test stimuli. In two experiments, three old/new effects were identified that behaved as neural indices of the process of recollection, and in both experiments one old/new effect behaved as an index of the process of familiarity. These outcomes are relevant to the ongoing debate about the functional significance of ERP old/new effects and to the implications that scalp-recorded electrophysiological data have for theories of the processes supporting long-term memory judgments.
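As an illustration of how an old/new effect of this kind is commonly quantified, the sketch below computes a difference between trial-averaged ERPs for correct old and correct new judgments and measures its mean amplitude in a left-parietal time window. The array shapes, channel index, sampling rate, and 500-800 ms window are illustrative assumptions, not details taken from these experiments.

```python
import numpy as np

# Assumed (hypothetical) inputs: single-trial epochs shaped
# (n_trials, n_channels, n_samples), sampled at 250 Hz, each epoch spanning
# -200 to 1000 ms relative to test-stimulus onset.
SFREQ = 250.0
EPOCH_START_S = -0.2

def mean_amplitude(epochs, channel_idx, t_start_s, t_end_s):
    """Trial-average the epochs, then return the mean amplitude of the
    chosen channel within the given time window (in seconds)."""
    erp = epochs.mean(axis=0)                        # (n_channels, n_samples)
    start = int(round((t_start_s - EPOCH_START_S) * SFREQ))
    end = int(round((t_end_s - EPOCH_START_S) * SFREQ))
    return erp[channel_idx, start:end].mean()

# Hypothetical data standing in for correct-old and correct-new trials.
rng = np.random.default_rng(0)
old_hits = rng.normal(size=(80, 32, 300))            # 80 trials, 32 channels
new_correct_rejections = rng.normal(size=(80, 32, 300))

# Old/new effect: old minus new mean amplitude at an assumed left-parietal
# electrode (index chosen arbitrarily here), 500-800 ms post-stimulus.
LEFT_PARIETAL_CH = 12
old_amp = mean_amplitude(old_hits, LEFT_PARIETAL_CH, 0.5, 0.8)
new_amp = mean_amplitude(new_correct_rejections, LEFT_PARIETAL_CH, 0.5, 0.8)
print(f"Left-parietal old/new effect (old minus new): {old_amp - new_amp:.3f}")
```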
Faces are typically recognized via holistic processing, in which individual features are combined into an overall facial representation. When faces are inverted, however, recognition relies more on featural processing, in which faces are recognized from their individual features. These findings rest on a substantial number of studies using two-dimensional (2D) faces, and it is unknown whether they extend to three-dimensional (3D) faces, which carry depth information absent from the typical 2D stimuli used in the face recognition literature. The current study used the face inversion paradigm to investigate how holistic and featural processing are differentially influenced by 2D and 3D faces. Twenty-five participants completed a delayed face-matching task with upright and inverted faces presented as both 2D and 3D stereoscopic images. Recognition accuracy was significantly higher for 3D upright faces than for 2D upright faces, supporting the view that the enriched visual information in 3D stereoscopic images facilitates the holistic processing essential for recognizing upright faces. Typical face inversion effects were obtained regardless of whether faces were presented in 2D or 3D, and recognition performance for 2D and 3D inverted faces did not differ. Taken together, these results demonstrate that stereoscopic depth influences face recognition during holistic processing but not during featural processing, offering a novel perspective on how the integration of stereoscopic information in 3D faces shapes face recognition mechanisms.
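As a sketch of how the accuracy contrasts described above are commonly tested, the snippet below runs paired t-tests on hypothetical per-participant accuracy for the 2D/3D by upright/inverted cells of a within-subjects design. The data values are simulated assumptions, not the study's results.

```python
import numpy as np
from scipy import stats

# Hypothetical per-participant accuracy (proportion correct) for a
# 2 (format: 2D/3D) x 2 (orientation: upright/inverted) within-subjects design.
rng = np.random.default_rng(2)
n = 25
acc = {
    "2D_upright": rng.normal(0.80, 0.05, n),
    "3D_upright": rng.normal(0.85, 0.05, n),
    "2D_inverted": rng.normal(0.65, 0.05, n),
    "3D_inverted": rng.normal(0.65, 0.05, n),
}

# Key paired contrasts mirroring the abstract: a 3D advantage for upright
# faces, and no 2D/3D difference for inverted faces.
t_up, p_up = stats.ttest_rel(acc["3D_upright"], acc["2D_upright"])
t_inv, p_inv = stats.ttest_rel(acc["3D_inverted"], acc["2D_inverted"])
print(f"Upright 3D vs 2D:  t={t_up:.2f}, p={p_up:.3f}")
print(f"Inverted 3D vs 2D: t={t_inv:.2f}, p={p_inv:.3f}")
```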