The own-race bias (ORB) is a well-known finding wherein people are better able to recognize and discriminate own-race faces relative to cross-race faces. In 2 experiments, participants viewed Asian and Caucasian faces, in preparation for recognition memory tests, while their eye movements and pupil diameters were continuously monitored. In Experiment 1 (with Caucasian participants), systematic differences emerged in both measures as a function of depicted race: While encoding cross-race faces, participants made fewer (and longer) fixations, they preferentially attended to different sets of features, and their pupils were more dilated, all relative to own-race faces. In both measures, a pattern also emerged wherein some participants reduced their apparent encoding effort for cross-race faces over trials. In Experiment 2 (with Asian participants), the authors observed the same patterns, although the ORB favored the opposite set of faces. Taken together, the results suggest that the ORB appears during initial perceptual encoding. Relative to own-race face encoding, cross-race encoding requires greater effort, which may reduce vigilance in some participants.
Younger and older adults' visual scan patterns were examined as they passively viewed younger and older neutral faces. Both participant age groups tended to look longer at own-age than at other-age faces. In addition, both age groups reported more exposure to own-age than to other-age individuals. Importantly, the own-age bias in visual inspection of faces and the own-age bias in self-reported amount of exposure to young and older individuals in everyday life, but not explicit age stereotypes or implicit age associations, significantly and independently predicted the own-age bias in later old/new face recognition. We suggest these findings reflect the increased personal and social relevance of, and more accessible and elaborated schemas for, own-age than other-age faces.

Human faces provide information critical for social interactions. Some of the information extracted from faces (e.g., expression, race, or age) affects how faces are encoded and remembered (Bäckman, 1991; Ebner & Johnson, 2009; Meissner & Brigham, 2001). For instance, people of different ages are more likely to attend to, and are faster and more accurate in recognizing, faces of their own than of another age group (Anastasi & Rhodes, 2005; Ebner & Johnson, 2010; Lamont, Stewart-Williams, & Podd, 2005; see Harrison & Hole, 2009, for an overview). There are several factors that may predict the own-age bias in face recognition, as discussed below.

VISUAL INSPECTION OF OWN-AGE AND OTHER-AGE FACES

Differential attention can be reflected in patterns of looking at faces (Buswell, 1935; Isaacowitz, Wadlinger, Goren, & Wilson, 2006; Knight, Seymour, Gaunt, Baker, Nesmith, et al., 2007), and visual scan patterns can affect encoding and recognition of faces (Henderson, Williams, & Falk, 2005). For example, face recognition is impaired when eye movements during face encoding are restricted to the center of a face instead of allowing free sampling of facial features and their interrelations (Henderson et al., 2005). Younger and older adults differ in how they visually scan faces: Whereas younger adults look more at eyes than at mouths, older adults show the reverse pattern on an emotional expression identification task (Murphy & Isaacowitz, 2010; Sullivan, Ruffman, & Hutton, 2007; Wong, Cronin-Golomb, & Neargarder, 2005; but see Ebner, He, & Johnson, in press). But do younger and older adults scan faces of their own age group differently from faces of the other age group, and if so, do differences in scan pattern predict the own-age bias in later face recognition? To our knowledge, the only study that addressed these questions asked younger and older adults to rate the quality of pictures of younger and older faces and to evaluate the age of the faces (Firestone, Turk-Browne, & Ryan, 2007). Under these conditions, there was no indication of an own-age bias in visual inspection of faces. Rather, overall looking time, number of fixati...
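The key result above is, at the participant level, a multiple regression: the own-age bias in recognition is modeled from the own-age bias in looking time and the own-age bias in self-reported exposure. The sketch below illustrates that kind of analysis in Python on simulated data; the sample size, variable names, and effect sizes are hypothetical and are not taken from the study.

    # Illustrative sketch only (not the authors' analysis code): predict each
    # participant's own-age recognition bias from the own-age bias in looking
    # time and in self-reported exposure. All data below are simulated.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 60                                   # hypothetical number of participants
    look_bias = rng.normal(0.0, 1.0, n)      # own-age minus other-age looking time (z-scored)
    exposure_bias = rng.normal(0.0, 1.0, n)  # own-age minus other-age reported exposure (z-scored)
    recog_bias = 0.4 * look_bias + 0.3 * exposure_bias + rng.normal(0.0, 1.0, n)

    # Ordinary least squares: recog_bias ~ intercept + look_bias + exposure_bias
    X = np.column_stack([np.ones(n), look_bias, exposure_bias])
    coefs, *_ = np.linalg.lstsq(X, recog_bias, rcond=None)
    print(dict(zip(["intercept", "look_bias", "exposure_bias"], coefs.round(3))))

In this framing, "independently predicted" corresponds to each predictor retaining a non-zero coefficient when both are entered into the same model.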
We investigated how age of faces and emotion expressed in faces affect young (n = 30) and older (n = 20) adults’ visual inspection while viewing faces and judging their expressions. Overall, expression identification was better for young than older faces, suggesting that interpreting expressions in young faces is easier than in older faces, even for older participants. Moreover, there were age-group differences in misattributions of expressions, in that young participants were more likely to label disgusted faces as angry, whereas older adults were more likely to label angry faces as disgusted. In addition to effects of emotion expressed in faces, age of faces affected visual inspection of faces: Both young and older participants spent more time looking at own-age than other-age faces, with longer looking at own-age faces predicting better own-age expression identification. Thus, cues used in expression identification may shift as a function of emotion and age of faces, in interaction with age of participants.
The neural correlates of the perception of faces from different races were investigated. White participants performed a gender identification task in which Asian, Black, and White faces were presented while event-related potentials (ERPs) were recorded. Participants also completed an implicit association test for Black (IAT-Black) and Asian (IAT-Asian) faces. ERPs evoked by Black and White faces differed, with Black faces evoking a larger positive ERP that peaked at 168 ms over the frontal scalp and White faces evoking a larger negative ERP that peaked at 244 ms. These Black/White ERP differences significantly correlated with participants' scores on the IAT-Black. ERPs also differentiated White from Asian faces, and a significant correlation was obtained between the White-Asian ERP difference waves at ~500 ms and the IAT-Asian. A positive ERP at 116 ms over the occipital scalp differentiated all three races but was not correlated with either IAT. In addition, a late positive component (around 592 ms) was greater for same-race than for either other-race faces, suggesting potentially more extended or deeper processing of same-race faces. Taken together, the ERP/IAT correlations observed for both other races indicate the influence of a race-sensitive evaluative process that may include earlier, more automatic and/or implicit processes and relatively later, more controlled processes.

Keywords: face perception; race; implicit association test (IAT); event-related potentials (ERP)

Human faces are rich sources of information for guiding interpersonal impressions and interactions (Blair, Judd, Sadler, & Jenkins, 2002; Ekman, 1989; Willis & Todorov, 2006; Young, McWeeny, Hay, & Ellis, 1986). The extraction of information from faces occurs very quickly, and research has shown that an exposure of 100 ms to a face is sufficient to form an impression of a person (Willis & Todorov, 2006). Moreover, characteristics such as race, age, and sex are reliably extracted from facial features. Race is a particularly important characteristic and can elicit spontaneous activation of stereotypes and prejudices (Blair, Judd, & Fallman, 2004; Smith-McLallen, Johnson, Dovidio, & Pearson, 2006). Here, we investigated the relation between implicit racial attitudes and the perception of faces from different races, focusing on the time course of this relationship. Several prior studies have investigated the timeline of attending to racial cues from human faces (Ito, Thompson, & Cacioppo, 2004; Ito & Urland, 2003). Recently, studies of cross-race face processing have attempted to determine when in time facial features that differentiate racial groups and racial attitudes held by an observer influence the neural processing of faces and what brain structures are sensitive to these v...
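The reported ERP/IAT relationships are, across participants, correlations between a difference-wave amplitude in a time window and an IAT score. The sketch below illustrates that computation in Python on simulated data; the time window, sample size, and all values are hypothetical placeholders rather than the authors' pipeline.

    # Illustrative sketch only: correlate a per-participant Black-minus-White
    # ERP difference amplitude (mean amplitude in a window around 168 ms) with
    # an IAT D score. All data below are simulated.
    import numpy as np

    def mean_amplitude(wave, times, t_start, t_end):
        """Mean amplitude of a 1-D waveform within [t_start, t_end] (seconds)."""
        mask = (times >= t_start) & (times <= t_end)
        return wave[mask].mean()

    rng = np.random.default_rng(1)
    times = np.linspace(-0.1, 0.8, 451)           # hypothetical epoch, -100 to 800 ms
    n_subjects = 25
    iat_black = rng.normal(0.5, 0.3, n_subjects)  # hypothetical IAT D scores

    amps = []
    for d in iat_black:
        # Simulated difference wave whose ~168 ms peak scales with the IAT score
        diff_wave = d * np.exp(-((times - 0.168) ** 2) / 0.002) + rng.normal(0.0, 0.2, times.size)
        amps.append(mean_amplitude(diff_wave, times, 0.14, 0.20))

    r = np.corrcoef(iat_black, amps)[0, 1]        # Pearson r across participants
    print(f"r = {r:.2f}")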