Early models of face processing proposed that invariant facial cues indicating a person's sex, race, and age were processed separately from variant cues such as emotional expression. Similarly, early theories of emotion perception suggested that the processing of emotional expressions was unaffected by the situations in which the expressions were encountered. Subsequent research has demonstrated that this is not the case: the processing of emotional expressions is influenced by a range of contextual factors, as well as by other social category cues present on the face. However, the manner in which the multiple sources of information on a face are integrated, and how this integration is influenced by situational factors, is not yet fully understood. Thus, the overall aim of this thesis was to extend our understanding of the influence of higher order cognitive states on the interaction of facial cues and social categories, specifically in the processing of emotional expressions.

This thesis describes a series of investigations, conducted across a range of methods (including affective priming, visual search, and categorization), into how explicitly and implicitly activated higher order cognitive states influence the interaction of multiple facial cues and categories. Higher order cognitive states were manipulated explicitly by instructing participants to focus on different kinds of information available within the face. They were elicited implicitly by altering the other faces seen at the same time as the target face, the other faces seen on different trials within the same task, or the other faces seen in recently completed tasks.

Using the affective priming method, Chapter 2 demonstrated that implicit evaluations were more strongly influenced by the emotional expression displayed on the face primes than by the social category of the face, whether race, sex, or age. An influence of the social category was observed only when participants were instructed to focus on this dimension. This demonstrated that the nature of the task can influence the way in which cues such as race, sex, age, and emotion interact.

Using the visual search paradigm, Chapter 3 demonstrated that the nature of the background faces in a visual search task alters which expressions are detected more quickly. Happy faces were detected faster in heterogeneous backgrounds made up of a range of different emotional expressions, whereas angry faces tended to be detected faster in homogeneous backgrounds made up of faces expressing the same emotion.

In Chapter 4.1, it was shown that the way facial cues of race influence the categorization of happy and angry emotional expressions depended on the presentation duration, the stimulus type, and, importantly, the number of different faces presented within a task. Chapter 4.2 investigated whether this finding could be accounted for by the perceptual load hypothesis and found it to be an unlikely explanation for the different patterns of results observed at small and large set sizes.

Finally, Chapter 5 demonstrated that ...