Within a fraction of a second of viewing a face, we have already determined its gender, age and identity. A full understanding of this remarkable feat will require a characterization of the computational steps it entails, along with the representations extracted at each step. Here we used magnetoencephalography to ask which properties of a face are extracted when, and how early in processing these computations are affected by face familiarity. Subjects viewed images of familiar and unfamiliar faces varying orthogonally in gender and age. Using representational similarity analysis, we found that gender and age information emerged significantly earlier than identity information, followed by a late signature of familiarity. Importantly, gender and identity representations were enhanced for familiar faces early in processing. These findings begin to reveal the sequence of processing steps entailed in face perception in humans and suggest that early stages of face processing are tuned to familiar face features.
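To illustrate the kind of time-resolved representational similarity analysis referred to above, the following is a minimal sketch, not the study's actual pipeline: it assumes condition-averaged MEG sensor patterns and simple binary model RDMs for face properties such as gender. All variable and function names (`meg_patterns`, `model_rdm`, `timecourse_rsa`, the simulated data) are hypothetical.

```python
# Minimal time-resolved RSA sketch (illustrative assumptions, not the paper's code).
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr


def model_rdm(labels):
    """Binary model RDM: 1 where two conditions differ on the label, else 0."""
    labels = np.asarray(labels)
    return pdist(labels[:, None], metric=lambda a, b: float(a[0] != b[0]))


def timecourse_rsa(meg_patterns, model):
    """Correlate the neural RDM with a model RDM at every time point.

    meg_patterns : array (n_conditions, n_sensors, n_times)
        Condition-averaged MEG sensor patterns.
    model : array (n_pairs,)
        Condensed (vectorized) model RDM.
    Returns an array (n_times,) of Spearman correlations.
    """
    n_times = meg_patterns.shape[-1]
    rho = np.empty(n_times)
    for t in range(n_times):
        # Neural RDM at this time point: correlation distance between the
        # sensor patterns of every pair of conditions.
        neural = pdist(meg_patterns[:, :, t], metric="correlation")
        rho[t], _ = spearmanr(neural, model)
    return rho


# Example with simulated data: 16 face conditions, 100 sensors, 120 time samples.
rng = np.random.default_rng(0)
patterns = rng.standard_normal((16, 100, 120))
gender_labels = [0, 1] * 8                    # hypothetical gender per condition
gender_timecourse = timecourse_rsa(patterns, model_rdm(gender_labels))
print(gender_timecourse.shape)                # (120,)
```

Comparing the onset latencies of such time courses across model RDMs (e.g., gender, age, identity, familiarity) is one common way to ask which face property is represented when.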