Empathy, theory of mind and imitation, all key building blocks of human social cognition, are thought to rely mechanistically on overlapping representations between the self and others, but the extent and informational content of such overlap have been difficult to quantify experimentally. Here, we report a novel psychophysical paradigm in which real photographs of participants' own faces can be manipulated algorithmically to generate arbitrary facial expressions. Using reverse correlation, we show that we can reconstruct the sensory representations that subserved participants' perception of their own and another person's smiling face, in a way that can be compared within and across participants. Using this procedure, we show that participants' mental representations of their own smiling face generally matched the representations they had of others', but were unrelated to how participants actually smiled. Strikingly, similarities between self- and other-representations correlated with participants' empathy and alexithymia: participants high in empathy mentally represented others as more similar to themselves, and participants low in alexithymia more accurately represented how they actually smiled.
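For readers unfamiliar with reverse correlation, the sketch below illustrates the general logic of a classification-image analysis, not the specific pipeline used in this study: on each two-alternative trial the observer chooses whichever noise-perturbed stimulus looks more like the target expression, and averaging the chosen minus the rejected noise fields approximates the observer's internal template. The Gaussian pixel-noise stimulus model, image sizes, and all variable names here are illustrative assumptions.

```python
# Illustrative sketch of a reverse-correlation (classification-image) analysis.
# Assumptions (not from the paper): stimuli are noise fields added to a base face,
# and on each two-alternative trial the observer picks the variant that looks more
# like a smile. The classification image is the mean of the chosen noise fields
# minus the mean of the rejected ones.
import numpy as np

rng = np.random.default_rng(0)

def simulate_trials(template, n_trials=2000, noise_sd=1.0):
    """Simulate an observer whose choices are driven by an internal template."""
    h, w = template.shape
    chosen, rejected = [], []
    for _ in range(n_trials):
        noise_a = rng.normal(0.0, noise_sd, size=(h, w))
        noise_b = rng.normal(0.0, noise_sd, size=(h, w))
        # The simulated observer picks the noise field that matches its internal
        # template more closely (plus a little decision noise).
        score_a = np.sum(noise_a * template) + rng.normal(0.0, 1.0)
        score_b = np.sum(noise_b * template) + rng.normal(0.0, 1.0)
        if score_a >= score_b:
            chosen.append(noise_a); rejected.append(noise_b)
        else:
            chosen.append(noise_b); rejected.append(noise_a)
    return np.stack(chosen), np.stack(rejected)

def classification_image(chosen, rejected):
    """Chosen-minus-rejected average noise: an estimate of the internal template."""
    return chosen.mean(axis=0) - rejected.mean(axis=0)

if __name__ == "__main__":
    # A toy 'smile' template: positive weights in the lower half of the image.
    template = np.zeros((32, 32))
    template[20:28, 8:24] = 1.0

    chosen, rejected = simulate_trials(template)
    ci = classification_image(chosen, rejected)

    # The recovered classification image should correlate with the template.
    r = np.corrcoef(ci.ravel(), template.ravel())[0, 1]
    print(f"correlation between classification image and template: {r:.2f}")
```

Under these assumptions, self- and other-templates estimated this way can be compared within and across participants, for example by pixelwise correlation of their classification images.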