We show that faces contain much more information about sexual orientation than can be perceived and interpreted by the human brain. We used deep neural networks to extract features from 35,326 facial images. These features were entered into a logistic regression aimed at classifying sexual orientation. Given a single facial image, a classifier could correctly distinguish between gay and heterosexual men in 81% of cases, and in 71% of cases for women. Human judges achieved much lower accuracy: 61% for men and 54% for women. The accuracy of the algorithm increased to 91% and 83%, respectively, given five facial images per person. Facial features employed by the classifier included both fixed (e.g., nose shape) and transient facial features (e.g., grooming style). Consistent with the prenatal hormone theory of sexual orientation, gay men and women tended to have gender-atypical facial morphology, expression, and grooming styles. Prediction models aimed at gender alone allowed for detecting gay males with 57% accuracy and gay females with 58% accuracy. Those findings advance our understanding of the origins of sexual orientation and the limits of human perception. Additionally, given that companies and governments are increasingly using computer vision algorithms to detect people's intimate traits, our findings expose a threat to the privacy and safety of gay men and women.