Online social networks (OSNs) have witnessed significant growth over the past decade. Millions of people now share their thoughts, emotions, preferences, opinions and aesthetic information in the form of images, videos, music, texts, blogs and emoticons. Recently, owing to the existence of person-specific traits in such media data, researchers have begun to investigate these traits for biometric pattern analysis and recognition. Until now, gender recognition from image aesthetics has not been explored in the biometric community. In this paper, the authors present a model for gender recognition based on the discriminating visual features found in users' favorite images. They validate the model on a publicly shared database of 24,000 images provided by 120 users of Flickr, an image-based OSN. The authors propose a method based on a mixture-of-experts model to estimate the discriminating hyperplane in a 56-dimensional aesthetic feature space, with experts based on k-nearest neighbor, support vector machine and decision tree classifiers. To improve the model's accuracy, they apply systematic feature selection using a statistical two-sample t-test. Moreover, the authors provide a statistical analysis of each feature, with graph visualizations showing its discriminating behavior between male and female users. The proposed method achieves 77% accuracy in predicting gender, 5% better than recently reported results.
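As a rough illustration of the pipeline the abstract describes, the sketch below combines two-sample t-test feature selection over a 56-dimensional aesthetic feature space with a simple voting ensemble of k-NN, SVM and decision tree experts, using scikit-learn and SciPy. This is not the authors' implementation; the significance threshold, hyperparameters, voting scheme and variable names are assumptions made for the example.

```python
# Minimal sketch, assuming aggregated per-user aesthetic features and 0/1 gender labels.
import numpy as np
from scipy.stats import ttest_ind
from sklearn.ensemble import VotingClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier


def select_features(X, y, alpha=0.05):
    """Keep features whose male/female distributions differ significantly
    under a two-sample t-test (p < alpha). alpha is an assumed threshold."""
    male, female = X[y == 0], X[y == 1]
    _, pvals = ttest_ind(male, female, axis=0, equal_var=False)
    return np.where(pvals < alpha)[0]


def build_gender_model():
    """Majority-vote ensemble of the three expert types named in the abstract;
    hyperparameters are placeholders, not values from the paper."""
    return VotingClassifier(
        estimators=[
            ("knn", KNeighborsClassifier(n_neighbors=5)),
            ("svm", SVC(kernel="rbf", probability=True)),
            ("tree", DecisionTreeClassifier(max_depth=10)),
        ],
        voting="soft",
    )


# Hypothetical usage: X_train/X_test are (n_users, 56) aesthetic feature
# matrices, y_train/y_test are binary gender labels.
# idx = select_features(X_train, y_train)
# model = build_gender_model().fit(X_train[:, idx], y_train)
# accuracy = model.score(X_test[:, idx], y_test)
```

A soft-voting ensemble is only one way to combine experts; the paper's mixture-of-experts formulation may weight the experts differently, so treat this as a structural sketch rather than a reproduction of the reported 77% result.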