2017
DOI: 10.5709/acp-0223-1

Gender Classification Based on Eye Movements: A Processing Effect During Passive Face Viewing

Abstract: Studies have revealed superior face recognition skills in females, partially due to their different eye movement strategies when encoding faces. In the current study, we utilized these slight but important differences and proposed a model that estimates the gender of the viewers and classifies them into two subgroups, males and females. An eye tracker recorded participants' eye movements while they viewed images of faces. Regions of interest (ROIs) were defined for each face. Results showed that the gender dis…
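
The abstract's recipe — per-face ROIs, per-viewer eye-movement statistics, a two-class model — can be illustrated with a minimal sketch. Everything below (the ROI names, the toy data, and the logistic-regression classifier) is an assumption for illustration, not the authors' actual model:

```python
# Minimal sketch (NOT the paper's model): summarize each viewer by the share
# of viewing time spent in each face ROI, then fit any two-class classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression

ROIS = ["left_eye", "right_eye", "nose", "mouth"]  # assumed ROI set

def dwell_vector(fixations):
    """fixations: list of (roi_name, duration_ms) -> normalized dwell-time vector."""
    totals = dict.fromkeys(ROIS, 0.0)
    for roi, dur in fixations:
        if roi in totals:
            totals[roi] += dur
    grand = sum(totals.values()) or 1.0
    return np.array([totals[r] / grand for r in ROIS])

# toy sessions for two hypothetical viewers; labels 0 = male, 1 = female
sessions = [
    [("left_eye", 310), ("nose", 180), ("mouth", 90)],
    [("right_eye", 420), ("left_eye", 350), ("nose", 60)],
]
X = np.vstack([dwell_vector(s) for s in sessions])
y = np.array([0, 1])
clf = LogisticRegression().fit(X, y)
print(clf.predict(X))  # sanity check on the toy data
```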


Cited by 29 publications (16 citation statements)
References 26 publications
“…More closely related to our work, several researchers have shown that gender and age can be inferred from eye movements, e.g. by analysing the spatial distribution of gaze on images like faces [Cantoni et al. 2015; Sammaknejad et al. 2017].…”
Section: Information Available In Eye Movements
Mentioning confidence: 74%
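
As a concrete illustration of "analysing the spatial distribution of gaze", here is a sketch under assumed screen dimensions and grid size, not the cited papers' exact features:

```python
# Turn raw (x, y) fixation coordinates into a coarse spatial-density map that
# any classifier can consume. Screen size and bin count are assumptions.
import numpy as np

def gaze_histogram(points, width=1280, height=1024, bins=8):
    """points: iterable of (x, y) fixations -> flattened, normalized density map."""
    pts = np.asarray(points, dtype=float)
    hist, _, _ = np.histogram2d(
        pts[:, 0], pts[:, 1],
        bins=bins, range=[[0, width], [0, height]],
    )
    return (hist / max(hist.sum(), 1.0)).ravel()

print(gaze_histogram([(640, 400), (650, 390), (300, 700)]).shape)  # (64,)
```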
“…Indeed, cognitive processes can be observed through eye movements and offer a wealth of information related to internal processes (Itti, 2015; Coutrot, Hsiao, & Chan, 2018). Inference from gaze data consists in deducing subjective characteristics solely from ocular data, such as age (Le Meur et al., 2017b), gender (Coutrot, Binetti, Harrison, Mareschal, & Johnston, 2016; Sammaknejad, Pouretemad, Eslahchi, Salahirad, & Alinejad, 2017), mental states and traits (Liao, Zhang, Zhu, & Ji, 2005; Hoppe, Loetscher, Morey, & Bulling, 2015; Yamada & Kobayashi, 2017; Hoppe, Loetscher, Morey, & Bulling, 2018), expertise and skill proficiency (Eivazi & Bednarik, 2011; Boccignone, Ferraro, Crespi, Robino, & de'Sperati, 2014; Tien et al., 2014; Kolodziej, Majkowski, Francuz, Rak, & Augustynowicz, 2018), and neurological disorders (Kupas, Harangi, Czifra, & Andrassy, 2017; Terao, Fukuda, & Hikosaka, 2017). It has proven useful in identifying autism spectrum disorder (Pierce et al., 2016), fetal alcohol spectrum disorder (Tseng, Paolozza, Munoz, Reynolds, & Itti, 2013), dementia (Zhang et al., 2016; Beltrán, García-Vázquez, Benois-Pineau, Gutierrez-Robledo, & Dartigues, 2018), dyslexia (Benfatto et al., 2016), anxiety (Abbott, Shirali, Haws, & Lack, 2017), mental fatigue (Yamada & Kobayashi, 2017), and other disorders. It has also been applied to task detection (Borji & Itti, 2014; Haji-Abolhassani & Clark, 2014; Kanan, Ray, Bseiso, Hsiao, & Cottrell, 2014; Boisvert & Bruce, 2016).…”
Section: Inference From Gaze Data
Mentioning confidence: 99%
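
The excerpt above describes inferring traits "solely from ocular data". A hedged sketch of the kind of oculomotor summary features such studies typically start from (the feature names and the 500 ms threshold are illustrative assumptions, not any cited paper's pipeline):

```python
# Scalar eye-movement descriptors of the sort often fed to downstream classifiers.
import numpy as np

def oculomotor_features(fix_durations_ms, saccade_amplitudes_deg):
    d = np.asarray(fix_durations_ms, dtype=float)
    a = np.asarray(saccade_amplitudes_deg, dtype=float)
    return {
        "mean_fix_dur_ms": float(d.mean()),
        "std_fix_dur_ms": float(d.std()),
        "mean_sacc_amp_deg": float(a.mean()),
        "long_fix_frac": float((d > 500).mean()),  # assumed 500 ms cutoff
    }

print(oculomotor_features([220, 340, 180, 510], [2.1, 5.4, 3.3]))
```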
“…We selected these two types of models for their dissimilarities as classifiers, which may lead us to learn separate information about our experimental data. Markov models have shown their effectiveness when applied to gaze data (e.g., Simola, Salojärvi, & Kojo, 2008; Kanan, Bseiso, Ray, Hsiao, & Cottrell, 2015; Coutrot et al., 2016; Rai et al., 2016; Sammaknejad et al., 2017). They have been extensively used for modeling time series in general (Camastra & Vinciarelli, 2008).…”
Section: Classifier Models
Mentioning confidence: 99%
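
To make the Markov-model idea from this excerpt concrete: fit one ROI-transition matrix per group, then score a new scanpath by its log-likelihood under each. The ROI alphabet, toy scanpaths, and Laplace smoothing below are assumptions for illustration, not the cited papers' exact formulations:

```python
# Fit first-order Markov transition matrices over ROI sequences and classify
# a scanpath by which group's model assigns it higher log-likelihood.
import numpy as np

ROIS = ["eyes", "nose", "mouth"]          # assumed ROI alphabet
IDX = {r: i for i, r in enumerate(ROIS)}

def fit_transitions(scanpaths, alpha=1.0):
    """scanpaths: list of ROI-name sequences -> row-stochastic matrix (Laplace-smoothed)."""
    counts = np.full((len(ROIS), len(ROIS)), alpha)
    for path in scanpaths:
        for a, b in zip(path, path[1:]):
            counts[IDX[a], IDX[b]] += 1.0
    return counts / counts.sum(axis=1, keepdims=True)

def log_likelihood(path, T):
    return sum(np.log(T[IDX[a], IDX[b]]) for a, b in zip(path, path[1:]))

# toy per-group training scanpaths
T_male = fit_transitions([["nose", "eyes", "nose", "mouth"]])
T_female = fit_transitions([["eyes", "eyes", "nose", "eyes"]])
new_path = ["eyes", "nose", "eyes"]
print("female" if log_likelihood(new_path, T_female) > log_likelihood(new_path, T_male) else "male")
```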
“…It has been pointed out that eye tracking data can even be associated with mental disorders, such as Alzheimer's disease [23], Parkinson's disease [30], and schizophrenia [17]. Furthermore, eye tracking data holds rich personal information, including personality traits [18], gender [50], and user identity [8].…”
Section: Gaze-based Human-computer Interaction
Mentioning confidence: 99%