2019 Chinese Control Conference (CCC)
DOI: 10.23919/chicc.2019.8866311
Facial Expression Recognition using Convolutional Neural Network on Graphs

Cited by 17 publications (3 citation statements); References 11 publications.
“…Wu et al. [274] proposed a new method for facial expression recognition using a Graph Convolutional Network (GCN), which is capable of processing non-Euclidean structured data. The methodology entails the construction of an undirected graph from facial images, achieved by combining both fixed and random points.…”
Section: Graphs | Citation type: mentioning | Confidence: 99%
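The statement above describes building an undirected face graph from fixed landmarks plus random points and processing it with a GCN. The following is a minimal sketch of that idea, not the authors' implementation: the k-nearest-neighbour connectivity, the point counts, and the use of coordinates as node features are all assumptions made here for illustration.

```python
import numpy as np

def build_face_graph(fixed_pts, n_random=10, k=4, seed=0):
    """Combine fixed facial landmarks with random points and connect
    each node to its k nearest neighbours with undirected edges."""
    rng = np.random.default_rng(seed)
    random_pts = rng.uniform(0.0, 1.0, size=(n_random, 2))
    pts = np.vstack([fixed_pts, random_pts])           # (N, 2) node coordinates
    dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    adj = np.zeros((len(pts), len(pts)))
    for i in range(len(pts)):
        for j in np.argsort(dist[i])[1:k + 1]:         # nearest neighbours, skipping self
            adj[i, j] = adj[j, i] = 1.0                # undirected edge
    return pts, adj

def gcn_layer(h, adj, w):
    """One graph-convolution layer: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W)."""
    a_hat = adj + np.eye(len(adj))                     # add self-loops
    d_inv_sqrt = np.diag(1.0 / np.sqrt(a_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ a_hat @ d_inv_sqrt @ h @ w, 0.0)

# Toy usage: five "fixed" landmarks (eyes, nose tip, mouth corners) in [0, 1]^2.
fixed = np.array([[0.3, 0.4], [0.7, 0.4], [0.5, 0.55], [0.35, 0.75], [0.65, 0.75]])
pts, adj = build_face_graph(fixed)
w = np.random.default_rng(1).normal(size=(2, 8))       # feature transform
out = gcn_layer(pts, adj, w)                           # node coordinates as input features
print(out.shape)                                       # (15, 8)
```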
“…Orthognathic surgery has been studied for its effects on both facial expression and mental health [38]. Computer-aided facial expression categorization was presented by Wu et al. [39]. They employed a convolutional neural network to classify normal and abnormal faces.…”
Section: Literature Review | Citation type: mentioning | Confidence: 99%
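For the binary normal/abnormal face classification mentioned in the statement above, a minimal CNN sketch is shown below. The architecture, input resolution, and channel counts are assumptions for illustration, not the network used in [39].

```python
import torch
import torch.nn as nn

# Hypothetical two-class face CNN: the layer sizes are illustrative only.
cnn = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * 12 * 12, 2),            # logits: normal vs. abnormal face
)
faces = torch.randn(4, 1, 48, 48)          # batch of 48x48 grayscale face crops
print(cnn(faces).shape)                    # torch.Size([4, 2])
```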
“…In the testing phase, only the learned AGCNs are used for final prediction, as shown in Fig. 5. GCNs have been applied to various tasks, such as skeleton-based human action recognition [32,29,39] and facial landmark-based emotion recognition [27,37,38]. In this study, we apply an adaptive GCN (AGCN) [32], in which the topology of the graph can be learned, to the detected body joints and facial landmarks for seizure classification.…”
Section: Introduction | Citation type: mentioning | Confidence: 99%
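The key idea in the statement above is a GCN whose graph topology is learned rather than fixed. The sketch below illustrates one way to realize that: the adjacency is a free parameter trained end-to-end. The node count, softmax row-normalisation, and initialisation are assumptions made here, not the AGCN of [32].

```python
import torch
import torch.nn as nn

class AdaptiveGCNLayer(nn.Module):
    """Graph-convolution layer with a learnable adjacency matrix, in the
    spirit of adaptive GCNs: the topology is optimised with the weights
    instead of being fixed by the skeleton/landmark layout."""
    def __init__(self, n_nodes, in_dim, out_dim):
        super().__init__()
        # Learnable adjacency, initialised near uniform connectivity.
        self.adj = nn.Parameter(torch.full((n_nodes, n_nodes), 1.0 / n_nodes))
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x):                    # x: (batch, n_nodes, in_dim)
        a = torch.softmax(self.adj, dim=-1)  # row-normalise the learned graph
        return torch.relu(a @ self.lin(x))

# Toy usage: e.g. 17 body joints + 68 facial landmarks, 2-D coordinates each.
layer = AdaptiveGCNLayer(n_nodes=85, in_dim=2, out_dim=16)
frames = torch.randn(4, 85, 2)               # batch of 4 pose/landmark frames
print(layer(frames).shape)                   # torch.Size([4, 85, 16])
```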