2016
DOI: 10.1587/transinf.2015edp7082

Quantitative Assessment of Facial Paralysis Based on Spatiotemporal Features

Abstract: Facial paralysis is a common clinical condition, occurring in 30 to 40 patients per 100,000 people per year. A quantitative tool to support medical diagnosis is needed. This paper proposes a simple, visual, and robust method that objectively measures the degree of facial paralysis using spatiotemporal features. The main contribution of this paper is an effective spatiotemporal feature extraction method based on the tracking of landmarks. Our method overcomes the drawback…
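The abstract describes extracting spatiotemporal features from tracked facial landmarks to quantify paralysis. The paper's exact feature definition is not given here, so the following is only a minimal illustrative sketch, not the authors' method: it assumes landmarks tracked over a video as a `(T, N, 2)` array and scores left-right movement asymmetry, since a paralyzed side typically moves less than the healthy side. The landmark index lists and the normalization are hypothetical choices.

```python
import numpy as np

def asymmetry_score(landmarks, left_idx, right_idx):
    """Illustrative left-right asymmetry measure over a video clip.

    landmarks: array of shape (T, N, 2) -- N 2-D facial landmarks
               tracked over T frames (hypothetical input format).
    left_idx / right_idx: index lists of mirrored landmark pairs.
    Returns a scalar in [0, 1]: 0 = perfectly symmetric motion,
    values near 1 = one side barely moves.
    """
    landmarks = np.asarray(landmarks, dtype=float)
    # Per-frame displacement of each landmark from its position
    # in the first (rest) frame.
    disp = landmarks - landmarks[0]            # (T, N, 2)
    mag = np.linalg.norm(disp, axis=-1)        # (T, N)
    # Compare movement magnitudes of mirrored pairs.
    left = mag[:, left_idx]
    right = mag[:, right_idx]
    eps = 1e-8  # avoid division by zero in still frames
    return float(np.mean(np.abs(left - right) / (left + right + eps)))
```

A fully symmetric motion yields a score near 0, while freezing one side of the face pushes the score toward 1; a real system would aggregate such per-expression scores into a clinical grade.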

Cited by 8 publications (1 citation statement)
References 11 publications (16 reference statements)
“…In related studies, various approaches like landmarks [4,6], facial action units [7,8], eye movement features [9][10][11], and convolutional neural networks (CNN) [12][13][14] are used in the feature extraction stage from the FP images and videos. This paper uses novel approaches to detect FP and its grade by combining multiple facial features (landmarks, facial action units, and eye movement features) to achieve better FP detection.…”
Section: Introduction
confidence: 99%