2016 International Conference on 3D Imaging (IC3D)
DOI: 10.1109/ic3d.2016.7823448

Body expression recognition from animated 3D skeleton

Abstract: We present a novel and generic framework for the recognition of body expressions using human postures. Motivated by the state of the art in psychology, our approach recognizes expressions by analyzing sequences of poses. The features proposed in this article are computationally simple and intuitive to understand. They are based on visual cues and provide the in-depth understanding of body postures required to recognize body expressions. We have evaluated our approach on different databases with heterogene…

Cited by 33 publications (65 citation statements)
References 21 publications
“…• Karg et al [26]: This feature-based classification method uses PCA to classify gait features related to shoulder, neck, and thorax angles, stride length, and velocity. This method only …

Method | Accuracy
Karg et al [26] | 39.58%
Venture et al [62] | 30.83%
Crenn et al [13] | 66.22%
Crenn et al [14] | 40.63%
Daoudi et al [16] | 42.52%
Our Method (PEP) | 70.04%

Table 2: Accuracy: Our method, using gait features (including posture and movement features) and an SVM classifier, achieves an accuracy of 70.04% for emotion classification. We obtain considerable improvement over prior methods.…”
Section: Emotion Classification Accuracy
confidence: 95%
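The comparison above describes a feature-based pipeline: gait features (joint angles, stride length, velocity) feed a classifier, with the SVM variant scoring highest. A minimal sketch of that pattern using scikit-learn's SVC on synthetic data (the feature layout, class count, and all values here are hypothetical, not taken from any of the cited papers):

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical gait feature vectors, one row per walking sequence:
# [stride_length, velocity, shoulder_angle, neck_angle, thorax_angle]
X = rng.normal(size=(40, 5))
# Hypothetical labels for 4 emotion classes (e.g. happy, sad, angry, neutral)
y = rng.integers(0, 4, size=40)

# RBF-kernel SVM, standing in for the SVM classifier the snippet mentions
clf = SVC(kernel="rbf").fit(X, y)
accuracy = float((clf.predict(X) == y).mean())
```

In practice the reported accuracies come from held-out evaluation, not training-set accuracy as in this toy sketch.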
“…• Angle: Head tilt is used to distinguish between happy and sad emotions [13,26]. We model this using five scalar values corresponding to the angles: (1) at the neck by the shoulder joints, (2) at the shoulder joints by the neck and the other shoulder, (3) at the neck by the vertical direction and the back, and (4) at the neck by the head and the back.…”
Section: Application: Prediction Of Perceived Emotion
confidence: 99%
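Angle features like those listed above can be computed directly from 3D joint positions as the angle at a vertex joint between two neighboring joints. A small sketch (the joint names and coordinates are illustrative, not the paper's skeleton definition):

```python
import numpy as np

def joint_angle(joint, a, b):
    """Angle in radians at `joint`, formed by the directions toward points a and b."""
    u = np.asarray(a, float) - np.asarray(joint, float)
    v = np.asarray(b, float) - np.asarray(joint, float)
    cos = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    # Clip guards against tiny floating-point overshoot outside [-1, 1]
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

# e.g. the angle at the neck formed by the two shoulder joints (feature 1 above)
neck = [0.0, 1.5, 0.0]
l_shoulder = [-0.2, 1.4, 0.0]
r_shoulder = [0.2, 1.4, 0.0]
theta = joint_angle(neck, l_shoulder, r_shoulder)
```

The same helper covers the other listed angles by swapping in the relevant joint triples (e.g. neck with head and back for head tilt).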
“…A total of 572 motions were recorded by the Vicon system with 18 cameras at 120 Hz. Their data were in fact reused in several approaches: Crenn et al [4] used them for style recognition, Holden et al [15] used them for editing, Yumer and Mitra [8] as well as Holden et al [16] used them for style transfer.…”
Section: How the Data Are Used in Motion Style Approaches
confidence: 99%
“…The content of this paper has to be seen in the context of animation and its multiple applications such as entertainment (feature films, video games), human computer interaction, education, health, scientific visualization and simulation, video surveillance, etc. [1]- [4]. Moreover, it focuses on approaches dealing with 3D skeletal data of human body.…”
Section: Introduction
confidence: 99%
“…The F-score is a measure of a test's accuracy. Finally, Crenn et al [9] proposed a new set of two-level 3D descriptors based on psychological studies. They proposed low-level features based on visual cues, and high-level features that are statistical operators applied to reduce the feature vector size and compact the information.…”
Section: Related Work
confidence: 99%
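For context on the F-score mentioned in the last snippet: the F1 score is the harmonic mean of precision and recall, a special case (beta = 1) of the general F-beta score. A small sketch with made-up precision/recall values:

```python
def f_score(precision, recall, beta=1.0):
    """F-beta score; beta=1 gives the harmonic mean of precision and recall."""
    if precision == 0.0 and recall == 0.0:
        return 0.0
    b2 = beta * beta
    return (1.0 + b2) * precision * recall / (b2 * precision + recall)

# Hypothetical values: 80% precision, 60% recall
f1 = f_score(0.8, 0.6)  # 2 * 0.48 / 1.4, approximately 0.686
```

Beta > 1 weights recall more heavily, beta < 1 weights precision more heavily.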