2019
DOI: 10.3390/sym11101264
Improving Human Motion Classification by Applying Bagging and Symmetry to PCA-Based Features

Abstract: This paper proposes a method for improving human motion classification by applying bagging and symmetry to Principal Component Analysis (PCA)-based features. In contrast to well-known bagging algorithms such as random forest, the proposed method recalculates the motion features for each “weak classifier” (it does not randomly sample a feature set). The proposed classification method was evaluated on a challenging (even to a human observer) motion capture recording dataset of martial arts techniques performed b…
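The abstract's key distinction — each bagged "weak classifier" gets its PCA features recomputed on its own bootstrap sample, rather than drawing from one fixed, randomly subsampled feature set — can be sketched as follows. This is a hypothetical illustration, not the authors' code: the estimator count, component count, choice of k-NN as the weak learner, and the synthetic stand-in data are all assumptions.

```python
# Hedged sketch: bagging where PCA features are recalculated per weak
# classifier on each bootstrap sample (contrast with random forest, which
# randomly subsamples a fixed feature set). All parameters are illustrative.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 30))    # stand-in for flattened MoCap features
y = rng.integers(0, 2, size=200)  # two motion classes (synthetic)

n_estimators, n_components = 10, 5
models = []
for _ in range(n_estimators):
    idx = rng.integers(0, len(X), size=len(X))        # bootstrap sample
    pca = PCA(n_components=n_components).fit(X[idx])  # features recomputed per learner
    clf = KNeighborsClassifier(n_neighbors=3).fit(pca.transform(X[idx]), y[idx])
    models.append((pca, clf))

def predict(X_new):
    # majority vote over the bagged (PCA, classifier) pairs
    votes = np.stack([clf.predict(pca.transform(X_new)) for pca, clf in models])
    return (votes.mean(axis=0) >= 0.5).astype(int)
```

Because every learner owns its own PCA model, the projection itself varies across the ensemble, which is the diversity mechanism the abstract contrasts with feature subsampling.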

Cited by 4 publications (9 citation statements)
References 28 publications
“…There is no official name for this approach; therefore, the authors may use various names for it, such as 'eigensequences' (Bottino, De Simone & Laurentini, 2007), 'signatures' (Billon, Nédélec & Tisseau, 2008) or they do not specifically mention its name at all. MoCap data that is used as an input for PCA is most often either three dimensional trajectories of body joints (Billon, Nédélec & Tisseau, 2008;Zago et al, 2017;Ko, Han & Newell, 2018;Choi, Ono & Hachimura, 2009) or angle-based features derived from the positions of those joints (Bottino, De Simone & Laurentini, 2007;Mantovani, Ravaschio, Piaggi & Landi, 2010;Choi, Sekiguchi & Hachimura, 2009;Choi, Sekiguchi & Hachimura, 2013;Das, Wilson, Lazarewicz & Finkel, 2006;Świtoński et al, 2011;Lee, Roan & Smith, 2009;Hachaj & Ogiela, 2018;Hachaj, 2019).…”
Section: Principal Component Analysis-Based Methods in Human Action Classification
confidence: 99%
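The statement above notes that the most common PCA input is either three-dimensional joint trajectories or joint-angle features. A minimal sketch of the trajectory variant, with assumed shapes (recordings, frames, joints are illustrative, not from any cited dataset): each recording is flattened to one vector and PCA projects it onto a few components, yielding the compact "eigensequence"-style features the quote describes.

```python
# Hedged sketch: PCA-based features from 3-D joint trajectories.
# Dataset dimensions are synthetic assumptions for illustration only.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
n_recordings, n_frames, n_joints = 40, 50, 20
# (recordings, frames, joints, xyz) stand-in for MoCap trajectories
trajectories = rng.normal(size=(n_recordings, n_frames, n_joints, 3))

X = trajectories.reshape(n_recordings, -1)       # one flat vector per recording
features = PCA(n_components=8).fit_transform(X)  # compact PCA-based features
print(features.shape)  # (40, 8)
```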
“…A classifier is applied after feature calculation, for example, a support vector machine (SVM) (Das, Wilson, Lazarewicz & Finkel, 2006;Zago et al, 2017;Hachaj, 2019) or the k-nearest neighbour method (Hachaj, 2019).…”
Section: Principal Component Analysis-Based Methods in Human Action Classification
confidence: 99%
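The quoted statement describes a two-stage design: PCA feature extraction followed by a classifier such as an SVM or k-NN. A minimal scikit-learn sketch of that pattern, on synthetic data with assumed shapes and parameters (nothing here is taken from the cited papers):

```python
# Hedged sketch: PCA feature extraction followed by an SVM classifier,
# the two-stage pattern described in the citation statement above.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 60))      # synthetic feature vectors
y = rng.integers(0, 3, size=120)    # three motion classes (synthetic)

# PCA reduces dimensionality; the SVM classifies in the reduced space
model = make_pipeline(PCA(n_components=10), SVC(kernel="rbf")).fit(X, y)
pred = model.predict(X)
```

Swapping `SVC` for `KNeighborsClassifier` gives the k-nearest-neighbour variant the quote also mentions.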
“…As we already mentioned in Section 1.2, PCA-based features are among the most popular approaches used in MoCap data analysis and classification. That approach is described, for example, in paper [55], so we will summarize only the basic concept.…”
Section: Head Gestures Recognition with PCA-Based Features
confidence: 99%
“…Recent developments in the healthcare industry help patients, especially the elderly, to avoid illness, accidents and disease [ 1 ]. Such strategies have introduced monitoring devices such as wearable, vision and marker-based sensors that secure, examine and improve human life in uncertain situations [ 2 , 3 ] while patients remain mobile. Wearable technology has replaced traditional diagnostics by delivering ubiquitous access to vital patient data via smartphones and wearable sensory clothing [ 4 , 5 ].…”
Section: Introduction
confidence: 99%