2015
DOI: 10.1371/journal.pone.0124414
Feature Selection for Wearable Smartphone-Based Human Activity Recognition with Able bodied, Elderly, and Stroke Patients

Abstract: Human activity recognition (HAR), using wearable sensors, is a growing area with the potential to provide valuable information on patient mobility to rehabilitation specialists. Smartphones with accelerometer and gyroscope sensors are a convenient, minimally invasive, and low cost approach for mobility monitoring. HAR systems typically pre-process raw signals, segment the signals, and then extract features to be used in a classifier. Feature selection is a crucial step in the process to reduce potentially larg…

Cited by 162 publications (118 citation statements)
References 23 publications
“…Capela et al [43] identified seven different meta-classes (or levels) of activities differing by the level of detail: Level 1: Mobile, and immobile (large movements and stairs labeled as mobile; sit, stand, lie, and small movements labeled as immobile); Level 2: Sit, and stand (not including small movements); Level 3: Sit, stand, and lie; Level 4: Large movements (going upstairs); Level 5: Ramp up, ramp down, large movements, stairs up, and stairs down; Level 6: Small movements (e.g., sitting, standing or lying); and Level 7: Transition states (transition between activities).…”
Section: Taxonomy Of Human Activities (mentioning)
confidence: 99%
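The hierarchical meta-classes quoted above amount to re-mapping fine-grained activity labels onto coarser classes. A minimal sketch of that relabelling, with hypothetical label names (the actual dataset labels may differ):

```python
# Illustrative Level-1 mapping (mobile vs. immobile) in the spirit of
# Capela et al. [43]; label strings here are assumptions, not the
# dataset's real annotations.
LEVEL_1 = {
    "walk": "mobile", "stairs_up": "mobile", "stairs_down": "mobile",
    "sit": "immobile", "stand": "immobile", "lie": "immobile",
    "small_movement": "immobile",
}

def relabel(raw_labels, mapping):
    """Map fine-grained activity labels onto a coarser meta-class level."""
    return [mapping.get(label, "unknown") for label in raw_labels]

print(relabel(["walk", "sit", "stairs_up"], LEVEL_1))
# → ['mobile', 'immobile', 'mobile']
```

The same list of raw labels can be passed through different mappings to produce each of the seven levels.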
“…Based on the extensive analysis of literature and features used by other authors (especially by Capela et al [43], Mathie et al [53], Zhang and Sawchuk [52]), we have extracted 99 features of data, which have been detailed in [54]. The feature ranking was performed using Kullback-Leibler divergence as class separability criterion on the human activity data from the USC-HAD dataset as described in [54].…”
Section: Features (mentioning)
confidence: 99%
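The Kullback-Leibler ranking described in [54] can be approximated as follows. This is a hedged sketch, not the cited implementation: it assumes binary classes, histogram-based density estimates, and a symmetrised divergence; function names and binning are illustrative.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-10):
    """Discrete KL divergence D(p || q), smoothed to avoid log(0)."""
    p = p + eps
    q = q + eps
    p = p / p.sum()
    q = q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def rank_features_kl(X, y, bins=20):
    """Rank features (best-separating first) by the symmetrised KL
    divergence between the two class-conditional histograms of each
    feature. Assumes a binary label vector y."""
    scores = []
    for j in range(X.shape[1]):
        edges = np.linspace(X[:, j].min(), X[:, j].max(), bins + 1)
        p, _ = np.histogram(X[y == 0, j], bins=edges)
        q, _ = np.histogram(X[y == 1, j], bins=edges)
        p, q = p.astype(float), q.astype(float)
        scores.append(kl_divergence(p, q) + kl_divergence(q, p))
    return np.argsort(scores)[::-1]
```

A feature whose class-conditional distributions barely overlap gets a large divergence and rises to the top of the ranking.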
“…A further pre-processing of the extracted features is needed to deal with the issue of feature heterogeneity before classification. This is done through feature normalisation, which is often applied in many machine learning applications (Sung et al. 2012; Capela et al. 2015). Normalisation of each feature in the activity features matrix obtained in Eq.…”
Section: Feature Normalisation (mentioning)
confidence: 99%
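The feature normalisation referenced here is commonly z-score standardisation. A minimal sketch assuming a NumPy windows-by-features matrix (the cited works may use a different scheme, e.g. min-max scaling):

```python
import numpy as np

def zscore_normalise(F, eps=1e-12):
    """Column-wise z-score normalisation of an activity feature matrix:
    each feature ends up with zero mean and unit variance across
    windows, so heterogeneous feature scales become comparable."""
    mu = F.mean(axis=0)
    sigma = F.std(axis=0)
    return (F - mu) / (sigma + eps)  # eps guards against constant columns
```

In practice the means and standard deviations are estimated on training data only and reused to transform test windows.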
“…RGB-D sensors) (Sung et al. 2011, 2012; Han et al. 2017) and non-visual sensors (e.g. wearable sensors) (Capela et al. 2015), which make it much easier to obtain information about activities. Although non-visual sensors have certain advan-…”
Section: Introduction (mentioning)
confidence: 99%
“…The respective authors proposed two different filter strategies, namely CFS and Relief-F feature selection, for recognizing the construction activity. Challita et al. … In other works, Capela et al. (2015) used Relief-F, CFS, and fast correlation-based methods to recognize human activity for three different types of users, namely able-bodied, elderly, and stroke patients. On the other hand, evolutionary algorithms (EA) have also gained attention among researchers for feature reduction.…”
Section: Related Work (mentioning)
confidence: 99%
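Relief-F, mentioned throughout these excerpts, weights features by contrasting each sampled instance with its nearest same-class neighbour ("hit") and nearest other-class neighbour ("miss"). The sketch below is a simplified binary Relief, not Capela et al.'s implementation; full Relief-F generalises to k neighbours and multiple classes.

```python
import numpy as np

def relief_weights(X, y, n_iter=100, rng=None):
    """Simplified binary Relief: reward features that differ on the
    nearest miss, penalise features that differ on the nearest hit."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    span = X.max(axis=0) - X.min(axis=0)  # scale diffs per feature
    span[span == 0] = 1.0
    w = np.zeros(d)
    for _ in range(n_iter):
        i = rng.integers(n)
        dist = np.abs(X - X[i]).sum(axis=1)  # L1 distance to instance i
        dist[i] = np.inf                     # never pick i itself
        same = (y == y[i])
        same[i] = False
        hit = np.argmin(np.where(same, dist, np.inf))
        miss = np.argmin(np.where(~same, dist, np.inf))
        w += (np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])) / span / n_iter
    return w
```

Features with high weights are kept; irrelevant features hover near zero because hits and misses differ on them about equally.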