2022
DOI: 10.3390/s22134755

Semi-Supervised Adversarial Learning Using LSTM for Human Activity Recognition

Abstract: The training of Human Activity Recognition (HAR) models requires a substantial amount of labeled data. Unfortunately, despite being trained on enormous datasets, most current models perform poorly when evaluated against anonymous data from new users. Furthermore, due to the limits and problems of working with human users, capturing adequate data for each new user is not feasible. This paper presents semi-supervised adversarial learning using the LSTM (long short-term memory) approach for human act…

Cited by 9 publications (7 citation statements). References 57 publications.
“…The analysis is made up of three sections: concurrent activity recognition, interleaved activity recognition, and the recognition average. The overall recognition accuracy is then contrasted with other state-of-the-art approaches, including CNN [ 44 ], long short-term memory (LSTM) [ 45 ], and synchronized long short-term memory (Syn-LSTM) [ 46 ].…”
Section: Activity Recognition Results Analysis and Evaluation (mentioning)
confidence: 99%
“…Some examples of traditional machine learning were K-Nearest Neighbours (KNN) [18,25,26], Random Forest (RF) [26,27], Support Vector Machine (SVM) [18,26], Multi-Layer Perceptron [28], and AdaBoost [29], while deep learning algorithms included CNN [19], LSTM [30], RNN [31], and others. Previous research [32,33] showed that some machine learning algorithms produced different performances, even with the same dataset.…”
Section: Related Work (mentioning)
confidence: 99%
“…Despite the irreplaceable advantages of traditional feature-based machine learning suggested in Section 2.2, deep learning is increasingly demonstrating its powerful adaptive capabilities. Besides [ 12 ], this Special Issue contains three more articles on deep learning [ 13 , 14 , 15 ], offering us multiple dimensions of thinking: the training of HAR models requires a large corpus of annotated data. Most current models are not robust when facing anonymized data from new users; meanwhile, capturing each new subject’s data is usually not possible.…”
Section: Overview of the Contributions (mentioning)
confidence: 99%
“…Most current models are not robust when facing anonymized data from new users; meanwhile, capturing each new subject’s data is usually not possible. Yang et al. described semi-supervised adversarial learning using the long short-term memory (LSTM) approach for HAR [ 13 ], which trains on both labeled and anonymous data by adapting semi-supervised learning paradigms and capitalizing on adversarial learning to strengthen the model’s ability to handle errors. The device-free, privacy-protected, and light-insensitive characteristics have pushed WiFi-based HAR technology into the limelight.…”
Section: Overview of the Contributions (mentioning)
confidence: 99%
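To make the idea of semi-supervised adversarial learning with an LSTM concrete, the following is a minimal PyTorch sketch assuming a GAN-style setup with a K+1-way LSTM classifier (K activity classes plus one "fake" class): labeled windows are trained with standard cross-entropy, unlabeled (anonymous) windows are pushed away from the "fake" class, and generated windows are pushed toward it. The module names, tensor shapes, and loss weighting here are illustrative assumptions, not the architecture of the cited paper [ 13 ].

```python
# Minimal sketch of semi-supervised adversarial training with an LSTM classifier.
# Assumption: a GAN-style K+1-class scheme; sizes and names below are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

SEQ_LEN, N_FEATS, N_CLASSES, HIDDEN, NOISE_DIM = 128, 6, 6, 64, 32  # assumed sizes

class LSTMClassifier(nn.Module):
    """LSTM encoder followed by a K+1-way head (K activities + 1 'fake' class)."""
    def __init__(self):
        super().__init__()
        self.lstm = nn.LSTM(N_FEATS, HIDDEN, batch_first=True)
        self.head = nn.Linear(HIDDEN, N_CLASSES + 1)
    def forward(self, x):
        _, (h, _) = self.lstm(x)           # h: (num_layers, batch, HIDDEN)
        return self.head(h[-1])            # logits: (batch, N_CLASSES + 1)

class Generator(nn.Module):
    """Maps noise to a synthetic sensor window of shape (batch, SEQ_LEN, N_FEATS)."""
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(NOISE_DIM, SEQ_LEN * N_FEATS)
    def forward(self, z):
        return torch.tanh(self.fc(z)).view(-1, SEQ_LEN, N_FEATS)

D, G = LSTMClassifier(), Generator()
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
FAKE = N_CLASSES                            # index of the extra 'fake' class

def train_step(x_lab, y_lab, x_unlab):
    """One update on a labeled batch, an unlabeled batch, and generated samples."""
    batch = x_unlab.size(0)
    x_fake = G(torch.randn(batch, NOISE_DIM))

    # Classifier: supervised loss on labeled data + real/fake loss on the rest.
    opt_d.zero_grad()
    loss_sup = F.cross_entropy(D(x_lab), y_lab)
    real_logits, fake_logits = D(x_unlab), D(x_fake.detach())
    p_real = 1.0 - F.softmax(real_logits, dim=1)[:, FAKE]       # P(not fake) for unlabeled
    loss_unsup = (-torch.log(p_real + 1e-8).mean()
                  + F.cross_entropy(fake_logits,
                                    torch.full((batch,), FAKE, dtype=torch.long)))
    (loss_sup + loss_unsup).backward()
    opt_d.step()

    # Generator: fool the classifier into treating fakes as real activity windows.
    opt_g.zero_grad()
    p_gen_real = 1.0 - F.softmax(D(x_fake), dim=1)[:, FAKE]
    loss_gen = -torch.log(p_gen_real + 1e-8).mean()
    loss_gen.backward()
    opt_g.step()
    return loss_sup.item(), loss_unsup.item(), loss_gen.item()

# Smoke test with random tensors standing in for sensor windows.
x_l = torch.randn(8, SEQ_LEN, N_FEATS); y_l = torch.randint(0, N_CLASSES, (8,))
x_u = torch.randn(8, SEQ_LEN, N_FEATS)
print(train_step(x_l, y_l, x_u))
```

In this sketch the unlabeled (anonymous) windows contribute only through the real-versus-fake term, which is how the semi-supervised signal reaches the LSTM classifier without requiring labels for new users.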