2023
DOI: 10.3390/s23177363

Intelligent Localization and Deep Human Activity Recognition through IoT Devices

Abdulwahab Alazeb,
Usman Azmat,
Naif Al Mudawi
et al.

Abstract: Ubiquitous computing has been an evergreen research area that has managed to attract and sustain the attention of researchers for some time now. As ubiquitous computing applications, human activity recognition and localization have also been widely studied. These applications are used in healthcare monitoring, behavior analysis, personal safety, and entertainment. A robust model has been proposed in this article that works over IoT data extracted from smartphone and smartwatch sensors to recognize the activiti…

Cited by 18 publications (1 citation statement)
References 73 publications
“…Here it is important to note that after optimization, we got two feature vectors, one for localization activities and the second for locomotion activities. We plotted two feature vectors the original versus optimized for Walking, Sitting, and Lying activities using only a few features including ( Alazeb et al, 2023 ), FFT-Min/Max, Shannon entropy, and Kurtosis over the Extrasensory dataset in Figure 10 . The transformation is defined as …”
Section: Methods
confidence: 99%
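The citing statement names a handful of window-level features (FFT min/max, Shannon entropy, kurtosis) computed over inertial-sensor data. A minimal sketch of how such features might be computed for one signal window is shown below; the function name and parameters are illustrative assumptions, not the cited papers' actual pipeline.

```python
import numpy as np

def window_features(x, bins=16):
    """Compute illustrative statistical/spectral features of a 1-D sensor window.

    Returns FFT min/max magnitude, Shannon entropy of the amplitude
    histogram, and excess kurtosis -- hypothetical counterparts of the
    features named in the citation statement.
    """
    x = np.asarray(x, dtype=float)

    # FFT magnitude spectrum (DC bin excluded) and its min/max
    mag = np.abs(np.fft.rfft(x))[1:]
    fft_min, fft_max = mag.min(), mag.max()

    # Shannon entropy over a normalized amplitude histogram
    hist, _ = np.histogram(x, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    entropy = -np.sum(p * np.log2(p))

    # Excess kurtosis: fourth standardized moment minus 3
    mu, sigma = x.mean(), x.std()
    kurtosis = np.mean(((x - mu) / sigma) ** 4) - 3.0

    return {"fft_min": fft_min, "fft_max": fft_max,
            "entropy": entropy, "kurtosis": kurtosis}

# Example: a synthetic accelerometer-like window (sine plus mild noise)
rng = np.random.default_rng(0)
signal = np.sin(np.linspace(0, 8 * np.pi, 256)) + 0.1 * rng.standard_normal(256)
feats = window_features(signal)
```

Such per-window descriptors are typically concatenated into a feature vector per activity segment before any optimization or selection step like the one the citing paper describes.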