2021 International Conference on Computing, Communication, and Intelligent Systems (ICCCIS) 2021
DOI: 10.1109/icccis51004.2021.9397242
Online Human Action Recognition Using Deep Learning for Indoor Smart Mobile Robots

Cited by 6 publications (2 citation statements) | References 16 publications
“…Third, the elbow point in a human silhouette has been calculated by taking the center of shoulder and hand points [58]. The elbow point on a human's arm has been depicted in equation (6).…”
Section: Skeleton Modeling
Confidence: 99%
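The citation statement above describes estimating the elbow joint as the midpoint between the shoulder and hand joints. A minimal sketch of that idea, assuming 2-D (x, y) joint coordinates (the function name and sample coordinates are hypothetical, not from the cited paper):

```python
# Hypothetical sketch: approximate the elbow joint as the midpoint
# of the shoulder and hand joints, as described in the citation.
def estimate_elbow(shoulder, hand):
    """Return the midpoint of two 2-D points given as (x, y) tuples."""
    return ((shoulder[0] + hand[0]) / 2.0, (shoulder[1] + hand[1]) / 2.0)

# Example with made-up pixel coordinates:
elbow = estimate_elbow((120.0, 80.0), (160.0, 140.0))
# midpoint of (120, 80) and (160, 140) is (140.0, 110.0)
```

This midpoint heuristic is only an approximation; the cited work's equation (6) may apply additional constraints.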
“…Moreover, human actions recorded with various sensors (including depth sensors, smartphone sensors, RGB sensors, and others) for HAR are usually sensitive to changes in lighting and background clutter. Furthermore, it is impractical to use many cameras to achieve HAR [6]. Thus, with recent advances in vision-based technology, depth-based sensors such as the low-cost Kinect have improved considerably in efficiency and quality.…”
Section: Introduction
Confidence: 99%