Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction
DOI: 10.1145/3371382.3378304
Ergonomic Adaptation of Robotic Movements in Human-Robot Collaboration

Cited by 9 publications (4 citation statements); references 6 publications.
“…However, they only collect and classify design guidelines and prerequisites according to standards, research works, and real use cases, rather than develop a structured design method. Other existing methods deal with the adaptation of the robot's movements to improve the operator's ergonomic condition (Van Den Broek and Moeslund 2020) or reduce human fatigue (Peternel et al 2018).…”
Section: Design Methods for HRC
confidence: 99%
“…The analysis of human body pose supports ergonomic assessment and handover tasks, bolstering efficacy and safety in HRI and HRC [42][43][44]. In contrast, hand gesture recognition finds utility in scenarios such as collaborative manipulation, robot programming, surgical robot teleoperation, and beyond [45][46][47][48][49][50][51][52] (Table 1).…”
Section: Human Pose
confidence: 99%
“…Category | Task | Methods | Applications
… | Body detection | MobileNet-SSD [18]; OpenPose + SVM [19]; Bayesian Siamese neural network + CVAE [20]; YOLO + Bayesian DNN [21] | Human following and autonomous navigation; visual tracking for autonomous robots tasked with human and environment interaction; safe HRI and HRC
… | Face recognition | SSD + FaceNet + KCF [22]; SFPD [23] | Human following and autonomous navigation; simultaneous face and person detection for real-time HRI
Human activity | Activity recognition | Two-stream CNN [24]; 3D LRCN + 3D CNN + LSTM [25]; LSTM + VAE + DRL [26]; 3D-CNN [27]; STJ-CNN [28]; TCN [29] | Collaborative assembly and packaging; safe HRI and HRC; companion robots; HRI and VR applications
Human activity | Intention prediction | CNN [30]; ILSTM + IBi-LSTM [31]; CNN + VMM [32] | Surveillance; collaborative assembly
Human activity | Motion prediction | RSSAC-Trajectron++ [33]; RNN [34,35]; VAE [36]; CVAE + LSTM [37]; dynamic motion projection [38]; RNN + RIMEDNet [39] | Safe and efficient HRI and HRC; collaborative manipulation and assembly; human imitation; social HRI; handover tasks
Human activity | Attention estimation | ANN [40]; LSTM [41] | Attention level estimation; blind 3D human attention inference
Human pose | Body pose recognition | OpenPose + angle-based rules [42]; Fast-SCNN + REDE [43]; PoseNet [44] | Ergonomics in HRC; handover task; efficient and safe HRI and HRC…”
Section: Human Position
confidence: 99%
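The table above pairs pose estimators such as OpenPose with "angle-based rules" for ergonomics in HRC: joint angles are computed from detected keypoints and compared against posture thresholds. A minimal sketch of that idea follows; the keypoint coordinates, the 60-100 degree band, and the function names are illustrative assumptions, not the cited implementation.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at keypoint b, formed by the segments b->a and b->c.

    Each keypoint is an (x, y) pair, e.g. as produced by a 2D pose estimator.
    """
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(v1[0], v1[1])
    n2 = math.hypot(v2[0], v2[1])
    # Clamp to guard against floating-point drift outside [-1, 1].
    cos_angle = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_angle))

def elbow_posture(shoulder, elbow, wrist):
    """Toy angle-based rule: flag elbow flexion outside a 60-100 degree band.

    The band is a made-up example threshold, not from any ergonomic standard.
    """
    angle = joint_angle(shoulder, elbow, wrist)
    return "neutral" if 60.0 <= angle <= 100.0 else "strained"
```

A real system would apply such rules per frame to estimator output and aggregate them into an ergonomic score (e.g. RULA-style) that the robot can react to.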
“…For instance, recognizing the action "wear a shoe" from the NTU dataset can be time-consuming [19]. This latency impedes applications such as human-robot interaction, where swift anticipatory behavior is crucial for seamless cooperation [4], [20].…”
Section: Introduction
confidence: 99%