2021
DOI: 10.1109/access.2020.3048877
Online Measuring of Robot Positions Using Inertial Measurement Units, Sensor Fusion and Artificial Intelligence

Cited by 7 publications (2 citation statements)
References 46 publications
“…There are three channels in this dataset, which are bone joint position, color, and depth channels [21]. This dataset contains ten movements (stretching, chest expansion, body rotation, jumping, walking, sitting, standing, picking up, throwing, pushing, pulling, waving, and clapping) for ten objects, where it contains two instances, each object performs each action twice, using some of the activities corresponding to the system (stretching, chest expansion, body rotation, and jumping).…”
Section: UTKinect Public Dataset
Confidence: 99%
“…In the hand-eye calibration work, we directly obtained the motion parameters of the robot without compensating for the error disturbance of the robot motion parameters. Using inertial measurement units (IMU) to obtain the position of the robot is a common and effective means [34,35]. In future work, we will try to use IMU to obtain the motion parameters of the robot, which may help to further improve the accuracy of hand-eye calibration.…”
Section: Discussion
Confidence: 99%
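The statement above notes that IMUs are a common way to obtain a robot's position. The basic mechanism behind this is strapdown dead reckoning: integrate the gyroscope to track orientation, rotate the accelerometer readings into the world frame, then double-integrate to get position. The following is a minimal planar (2D) sketch of that idea, not the cited paper's method (which additionally applies sensor fusion and learned correction); all function and variable names here are illustrative.

```python
import numpy as np

def dead_reckon(accel_body, gyro_z, dt):
    """Planar strapdown dead reckoning (illustrative sketch).

    accel_body: (N, 2) accelerometer samples in the body frame [m/s^2],
                assumed already gravity-compensated
    gyro_z:     (N,) yaw-rate samples [rad/s]
    dt:         sample period [s]
    Returns an (N, 2) array of estimated world-frame positions.
    """
    theta = 0.0                 # heading angle [rad]
    vel = np.zeros(2)           # world-frame velocity
    pos = np.zeros(2)           # world-frame position
    positions = np.zeros((len(gyro_z), 2))
    for i in range(len(gyro_z)):
        theta += gyro_z[i] * dt                 # integrate yaw rate -> heading
        c, s = np.cos(theta), np.sin(theta)
        rot = np.array([[c, -s], [s, c]])       # body -> world rotation
        accel_world = rot @ accel_body[i]       # rotate accel into world frame
        vel += accel_world * dt                 # integrate accel -> velocity
        pos += vel * dt                         # integrate velocity -> position
        positions[i] = pos
    return positions
```

Because each step integrates noisy measurements twice, position error grows rapidly over time; that drift is precisely why pure dead reckoning is usually combined with sensor fusion or a learned correction model, as in the work discussed above.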