2021
DOI: 10.1109/access.2021.3074260

Performance Evaluation of Optical Motion Capture Sensors for Assembly Motion Capturing

Abstract: The optical motion capture (MoCap) sensor provides an effective way to capture human motions and transform them into valuable data that can be applied to certain tasks, e.g., robot learning from demonstration (LfD). Despite the wide use of optical MoCap in LfD studies, few works explore its potential in small-parts robotic assembly. Robot manipulation skill learning from demonstration has gained the attention of researchers recently, and robotic 3C (Computer, Communication, and C…

Cited by 19 publications (10 citation statements)
References 30 publications
“…Different technologies and solutions have been developed to capture motion; hereafter we analyse some of the most widespread examples in the literature along with some industrial applications. Camera-based systems with infrared (IR) cameras can be used to triangulate the location of retroreflective rigid bodies (markers) attached to the targeted subject (Nagymáté and Kiss, 2018; Chatzitofis et al., 2021; Hu et al., 2021). In addition, systems based on inertial measurement units (IMU) that track the relative movements of articulated structures have become popular for their versatility (Vignais et al., 2013; Caputo et al., 2018; Marín and Marín, 2021).…”
Section: Human Monitoring Hardware and Systems
mentioning, confidence: 99%
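The citation above describes the marker-based optical approach, in which synchronised IR cameras triangulate retroreflective markers. As a rough illustration of the geometry involved, the sketch below triangulates one marker from two calibrated views with OpenCV; the intrinsics, camera poses, and pixel detections are made-up values standing in for the output of a real calibration and blob-detection pipeline, not parameters from the paper.

```python
# Minimal sketch: recovering one marker's 3D position from two calibrated
# camera views via linear triangulation (cv2.triangulatePoints).
import numpy as np
import cv2

# Hypothetical shared intrinsics: 800 px focal length, principal point (320, 240).
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

# Two 3x4 projection matrices P = K [R | t]: camera 1 at the origin,
# camera 2 translated 0.5 m along x (illustrative stereo baseline).
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

# Pixel coordinates of the same retroreflective marker in each view (2xN arrays).
pts1 = np.array([[360.0], [240.0]])
pts2 = np.array([[160.0], [240.0]])

# OpenCV returns homogeneous 4xN coordinates; divide by w to get metric XYZ.
X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
X = (X_h[:3] / X_h[3]).ravel()
print("triangulated marker position:", X)   # ~ (0.1, 0.0, 2.0) for these inputs
```

A multi-camera MoCap system repeats essentially this step per marker and per frame across many views, followed by rigid-body fitting and marker labelling.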
“…Human body motion can be tracked by cameras and is mainly concerned with the boundaries or features of the human body in the images. Image-based systems use computer vision techniques to obtain motion parameters directly from video footage without the use of special markers [5]. With a proper camera set-up, either a single camera or a distributed-camera configuration, motion capture can be performed.…”
Section: Visual Based Motion Capture
mentioning, confidence: 99%
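The markerless, image-based approach quoted here can be prototyped with off-the-shelf pose estimators. The sketch below is one illustrative setup using MediaPipe's pose solution on a single-camera video stream; the camera index and the tracked landmark are arbitrary choices for demonstration, not details prescribed by the cited work.

```python
# Minimal sketch of markerless motion capture from plain video, in the spirit
# of the image-based systems described above.
import cv2
import mediapipe as mp

mp_pose = mp.solutions.pose

cap = cv2.VideoCapture(0)  # assumed webcam; a video file path would also work
with mp_pose.Pose(static_image_mode=False, model_complexity=1) as pose:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV delivers BGR frames.
        results = pose.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.pose_landmarks:
            wrist = results.pose_landmarks.landmark[mp_pose.PoseLandmark.RIGHT_WRIST]
            # Landmark coordinates are normalised to the image size; no physical
            # markers are attached to the subject.
            print(f"right wrist: x={wrist.x:.3f} y={wrist.y:.3f} vis={wrist.visibility:.2f}")
cap.release()
```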
“…Reviewing the techniques shows that most of the motion capture techniques are highly dependent on accurate positioning of sensors and require calibration before measurement [3]. In general, body motion sensing systems are sensitive to sensor positioning and bulky with large batteries [5]. Accurate sensor placement with respect to anatomical landmarks is one of the main factors determining the accuracy of motion capture systems [6].…”
Section: Introduction
mentioning, confidence: 99%
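Since the works cited here stress that optical systems require calibration before measurement, the sketch below shows what a typical intrinsic-calibration step looks like with OpenCV and a checkerboard; the 9x6 pattern size and the image file pattern are assumptions for illustration, not details from the paper.

```python
# Minimal sketch of intrinsic camera calibration from checkerboard images.
import glob
import numpy as np
import cv2

pattern = (9, 6)  # inner corners of the assumed checkerboard
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)  # board-frame grid, square units

obj_points, img_points = [], []
gray = None
for path in glob.glob("calib_images/*.png"):  # hypothetical calibration shots
    gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

if obj_points:
    # Returns the RMS reprojection error, intrinsic matrix, and distortion coefficients.
    rms, K, dist, _, _ = cv2.calibrateCamera(
        obj_points, img_points, gray.shape[::-1], None, None)
    print("reprojection RMS error (px):", rms)
    print("intrinsic matrix:\n", K)
```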
“…In recent years, it has become an inevitable trend to utilize robots instead of skilled workers to complete precise and complex operations in industrial automation [4,5]. Robotic assembly systems are now widely used in 3C [7,8], automotive [9,10], and aircraft [11,12] assembly lines due to their dexterity, stability, high efficiency and high accuracy [6]. In essence, assembly fits the specific parts (which can be called the assembly features) of two assemblies according to technical requirements.…”
Section: Introduction
mentioning, confidence: 99%