2016 12th IEEE International Conference on Industry Applications (INDUSCON)
DOI: 10.1109/induscon.2016.7874573
Trajectory tracking control of a mobile robot using lidar sensor for position and orientation estimation

Cited by 11 publications (4 citation statements). References 10 publications.
“…The piecewise-linear robot paths consist of a series of tool center positions and a series of unit orientations. Therefore, the robot path smoothing is divided into position path smoothing and orientation path smoothing (Lima et al., 2016; Niu and Tian, 2018). To improve the continuity of position paths, some methods, e.g.…”
Section: Introduction
confidence: 99%
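The split into position and orientation smoothing described in this statement can be illustrated with a small sketch. The snippet below is not taken from the cited works; it covers only the position half, rounding the corners of a piecewise-linear tool path with Chaikin's corner-cutting, and the function name and parameters are illustrative assumptions.

```python
# Minimal sketch (illustrative, not the cited papers' method): smooth the
# position part of a piecewise-linear path with Chaikin's corner-cutting.
# Orientation smoothing would be handled separately, e.g. by interpolating
# unit quaternions along the same path parameter.
import numpy as np

def chaikin_smooth(points, iterations=2):
    """Round the corners of a piecewise-linear path given as an (N, d) array."""
    pts = np.asarray(points, dtype=float)
    for _ in range(iterations):
        new_pts = [pts[0]]
        for p, q in zip(pts[:-1], pts[1:]):
            new_pts.append(0.75 * p + 0.25 * q)  # point 1/4 of the way along the segment
            new_pts.append(0.25 * p + 0.75 * q)  # point 3/4 of the way along the segment
        new_pts.append(pts[-1])
        pts = np.array(new_pts)
    return pts

# Example: a sharp right-angle corner becomes a rounded path.
path = [[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]]
print(chaikin_smooth(path, iterations=3))
```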
“…Meanwhile, low-resolution scanners (Microsoft Kinect V2, Hokuyo and Velodyne LiDAR) are commonly used in real-time applications because they require less time for generating 3D data [3] and are suitable for 3D environment analysis. LiDAR scanners are commonly used in UAVs [4,5], robots [6] and autonomous cars [7]. Hence, laser sensor technology provides accurate geometric information by acquiring the complicated surfaces using various methods.…”
Section: Introduction
confidence: 99%
“…Yet another practical problem arises when one needs to correctly estimate the pose of the mobile robot. Some works propose strategies that increase localization precision based on vision sensors (Alatise and Hancke, 2017), Global Positioning System (GPS) (Skobeleva et al., 2016), or Light Detection and Ranging (LiDAR) systems (Lima et al., 2016). Most works use effective sensor fusion techniques and Inertial Measurement Units (IMUs) to combine sensory data from different sensors so that the limitations of individual sensors are compensated.…”
Section: Introduction
confidence: 99%
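The sensor-fusion idea in this last statement can be sketched with a simple complementary filter. This is an assumption for illustration, not the method of the cited paper; the function fuse_heading, its parameters, and the blending constant alpha are hypothetical.

```python
# Minimal sketch (illustrative assumption): a complementary filter that fuses a
# gyroscope yaw rate (IMU) with an absolute heading from a LiDAR-based scan
# matcher, so gyro drift and LiDAR noise/latency offset each other.
import math

def fuse_heading(theta_prev, gyro_rate, dt, theta_lidar, alpha=0.98):
    """Blend integrated gyro heading (fast but drifting) with LiDAR heading (slower but absolute)."""
    theta_gyro = theta_prev + gyro_rate * dt  # dead-reckoned heading from the IMU
    # Wrap the difference to (-pi, pi] so the blend behaves correctly near +/- pi.
    innovation = math.atan2(math.sin(theta_lidar - theta_gyro),
                            math.cos(theta_lidar - theta_gyro))
    return theta_gyro + (1.0 - alpha) * innovation

# Example: the heading drifts with the gyro rate but is pulled back toward the LiDAR fix.
theta = 0.0
for _ in range(100):
    theta = fuse_heading(theta, gyro_rate=0.02, dt=0.1, theta_lidar=0.15)
print(theta)
```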