2018
DOI: 10.1007/s10586-018-2474-7
Mobile robot positioning method based on multi-sensor information fusion laser SLAM

Cited by 8 publications (5 citation statements) · References 11 publications
“…Among them, centralized state fusion is the same as the centralized observation fusion algorithm; that is, the observation equations of all local sensors in the art design are merged into an augmented observation equation and then combined with the state equation to obtain a centralized global filter, which yields the global optimal solution. Distributed state fusion instead optimally weights the local filters corresponding to each sensor to obtain the distributed global filter, with the weights computed from each local filter's estimation-error covariance matrix [20]. Multi-sensor data fusion therefore responds more sensitively to the art-design signals in the sampling set.…”
Section: Results Analysis
confidence: 99%
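The distributed fusion described in the statement above weights each local filter's estimate by the inverse of its estimation-error covariance. A minimal sketch of that weighting for scalar states (the 1-D analogue of the covariance-matrix case; the function name and example values are illustrative, not from the paper):

```python
# Hypothetical sketch: fuse local filter outputs by inverse-variance
# weighting, the scalar form of the covariance-weighted distributed
# fusion described in the cited text.

def fuse_estimates(estimates):
    """Fuse (estimate, error_variance) pairs from local filters.

    Each local estimate contributes in proportion to 1/variance,
    so more certain sensors dominate the fused result.
    """
    total_info = sum(1.0 / var for _, var in estimates)  # total information
    fused_var = 1.0 / total_info                         # fused variance shrinks
    fused_x = fused_var * sum(x / var for x, var in estimates)
    return fused_x, fused_var

# Two local sensors estimating the same robot position (metres):
local_estimates = [(2.0, 0.5), (2.4, 0.25)]  # (estimate, error variance)
x_fused, var_fused = fuse_estimates(local_estimates)
```

Note that the fused variance is always smaller than that of the best individual sensor, which is the practical payoff of the distributed global filter.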
“…As shown in figure 2(a), the camera is mounted horizontally on the left arm of the rotating platform. The origin of the world coordinate frame O_w lies on the rotation axis of the scanner, the optical center of the depth camera is taken as the origin of the world coordinate frame, the two arms of the rotating platform are parallel to the X_w axis, and the light plane is perpendicular to the Z_w O_w X_w plane of the world coordinate frame [7].…”
Section: System Principle
confidence: 99%
“…Simultaneous localization and mapping (SLAM) [1] refers to the technology by which a robot placed in an unknown scene can update the environmental map in real time through its onboard sensors, estimate its own pose, and localize autonomously. At present, mainstream SLAM methods fall into four categories: laser SLAM [2][3]; visual SLAM [4][5]; multi-sensor fusion SLAM [6][7]; and SLAM combined with deep learning [8][9]. Visual SLAM has become a research hotspot because of its low cost and the rich environmental information it captures.…”
Section: Introduction
confidence: 99%