2010 IEEE International Conference on Robotics and Biomimetics
DOI: 10.1109/robio.2010.5723573
Vision and laser sensor data fusion technique for target approaching by outdoor mobile robot

Cited by 8 publications (5 citation statements). References 5 publications.
“…In this case, the pitch and yaw angles of the two-axis camera system are obtained by (6) and (7). The aiming trajectories are shown in Fig.…”
Section: Control Results
confidence: 99%
“…The vision system can not only track suspicious targets but also display the real-time image for observation. In [5] and [6], vision sensors are used for outdoor navigation of mobile robots. Other researchers have used vision systems for object tracking, obstacle avoidance, or patrolling.…”
Section: Introduction
confidence: 99%
“…To acquire this perpendicular trajectory, we used a dual sensor fusion technique [16] whereby data from the camera and LRS are fused to generate the path. Basically, the technique involves first getting the approximate location of a button box using vision, then using that approximate location to acquire a high-precision location from the LRS.…”
Section: B.2 Navigation Towards Button Box Using Vision and LRS
confidence: 99%
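The two-stage fusion described in that excerpt (coarse localization by vision, then refinement by the laser range sensor) can be sketched as follows. This is a minimal illustration, not the cited paper's implementation: the function name, the angular search window, and the nearest-return heuristic are all assumptions made for the example.

```python
import math

def fuse_camera_lrs(camera_bearing_deg, lrs_scan, window_deg=10.0):
    """Hypothetical sketch of the two-stage fusion: the camera supplies a
    coarse bearing to the target, then only the LRS beams inside a window
    around that bearing are searched for the precise target location.

    lrs_scan: list of (angle_deg, range_m) beams from the scanner.
    Returns the target point (x, y) in the robot frame, in metres.
    """
    # 1) Coarse stage: keep only beams near the camera's estimated bearing.
    window = [(a, r) for a, r in lrs_scan
              if abs(a - camera_bearing_deg) <= window_deg / 2]
    # 2) Precise stage: take the nearest return in the window as the target
    #    surface (assumes the target is the closest object in that sector).
    angle, rng = min(window, key=lambda beam: beam[1])
    x = rng * math.cos(math.radians(angle))
    y = rng * math.sin(math.radians(angle))
    return x, y
```

A perpendicular approach path could then be generated toward the returned point and recomputed on every scan, which matches the excerpt's note that the trajectory is constantly updated as the robot moves.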
“…We reported in [16] that the robot approaches the button box with high precision, first because the perpendicular trajectory is constantly updated as the robot moves nearer, and second because the accuracy of the LRS also improves as the distance reduces. According to [18], the URG-04LX has an error of ±2% at distances of 4 m, which reduces to ±10 mm at 1 m.…”
Section: Button Activation With Simple Finger
confidence: 99%
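The cited error figures imply a worst-case range error of about ±80 mm at 4 m (2% of 4000 mm) shrinking to ±10 mm at 1 m, which is why accuracy improves during the approach. A small helper makes the comparison concrete; the linear interpolation between the two published points is an assumption for illustration, not the sensor's datasheet error model.

```python
def urg04lx_error_mm(distance_m):
    """Approximate worst-case URG-04LX range error in millimetres,
    interpolating the two figures cited from [18]: +/-10 mm at 1 m and
    +/-2% of range (i.e. +/-80 mm at 4 m). Linear interpolation between
    those two points is an assumption made for this sketch.
    """
    if distance_m <= 1.0:
        return 10.0
    # Linear interpolation between (1 m, 10 mm) and (4 m, 80 mm).
    return 10.0 + (distance_m - 1.0) * (80.0 - 10.0) / 3.0
```

For example, at 4 m the estimate is 80 mm, an eightfold larger uncertainty than the 10 mm at 1 m.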
“…A 2D laser rangefinder and a camera have been combined successfully on ground robots. A pedestrian detection system was developed based on a particle filter that fused data from a laser and a camera [8], and the same sensor combination was used for target-approach detection [9] on ground robots. Compared with ground robots, the main challenge for quadrotor target localization is, as illustrated at the beginning, the inherently unstable flight dynamics and the absence of accurate odometry, not to mention that a quadrotor flies in 3D space with six degrees of freedom.…”
Section: Introduction
confidence: 99%