2012 IEEE/RSJ International Conference on Intelligent Robots and Systems
DOI: 10.1109/iros.2012.6385624

Next-best-scan planning for autonomous 3D modeling

Abstract: We present a next-best-scan (NBS) planning approach for autonomous 3D modeling. The system successively completes a 3D model from complex-shaped objects by iteratively selecting an NBS based on previously acquired data. For this purpose, new range data is accumulated in-the-loop into a 3D surface (streaming reconstruction) and new continuous scan paths along the estimated surface trend are generated. Further, the space around the object is explored using a probabilistic exploration approach that considers sensor…
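
The abstract describes an iterative pipeline: integrate each new scan into a streaming surface reconstruction, update a probabilistic map of the space around the object, and select the next scan path along the estimated surface trend. The sketch below only illustrates that loop structure; it is not the authors' implementation, and every name in it (nbs_modeling_loop, plan_scan_paths, scan_utility, the model and map objects) is a hypothetical placeholder.

```python
# Minimal sketch of an iterative next-best-scan (NBS) loop, assuming the
# caller supplies the reconstruction, exploration, and planning components.
# All names below are hypothetical placeholders, not the paper's API.

def nbs_modeling_loop(sensor, surface_model, exploration_map,
                      plan_scan_paths, scan_utility,
                      coverage_threshold=0.95, max_iterations=50):
    """Iteratively select and execute next-best scans until the model is complete."""
    for _ in range(max_iterations):
        # Generate candidate continuous scan paths along the estimated surface trend.
        candidates = plan_scan_paths(surface_model, exploration_map)
        if not candidates:
            break

        # Select the next-best scan, e.g. by expected information gain or new coverage.
        next_scan = max(candidates,
                        key=lambda c: scan_utility(c, surface_model, exploration_map))

        # Execute the scan; accumulate the new range data in the loop
        # (streaming reconstruction) and update the probabilistic exploration map.
        range_data = sensor.execute(next_scan)
        surface_model.integrate(range_data)
        exploration_map.update(range_data)

        # Terminate once the reconstructed surface is judged complete enough.
        if surface_model.estimated_coverage() >= coverage_threshold:
            break

    return surface_model
```

The point of the loop structure is that selection, scanning, and reconstruction run closed-loop, so each planned scan path can exploit the surface estimated from all previously acquired data.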

Citation types: 0 supporting, 32 mentioning, 0 contrasting

Year published: 2013–2021

Cited by 46 publications (32 citation statements)
References 17 publications

“…10 The reader can find a thorough review of view planning methods in the following surveys. 1,9,14,20 Some methods go beyond the NBV calculation and determine the robot state that matches the sensor pose: Torabi and Gupta 10 combine inverse kinematics with a probabilistic road map (PRM), Kriegel et al. 2 use an RRT to get possible paths, and Monica and Aleotti 21 use optimal motion planning implemented in the MoveIt! ROS stack.…”
Section: Related Work (mentioning, confidence: 99%)

“…Existing autonomous scanning systems either employ an articulated robotic arm to perform detailed scanning of a single object [Krainin et al. 2011; Kriegel et al. 2012; Wu et al. 2014], or drive a mobile robot equipped with a fixed camera for exploratory scene mapping [Charrow et al. 2015]. In contrast to these works, we employ an eye-in-hand setting on a mobile robot to achieve simultaneous exploration and scanning of complex scenes, which requires joint optimization of robot paths and camera trajectories.…”
Section: Related Work (mentioning, confidence: 99%)

“…Therefore, previous work on autonomous 3D modeling [20] and object recognition [21] is extended and combined. The autonomous object modeling is extended to multiple objects, and sped up by the recognition of objects that are already known.…”
Section: Scene Exploration (mentioning, confidence: 99%)

“…The PVS is represented by an octree, where the probability distribution of occupied/free locations is modeled. All measurements from the 3D camera and the laser striper update the probability p_v (0.0 free, 0.5 unknown, 1.0 occupied) of each intersected voxel with Bayes' Rule as in [20].…”
Section: A. Exploration (mentioning, confidence: 99%)
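
As a rough illustration of the per-voxel update quoted above, the following is a minimal sketch of a standard log-odds Bayes occupancy update. The inverse sensor-model constants (P_HIT, P_MISS) and the clamping bounds are illustrative assumptions, not values taken from the paper or from [20].

```python
import math

# Sketch of a per-voxel occupancy update with Bayes' rule: each voxel carries
# a probability p_v (0.0 free, 0.5 unknown, 1.0 occupied) that is refreshed
# whenever a measurement ray from the 3D camera or laser striper intersects it.

P_HIT = 0.7    # assumed update value when the beam endpoint falls inside the voxel
P_MISS = 0.4   # assumed update value when the beam passes through the voxel

def logit(p):
    """Convert a probability to log-odds."""
    return math.log(p / (1.0 - p))

def bayes_update(p_v, endpoint_in_voxel, p_min=0.12, p_max=0.97):
    """Binary Bayes filter for one voxel, implemented in log-odds form."""
    l = logit(p_v) + logit(P_HIT if endpoint_in_voxel else P_MISS)
    p_new = 1.0 / (1.0 + math.exp(-l))
    # Clamp so a voxel can still flip state after many consistent measurements.
    return min(max(p_new, p_min), p_max)

# Example: a voxel starts unknown (0.5), is hit twice, then missed once.
p = 0.5
for hit in (True, True, False):
    p = bayes_update(p, hit)
    print(round(p, 3))   # ~0.7, ~0.845, ~0.784
```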