2003
DOI: 10.1002/ecjb.10136
Estimation of motion parameters for a roaming robot using panoramic images

Abstract: This paper considers a roaming robot that rotates about a vertical axis while moving horizontally on a flat floor surface, and proposes a method for estimating its motion parameters. The method uses correspondences of five or more direction angles in panoramic images captured by the robot from three viewpoints. In this study, the motion parameters of the roaming robot were first limited to the horizontal components of the translational and rotational parameters, and then…
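As background for the direction-angle correspondences mentioned in the abstract, a full cylindrical panorama maps each pixel column to an azimuth. The sketch below assumes a 360-degree panorama with a linear column-to-azimuth mapping; it is an illustration of how a direction angle can be read from a panoramic image, not the calibration procedure used in the paper.

```python
import math

def column_to_azimuth(column, image_width, azimuth_of_column_zero=0.0):
    """Map a pixel column of a full 360-degree cylindrical panorama to a
    direction angle in radians, assuming the columns cover azimuth linearly."""
    return (azimuth_of_column_zero
            + 2.0 * math.pi * column / image_width) % (2.0 * math.pi)
```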

Cited by 1 publication (2 citation statements). References 10 publications.
“…These techniques are divided into view-based matching techniques [10][11][12][13][14][15][16] and matching techniques based on geometric features (or landmarks) [3,5,[17][18][19]. A view-based matching technique stores features extracted in advance from omnidirectional images at various locations along a learning route, and when estimating the sensor location, compares the features obtained at the current location with the learning image features that were stored in advance to obtain the closest location on the learning route.…”
Section: Introduction (mentioning)
Confidence: 99%
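The view-based matching scheme described in this citation statement can be illustrated with a minimal sketch: a feature vector is stored for each sample location along the learning route, and at run time the current image's features are compared against the stored set to pick the closest route location. The feature extractor and distance metric below are placeholders, not those used in the cited techniques.

```python
import numpy as np

def build_route_database(route_images, extract_features):
    """Store one feature vector per learning-route location."""
    return np.stack([extract_features(img) for img in route_images])

def localize(current_image, route_features, extract_features):
    """Return the index of the learning-route location whose stored
    features are closest to the features of the current image."""
    query = extract_features(current_image)
    dists = np.linalg.norm(route_features - query, axis=1)
    return int(np.argmin(dists))

def toy_features(image):
    """Placeholder descriptor: a coarse intensity histogram over a
    normalized image.  Real systems would use features designed for
    omnidirectional images."""
    hist, _ = np.histogram(image, bins=16, range=(0.0, 1.0), density=True)
    return hist
```

For example, `localize(current, build_route_database(route_images, toy_features), toy_features)` would return the index of the nearest stored location along the learning route.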
“…Li and Tsuji [19] used stereo and color information from panoramic views to extract landmarks and used landmarks to perform robot navigation. In addition, techniques for estimating sensor motion parameters (relative location or pose) from corner tracking [13] or vertical edge matching [3,18] or associations of feature points obtained by using floor edge tracking [5] between frames of a sequence of omnidirectional images have been proposed. Although these techniques can estimate the sensor location by using few feature points or landmarks, the feature points or landmarks must be stably extracted and matched.…”
Section: Introduction (mentioning)
Confidence: 99%
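The motion-estimation techniques mentioned in this statement recover the sensor's relative pose from feature correspondences between frames. As a generic illustration only (not the corner-, edge-, or floor-tracking methods of the cited works), the sketch below fits the planar rigid motion considered here, a rotation about the vertical axis plus a horizontal translation, that best maps matched 2D points from one frame to the next, using a standard closed-form least-squares (Procrustes) solution.

```python
import numpy as np

def fit_planar_motion(points_prev, points_curr):
    """Least-squares fit of a planar rigid motion (rotation theta about the
    vertical axis plus horizontal translation t) mapping points_prev onto
    points_curr.  Inputs are (N, 2) arrays of matched 2D points, N >= 2."""
    p_mean = points_prev.mean(axis=0)
    q_mean = points_curr.mean(axis=0)
    p = points_prev - p_mean
    q = points_curr - q_mean
    # Closed-form rotation angle from centered correspondences (2D Procrustes).
    theta = np.arctan2(np.sum(p[:, 0] * q[:, 1] - p[:, 1] * q[:, 0]),
                       np.sum(p[:, 0] * q[:, 0] + p[:, 1] * q[:, 1]))
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    t = q_mean - R @ p_mean
    return theta, t
```

In practice the correspondences come from the stably extracted and matched features that the quoted passage emphasizes; with noisy or mismatched points a robust estimator (for example, RANSAC around this closed-form fit) would be used instead of a single least-squares solve.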