2013
DOI: 10.5772/54603
A Complete Observability Analysis of the Planar Bearing Localization and Mapping for Visual Servoing with Known Camera Velocities

Abstract: This paper presents an analysis of planar bearing localization and mapping for visual servoing with known camera velocities. In particular, we investigate which subset of camera locations and environmental features can be retrieved from dynamic observations obtained by a planar bearing sensor (e.g., a pinhole camera). Results assume that the camera’s linear and angular velocities are available, which is equivalent to considering a unicycle vehicle carrying an onboard camera. Results hold if othe…
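
The abstract does not restate the model, but the setting it describes is commonly written as the planar bearing system below. This is a minimal sketch under the standard unicycle parametrization, not necessarily the authors' exact formulation or notation.

```latex
% Camera/vehicle pose (x, y, \theta), known inputs v (linear) and \omega (angular),
% and fixed planar landmarks p_i = (x_i, y_i) to be mapped:
\dot{x} = v\cos\theta, \qquad \dot{y} = v\sin\theta, \qquad
\dot{\theta} = \omega, \qquad \dot{p}_i = 0
% Planar bearing measured to landmark i (e.g., by a pinhole camera):
\beta_i = \operatorname{atan2}\!\left(y_i - y,\; x_i - x\right) - \theta
```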

Cited by 13 publications (15 citation statements)
References 17 publications
“…Given a dynamical system with some outputs (the available measurements), a first step is to establish whether the observation problem, which consists in finding an estimation of the true (but unknown) state of the robot/environment from the knowledge of the inputs and the outputs over a period of time, admits a solution [5], [6]. Differently from linear systems, in the nonlinear case state observability may also depend on the chosen inputs and, in some cases, one can show the existence of singular inputs that do not allow at all the reconstruction of the whole state [7]. A relevant problem is hence to consider some level of active sensing/perception in the control strategies of autonomous robots [8].…”
Section: Introduction
confidence: 99%
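
The input dependence mentioned in this statement can be made concrete with a small symbolic computation. The following is an illustrative sketch, not the cited papers' derivation: it uses a hypothetical two-dimensional state (a single landmark expressed in the camera frame of a unicycle with known velocities v and w) and checks the rank of the observability codistribution built from Lie derivatives of the bearing output.

```python
# Minimal sketch of the Lie-derivative observability rank test for a
# bearing-only system with known velocities (illustrative, not from the paper).
import sympy as sp

px, py, v, w = sp.symbols('px py v w', real=True)
state = sp.Matrix([px, py])

# Kinematics of a world-fixed landmark expressed in the moving camera frame
# of a unicycle translating at speed v along its optical axis and rotating at rate w.
f = sp.Matrix([-v + w * py,
               -w * px])

# Bearing measurement: angle of the landmark with respect to the optical (x) axis.
L0 = sp.atan2(py, px)

# First Lie derivative of the output along the vector field f.
L1 = sp.simplify((sp.Matrix([L0]).jacobian(state) * f)[0])

# Observability codistribution spanned by the gradients of L0 and L1.
O = sp.simplify(sp.Matrix([L0, L1]).jacobian(state))

print(sp.simplify(O.det()))   # proportional to v*py: vanishes for v = 0
print(O.rank())               # 2 for generic v and py: state reconstructible
print(O.subs(v, 0).rank())    # 1: pure rotation (v = 0) is a singular input
```

With nonzero translation the rank is full and the landmark's depth can be recovered, whereas substituting v = 0 drops the rank, which is exactly the kind of singular input the quoted statement refers to.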
“…The algorithm exploits the angles measured by each node between the optical axis of the camera and the line projecting the target point onto the respective image plane, i.e. bearing-only visual servoing [24]. In spite of the measurement and positioning uncertainty, the drones tend to move equally spaced over a circumference centered at the estimated position of the target.…”
Section: Introduction
confidence: 99%
“…In Ref. [21], it is shown that only three landmarks are required for setting the metric scale in estimates. Nevertheless, in practice, there is often a chance that a visual feature is lost during the tracking process.…”
Section: Prediction-update Loop and Map Management
confidence: 99%
“…(3). In order for the proposed method to operate correctly, a minimum number of map features must be maintained inside the camera field of view. For example, in Ref. [21] it is stated that a minimum of three features is required for the operation of monocular SLAM.…”
confidence: 99%
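
As an illustration of the map-management rule quoted above, the check could look like the following minimal Python sketch. The names `visible_features`, `initialize_new_feature`, and the threshold constant are hypothetical and not taken from either paper.

```python
# Hypothetical sketch of the quoted map-management rule: keep at least
# three map features inside the camera field of view so the monocular
# SLAM estimator retains its metric scale (per Ref. [21] of the quoting paper).
MIN_VISIBLE_FEATURES = 3

def manage_map(visible_features, initialize_new_feature):
    """Initialize new map features whenever too few remain visible."""
    while len(visible_features) < MIN_VISIBLE_FEATURES:
        visible_features.append(initialize_new_feature())
    return visible_features
```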