2015
DOI: 10.1016/j.ast.2014.09.019

Implicit observation model for vision aided inertial navigation of aerial vehicles using single camera vector observations

Cited by 26 publications (10 citation statements)
References 17 publications
“…Camera frame and pixel coordinate. Now, the 3D feature position, two horizontal components and height, can be obtained using equations (15) and (16). First, the height of the feature ĥ_n is obtained using p_{n,k−1} in equation (15) and p_{n,k} in equation (16) as…”
Section: Measurement Model
confidence: 99%
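The citing paper's equations (15) and (16) are not reproduced in this report, but the step the quote describes, recovering a feature's horizontal components and height ĥ_n from the observations p_{n,k−1} and p_{n,k} taken at two successive poses, is essentially a two-view triangulation. Below is a minimal sketch, assuming an NED navigation frame (z down) and bearing vectors already rotated into that frame; the function name triangulate_feature and all numerical values are hypothetical, not taken from the cited works.

```python
import numpy as np

def triangulate_feature(o1, d1, o2, d2):
    """Midpoint triangulation of one feature from two bearing observations.

    o1, o2 : camera positions in the navigation frame at times k-1 and k
    d1, d2 : unit bearing vectors toward the feature, expressed in the same frame
    Returns the estimated 3D feature position (assumed NED frame, z down).
    """
    w0 = o1 - o2
    b = d1 @ d2
    d = d1 @ w0
    e = d2 @ w0
    denom = 1.0 - b * b                      # rays become parallel as this tends to 0
    if abs(denom) < 1e-9:
        raise ValueError("bearings are (nearly) parallel; feature depth unobservable")
    s = (b * e - d) / denom                  # range along the first ray
    t = (e - b * d) / denom                  # range along the second ray
    return 0.5 * ((o1 + s * d1) + (o2 + t * d2))   # midpoint of the closest points

# Hypothetical example: vehicle at 100 m altitude moves 5 m north between k-1 and k.
o_km1 = np.array([0.0, 0.0, -100.0])
o_k   = np.array([5.0, 0.0, -100.0])
d_km1 = np.array([0.30, 0.10, 0.95]); d_km1 /= np.linalg.norm(d_km1)
d_k   = np.array([0.25, 0.10, 0.96]); d_k   /= np.linalg.norm(d_k)

p_n = triangulate_feature(o_km1, d_km1, o_k, d_k)
h_n = -p_n[2]                                # feature height when z points down
print("horizontal components:", p_n[:2], "height:", h_n)
```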
“…using equations (13) to (16), a transformation from the feature error to the vehicle error can be made.…”
Section: First-order Linearized Model
confidence: 99%
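Equations (13) to (16) of the citing paper are likewise unavailable here, but the linearization it refers to can be illustrated generically: perturbing a camera measurement model to first order gives Jacobians with respect to the vehicle state and the feature position, and holding the measurement fixed converts a feature-position error into an equivalent vehicle error. The sketch below assumes a normalized pinhole projection and numerical Jacobians; the functions project and jacobian and all numbers are hypothetical stand-ins, not the cited paper's model.

```python
import numpy as np

def project(x_v, p_f):
    """Normalized pinhole projection of feature p_f seen from vehicle position x_v
    (assumed down-looking camera aligned with the navigation frame)."""
    rel = p_f - x_v                      # feature position relative to the camera
    return rel[:2] / rel[2]              # normalized image coordinates

def jacobian(f, x, eps=1e-6):
    """Central-difference Jacobian of f with respect to x (first-order linearization)."""
    fx = np.asarray(f(x))
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        dx = np.zeros_like(x)
        dx[i] = eps
        J[:, i] = (np.asarray(f(x + dx)) - np.asarray(f(x - dx))) / (2.0 * eps)
    return J

x_v = np.array([0.0, 0.0, -100.0])       # nominal vehicle position (NED, z down)
p_f = np.array([31.0, 10.5, 0.0])        # nominal feature position

H_v = jacobian(lambda x: project(x, p_f), x_v)   # d(measurement)/d(vehicle state)
H_f = jacobian(lambda p: project(x_v, p), p_f)   # d(measurement)/d(feature position)

# First order: delta_z = H_v @ delta_x_v + H_f @ delta_p_f.  Holding the measurement
# fixed (delta_z = 0) maps a feature-position error into an equivalent vehicle error.
delta_p_f = np.array([0.2, -0.1, 0.05])
delta_x_v = -np.linalg.pinv(H_v) @ (H_f @ delta_p_f)
print("vehicle error induced by the feature error:", delta_x_v)
```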
“…An increasing number of practical applications, such as vision-based robotics control [58,71,80,108,111,122], surveillance [3,21,25,38,41,89,94,125], navigation [27,32,46,53,70,90,100,121], and consumer electronics [9,40,49,65,84,113,119], provide applied scenarios for vision-based computation. Most of these applications can be technically attributed to several typical vision-based computation technologies.…”
Section: Introduction
confidence: 99%
“…Deep-space autonomous navigation can achieve tasks, such as orbit determination and control, attitude orientation, and target tracking, in the event of loss of ground communication; thus, it greatly enhances the survivability of the spacecraft [1]. Autonomous navigation technology used in several representative missions and plans of deep space exploration shows that the current mainstream autonomous navigation technology of deep space exploration is based on the image information of navigation star or target celestial body obtained by optical autonomous navigation (OPNAV) sensor [2], [3].…”
Section: Introduction
confidence: 99%