2022
DOI: 10.1016/j.jfranklin.2022.03.005

Autonomous dynamic docking of UAV based on UWB-vision in GPS-denied environment

Cited by 9 publications (5 citation statements)
References 18 publications

“…Compared with relative positioning based on visual search, the method here offers wider-range measurements, shorter search times, and lower computing-resource requirements. Compared with [29,30], this study considers a dynamic random target. Compared with [28], we consider the measurement error and use historical data for estimation in control.…”
Section: Remark
confidence: 99%
“…Subsequently, the position estimation error, q̃ := q̂ − q, is defined, and the root mean square (RMS) and standard deviation (SD) of the positioning error are computed for the navigation and landing phases, respectively. To demonstrate the superiority of MIFG, we compare it with the forgetting-factor least squares (FFLS) method [30], where the forgetting factor is taken as 0.9, in the navigation phase, as shown in Tables 1 and 2. The analysis of RMS and SD shows that, during the navigation phase, both algorithms can estimate relative position data that meet the requirements of imprecise navigation.…”
Section: Simulation
confidence: 99%
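The quoted simulation discussion names two concrete ingredients: the forgetting-factor least-squares (FFLS) baseline with forgetting factor 0.9, and RMS/SD statistics of the position estimation error q̃ = q̂ − q. The sketch below is a generic illustration of those two pieces, not code from either paper; the function names, array shapes, and toy docstrings are assumptions.

```python
# Minimal sketch (not the cited papers' code): a forgetting-factor least-squares
# (FFLS) update with forgetting factor 0.9, plus the RMS/SD error metrics the
# quoted comparison refers to. Names and dimensions are illustrative assumptions.
import numpy as np

def ffls_update(theta, P, phi, y, lam=0.9):
    """One recursive FFLS step.
    theta : (n, 1) current parameter/position estimate
    P     : (n, n) information-weighting matrix
    phi   : (n,)   regressor for this measurement
    y     : float  scalar measurement
    lam   : forgetting factor (0.9 in the cited comparison)
    """
    phi = phi.reshape(-1, 1)
    innovation = y - float(phi.T @ theta)          # prediction error
    K = P @ phi / (lam + float(phi.T @ P @ phi))   # gain
    theta = theta + K * innovation                 # update the estimate
    P = (P - K @ phi.T @ P) / lam                  # discount older data
    return theta, P

def rms_and_sd(err):
    """Root mean square and standard deviation of a positioning-error series,
    i.e. the metrics reported per phase (navigation, landing)."""
    err = np.asarray(err, dtype=float)
    return float(np.sqrt(np.mean(err ** 2))), float(np.std(err))
```

In a comparison like the one quoted, the RMS and SD of q̃ from FFLS and from the proposed estimator would be tabulated separately for the navigation and landing phases.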
“…Machine vision positioning plays a crucial role in autonomous flight and docking, improving positioning accuracy and safety. Cheng et al. [19] proposed a method that combines ultra-wideband and visual positioning and successfully guided UAVs to land in the target area without GPS. Ma et al. [20] proposed a ground stereo vision guidance method that simulates human vision for UAV landing without GPS.…”
Section: Introduction
confidence: 99%
“…A large amount of work has been conducted on UWB and vision-guided landing in recent years. A combined UWB-vision autonomous landing framework [15][16][17] was developed that uses UWB technology for relative localization and vision for guiding the final precision landing, but it struggles to provide high-frequency, real-time localization information due to the low update rate of the UWB system. To improve reliability and consistency, a UWB-IMU fusion relative localization algorithm was implemented for autonomous landing [18]; however, without vision guidance in the final descent stage, landing precision is hard to guarantee.…”
Section: Introduction
confidence: 99%