2018
DOI: 10.20944/preprints201810.0343.v1
Preprint

Fusion of Enhanced and Synthetic Vision System Images for Runway and Horizon Detection

Abstract: UAV network operation enables gathering and fusion from disparate information sources for flight control in both manned and unmanned platforms. In this investigation, a novel procedure for detecting runways and horizons as well as enhancing surrounding terrain is introduced, based on fusion of enhanced vision system (EVS) and synthetic vision system (SVS) images. EVS and SVS image fusion has yet to be implemented in real-world situations due to signal misalignment. We address this through a registra…
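The misalignment problem the abstract raises can be illustrated with a minimal sketch (not the authors' registration procedure, which is truncated above): assuming the EVS/SVS offset is a pure pixel translation, it can be estimated by phase correlation and the aligned images alpha-blended. The function names and the blending weight are illustrative.

```python
import numpy as np

def estimate_shift(ref, mov):
    """Estimate the integer (dy, dx) translation of `mov` relative to
    `ref` via phase correlation (normalised FFT cross-correlation)."""
    cross = np.fft.fft2(ref) * np.conj(np.fft.fft2(mov))
    cross /= np.abs(cross) + 1e-12           # keep phase only
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap large indices around to negative shifts
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx

def fuse(evs, svs, alpha=0.5):
    """Align the SVS frame to the EVS frame, then alpha-blend."""
    dy, dx = estimate_shift(evs, svs)
    aligned = np.roll(svs, (dy, dx), axis=(0, 1))
    return alpha * evs + (1 - alpha) * aligned
```

Real EVS/SVS misalignment also involves rotation and perspective, so a full registration would estimate a homography rather than a translation; the sketch only shows the correlation principle.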


Cited by 3 publications (5 citation statements, published 2021–2022) · References 0 publications
“…the initial offset of the UPPV test platform is large owing to the error in GPS navigation when it enters the visual landing phase. However, the offset can be corrected to 0 within 15 s by the autonomous landing visual positioning system, and the comprehensive recognition accuracy of the landing runway was higher than 97.47% for each test during the entire visual landing process.…”
Section: Discussion
confidence: 99%
“…Nazir et al [7] acquired landing runway images using an airborne camera, and estimated the exact location of the landing runway using edge detection algorithms. Fadhil et al [8] proposed a sensor fusion algorithm based on the Hough transform to detect runways. Zhang et al [9] designed a fuzzy Canny edge extraction algorithm and combined it with a curve-fitting method to obtain the centerline of the runway.…”
Section: Introduction
confidence: 99%
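The Hough-transform approach attributed to Fadhil et al. [8] above can be sketched generically: edge pixels vote into a (ρ, θ) accumulator, and the strongest bin gives the dominant line (e.g. a runway edge). This is a minimal NumPy illustration of the standard transform, not code from the cited work.

```python
import numpy as np

def hough_lines(edges, n_theta=180):
    """Vote each nonzero edge pixel into a (rho, theta) accumulator and
    return the strongest line as (rho in pixels, theta in degrees)."""
    ys, xs = np.nonzero(edges)
    thetas = np.deg2rad(np.arange(n_theta))
    diag = int(np.ceil(np.hypot(*edges.shape)))        # max |rho|
    acc = np.zeros((2 * diag + 1, n_theta), dtype=int)
    # rho = x*cos(theta) + y*sin(theta), offset so indices are >= 0
    rhos = np.round(xs[:, None] * np.cos(thetas)
                    + ys[:, None] * np.sin(thetas)).astype(int) + diag
    for t in range(n_theta):
        acc[:, t] += np.bincount(rhos[:, t], minlength=2 * diag + 1)
    r, t = np.unravel_index(np.argmax(acc), acc.shape)
    return r - diag, np.rad2deg(thetas[t])
```

In a runway detector, the two strongest near-parallel peaks would correspond to the left and right runway edges; a fusion scheme like [8]'s would combine votes from the EVS and SVS edge maps before peak-picking.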
“…However, landing is still one of the most accident-prone flight phases, with a relatively high percentage of fatal and nonfatal air accidents [1,2]. It has been reported that almost half of plane crashes occur in the approach and final landing stages [3]. If visibility is limited, the pilot will land using avionics and precision instrument landing systems.…”
Section: Introduction
confidence: 99%
“…However, it did not implement the accommodation cue in visual cues. Other studies have combined synthetic vision systems (SVS) and enhanced flight vision systems (EFVS) [3]. Artificial vision data are utilized to identify dynamic objects on the runway surface [22].…”
Section: Introduction
confidence: 99%
“…In [35], the authors present a family of edge detectors for various orientations. Beyond line feature detection, registration of a synthetic image with known 6D pose [36,37] and registration of real images from a database [38] are also applied to solve the runway detection problem.…”
Section: Introduction
confidence: 99%
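The "edge detectors for various orientations" mentioned in the excerpt above can be illustrated with the standard steerable-derivative trick: the response of a first-derivative filter at any angle θ is cos θ · Gx + sin θ · Gy, so two basis filters suffice. A minimal NumPy sketch (generic, not the detectors of [35]):

```python
import numpy as np

def conv2(img, k):
    """'Valid' 2-D correlation of `img` with a 3x3 kernel, pure NumPy."""
    H, W = img.shape
    out = np.zeros((H - 2, W - 2))
    for i in range(3):
        for j in range(3):
            out += k[i, j] * img[i:i + H - 2, j:j + W - 2]
    return out

def oriented_edges(img, theta_deg):
    """Directional derivative at angle theta, steered from the Sobel
    x/y basis responses: cos(theta)*Gx + sin(theta)*Gy."""
    sx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], float)
    gx, gy = conv2(img, sx), conv2(img, sx.T)
    t = np.deg2rad(theta_deg)
    return np.cos(t) * gx + np.sin(t) * gy
```

A vertical step edge responds strongly at θ = 0° and vanishes at θ = 90°, which is how orientation-tuned detectors can separate runway side-lines from the horizon.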