The 23rd Digital Avionics Systems Conference (IEEE Cat. No.04CH37576)
DOI: 10.1109/dasc.2004.1391308
Integration of imaging sensor data into a synthetic vision display

Cited by 10 publications (4 citation statements)
References 9 publications
“…Adding SV to this situation through manual control might be intractable. Research (Theunissen, Roefs, Koeners, Rademaker, & Etherington, 2004) found that if the PF had to perform navigation integrity checking using EV and SV information, the situation was unacceptable.…”
Section: EV and SV Designs for Pilot Flying
confidence: 96%
“…Figure 4 illustrates the implementation of this concept in the autoland function shown in Figure 2. In [4] this concept has been explored for manned aviation. So, in case of a PDE and/or PEE, the synthetic runway and the real runway do not align.…”
Section: Guidance Integrity Assessment
confidence: 99%
“…Based on results from our own research into synthetic vision [10][11][12][13][14][15][16][17][18][19][20] and a review of other research we have identified a significant number of opportunities in the visualization domain to support SA and in the function domain to provide operator support.…”
Section: Synthetic Vision
confidence: 99%
“…Examples are increasing the field of view, adding symbology to support enhanced visual acquisition, visualization of elements not (yet) available in the sensor image, and guidance augmentation. The basic idea is to support the anticipation of environment features and compensate for effects of sensor limitations such as field of view, effective range and occlusion (caused e.g., by clouds, precipitation, smoke or terrain) [18].…”
Section: Presentation Domain
confidence: 99%