An experimental environment is described for research on remote surveillance and control of several small airports from a central location. Since 2004, a corresponding Remote Tower Operation (RTO) testbed has been realized by the German Aerospace Center at the Braunschweig research airport [1]. Technical as well as human-factors aspects of the Remote Airport Traffic Control Center (RTC) are investigated within this environment [2][3]. Because small airports usually lack electronic surveillance such as ground movement radar, the basic philosophy consists in replacing the controller's direct view out of the tower windows (the far view) with a high-resolution video panorama system. This initial design is based on work and task analyses as well as on results reported in the literature [4], which indicate the importance of visual information for present-day controllers' work procedures. The research environment consists of an experimental part for field-testing technical components, verifying specifications, and validating operational procedures under live as well as video-replay conditions, and of simulation systems for repeatable experiments with the new work system under controlled conditions, allowing for quantitative evaluation of different technical and operational concepts. The present 180° video panorama system for live field tests consists of four digital high-resolution CCD cameras located near the Braunschweig tower and a remotely controlled pan-tilt-zoom (PTZ) camera (including an automatic tracking option). The cameras are connected to PC clusters for compression, image processing / movement detection, decompression, and panorama reconstruction on 4 + 1 standard high-resolution displays. A 450 m fiber-optic Gbit Ethernet link connects the sensor and display clusters. The cameras are calibrated for geographically correct superimposition of relevant data, so that the (reconstructed) view out of the tower windows can be augmented with flight data. Field testing of the reconstructed far view with the participation of local controllers shows an effective visual resolution of 2 arcmin, in agreement with theoretical predictions. The PTZ camera provides a "foveal" vision component, including object-tracking options, with a resolution exceeding that of the human eye (1 arcmin) within an observation angle of < 15°. The replay function allows for detailed reconstruction of specific events; full-day video data (typically 500 GB) are stored on hard drives with 4.5 TB capacity. Video-see-through augmentation of the reconstructed outside view, e.g. by superimposing data from non-visual electronic sources on the camera far view, improves the controller's situational awareness. The augmented-vision function also supports a compact RTO work environment by reducing the number of displays. As an extension of the local field-test system, a second, more distant international airport controlled by DFS (the German ANSP) will be equipped and connected to the experimental system to provide the framework for long-distance and passive-mode testing. Tw...
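To make the quoted resolution figures concrete, the following minimal Python sketch computes the per-pixel angular resolution of the panorama and of the PTZ camera. The horizontal pixel counts are assumed values chosen for illustration only; they are not specifications of the cited RTO system.

# Back-of-the-envelope check of the angular-resolution figures quoted above.
# Pixel counts are assumptions for illustration, not RTO system specifications.

ARCMIN_PER_DEG = 60.0

def arcmin_per_pixel(fov_deg: float, horizontal_pixels: int) -> float:
    """Angular extent of one pixel, in arc minutes, for a camera covering fov_deg."""
    return fov_deg * ARCMIN_PER_DEG / horizontal_pixels

# Panorama: 180 deg split over four cameras -> 45 deg per camera.
# Assuming roughly 1360 horizontal pixels per high-resolution CCD camera:
panorama = arcmin_per_pixel(fov_deg=45.0, horizontal_pixels=1360)

# PTZ ("foveal") camera: observation angle below 15 deg,
# assuming a 1280-pixel sensor width:
ptz = arcmin_per_pixel(fov_deg=15.0, horizontal_pixels=1280)

print(f"panorama: {panorama:.2f} arcmin/pixel")  # about 2 arcmin
print(f"PTZ:      {ptz:.2f} arcmin/pixel")       # below 1 arcmin

With these assumptions the panorama resolves about 2 arcmin per pixel and the PTZ camera about 0.7 arcmin per pixel, consistent with the field-test result and the "better than the human eye" claim reported above.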
ABSTRACT: During the development process of a remote sensing system, the optimization and verification of the sensor system are important tasks. To support these tasks, simulation of the sensor and its output is valuable. It enables developers to test algorithms, estimate errors, and evaluate the capabilities of the whole sensor system before the final remote sensing system is available and produces real data. The presented simulation concept, SENSOR++, consists of three parts. The first part is the geometric simulation, which uses a ray tracing algorithm to calculate what the sensor is looking at and whether the observed part of the scene is shadowed. The second part describes the radiometry and yields the spectral at-sensor radiance from the visible spectrum to the thermal infrared, according to the simulated sensor type; in the case of Earth remote sensing it also includes a model of the radiative transfer through the atmosphere. The final part uses the at-sensor radiance to generate digital images by means of an optical and an electronic sensor model. Using SENSOR++ for an optimization requires the additional application of task-specific data processing algorithms. The principle of the simulation approach is explained, all relevant concepts of SENSOR++ are discussed, and first examples of its use are given, e.g. a camera simulation for a Moon lander. Finally, the verification of SENSOR++ is demonstrated.
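To illustrate the final part of the simulation chain, the short Python sketch below converts a band-integrated at-sensor radiance into quantized digital numbers using a simple optical and electronic sensor model. All names and parameter values are assumptions for illustration; they do not reproduce the actual SENSOR++ models.

import numpy as np

H_PLANCK = 6.626e-34  # Planck constant [J s]
C_LIGHT = 3.0e8       # speed of light [m/s]

def radiance_to_dn(radiance, wavelength=550e-9, aperture_area=1e-4,
                   solid_angle=1e-8, integration_time=1e-3,
                   optical_transmission=0.8, quantum_efficiency=0.6,
                   read_noise_e=10.0, full_well_e=50_000, adc_bits=12,
                   rng=None):
    """Convert band-integrated at-sensor radiance [W m^-2 sr^-1] per pixel
    into quantized digital numbers (DN). Illustrative model only."""
    rng = rng or np.random.default_rng(0)
    # Optical model: radiant energy collected per pixel during integration.
    energy = radiance * aperture_area * solid_angle * integration_time * optical_transmission
    # Electronic model: photon-to-electron conversion with shot and read noise.
    photons = energy / (H_PLANCK * C_LIGHT / wavelength)
    electrons = rng.poisson(photons * quantum_efficiency).astype(float)
    electrons += rng.normal(0.0, read_noise_e, size=electrons.shape)
    electrons = np.clip(electrons, 0.0, full_well_e)
    # A/D conversion: scale to the ADC range and quantize.
    return np.round(electrons / full_well_e * (2**adc_bits - 1)).astype(int)

# Example: a synthetic 4 x 4 patch of constant at-sensor radiance.
dn_image = radiance_to_dn(np.full((4, 4), 20.0))
print(dn_image)

A full simulation would precede this stage with the geometric part (ray tracing with a shadow test) and the radiometric part (spectral at-sensor radiance including atmospheric radiative transfer), which are not sketched here.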