The operation of helicopters on ships is one of the most challenging tasks due to adverse weather conditions, the lack of visual cues, turbulent airwakes behind the ship and a moving, confined landing spot on the ship. Currently, only a very limited number of pilot assistance systems are available to ease helicopter ship deck landings. The focus of this paper is the evaluation of a Head-Down Display (HDD), a Head-Mounted Display (HMD) and two different Attitude Command Attitude Hold (ACAH) flight control architectures for ship deck landings based on piloted simulation. A ship deck landing scenario at the research flight simulation facility Air Vehicle Simulator (AVES) has been extended to include turbulent ship airwakes from high-fidelity Computational Fluid Dynamics (CFD). The pilot assistance systems have been implemented in the simulator and evaluated by four helicopter pilots. In particular, the results show favorable potential of the Head-Mounted Display and the flight control architectures.
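For readers unfamiliar with the Attitude Command Attitude Hold response type, the following minimal Python sketch illustrates the basic idea: stick deflection commands an attitude reference rather than a rate, and an attitude-hold loop regulates toward that reference once the stick is centered. The single-axis simplification, gains, and signal names are illustrative assumptions and are not the architectures evaluated in the paper.

```python
import numpy as np

def acah_roll_step(stick, phi, p, dt, phi_ref, k_stick=0.5, k_p=8.0, k_d=2.5):
    """One time step of a simplified single-axis ACAH law (illustrative only).

    stick   : lateral stick deflection in [-1, 1]
    phi     : current roll attitude [rad]
    p       : current roll rate [rad/s]
    dt      : time step [s]
    phi_ref : attitude reference held from the previous step [rad]
    Returns (control command, updated attitude reference).
    """
    if abs(stick) > 0.05:              # outside deadband: stick commands an attitude
        phi_ref = k_stick * stick      # proportional attitude command
    # attitude-hold loop: PD regulation toward the stored reference
    u = k_p * (phi_ref - phi) - k_d * p
    return u, phi_ref

# Minimal usage: after an initial stick pulse, the attained attitude is held.
phi, p, phi_ref, dt = 0.0, 0.0, 0.0, 0.01
for t in np.arange(0.0, 5.0, dt):
    stick = 1.0 if t < 1.0 else 0.0    # pulse input, then stick centered
    u, phi_ref = acah_roll_step(stick, phi, p, dt, phi_ref)
    p += u * dt                        # toy roll dynamics (unit inertia)
    phi += p * dt
print(f"held roll attitude ~ {np.degrees(phi):.1f} deg")
```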
Most accidents and serious incidents of commercial air transport helicopters occur during standard flight phases, with pilots' situational awareness being a main contributing factor. Enabling pilots to better assess their situational awareness can make an important contribution to reducing the risk of fatal accidents. One approach is to examine pilots' gaze behavior with the help of eye tracking. This paper reports the results of eye tracking measurements during real-flight and simulator studies of a standard mission profile. The general gaze behavior is characterized by a dominant external view, with the airspeed and altitude indicators as the most important flight instruments. The real-world applicability of gaze data obtained in the simulator could be demonstrated.
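As a hedged illustration of how such gaze characterization is typically computed, the short Python sketch below aggregates relative dwell time per area of interest (AOI) from a stream of gaze samples. The AOI labels, sample counts, and data format are assumptions made for illustration and are not taken from the study.

```python
from collections import Counter

def dwell_time_shares(aoi_samples):
    """Aggregate relative dwell time per area of interest (AOI).

    aoi_samples : list of AOI labels, one per gaze sample
                  (labels such as 'external' or 'airspeed' are illustrative).
    Returns a dict mapping AOI -> share of total viewing time.
    """
    counts = Counter(aoi_samples)
    total = sum(counts.values())
    return {aoi: n / total for aoi, n in counts.items()}

# Illustrative gaze stream: mostly external view, some instrument scanning.
samples = ['external'] * 420 + ['airspeed'] * 90 + ['altitude'] * 60 + ['other'] * 30
for aoi, share in sorted(dwell_time_shares(samples).items(), key=lambda x: -x[1]):
    print(f"{aoi:9s} {share:5.1%}")
```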
In modern helicopters, highly accurate flight-state data is available through precise sensors and complex fusion algorithms in the aircraft. This data is used by the avionic systems and conveyed to the pilot through sophisticated human-machine interfaces (HMI). Especially in helicopters, the pilot is directly involved in the control of the aircraft during most flight phases and uses the information provided by the HMI. However, in most civil aircraft no sensor systems exist to gain insight into the pilot's physiological state. Even in experimental helicopters there is no real-time information about the pilot's mental state; this information is typically gathered through questionnaires in the debriefing. One way to close this gap is to use physiological measurement systems such as eye tracking. In our work, we integrated an eye tracking system into the experimental system of DLR's research simulator AVES and the research helicopter ACT/FHS. In this paper we describe the first steps, which include the selection of the systems, technical aspects of the hardware and software integration process, and first experiments in DLR's Bo 105 helicopter and DLR's AVES simulator. Details of the developed toolchain for live data conditioning are given and first results of combined helicopter state and eye tracking data are presented. Finally, we give an outlook on the next integration steps, which include the combination with a high-fidelity head tracking system.
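To make the data-conditioning step more concrete, the following Python sketch shows one common way to combine asynchronous streams: eye-tracking samples are merged onto the helicopter-state timeline by nearest-timestamp matching within a tolerance. The field names, sample rates, and the pandas-based approach are illustrative assumptions; the actual toolchain described in the paper may differ.

```python
import pandas as pd

# Illustrative streams: helicopter state at 100 Hz, eye tracking at 60 Hz.
state = pd.DataFrame({
    "t": pd.to_timedelta([i / 100.0 for i in range(500)], unit="s"),
    "airspeed_kt": 60.0,            # placeholder flight-state channels
    "altitude_ft": 500.0,
})
gaze = pd.DataFrame({
    "t": pd.to_timedelta([i / 60.0 for i in range(300)], unit="s"),
    "gaze_x": 0.1,                  # placeholder gaze coordinates
    "gaze_y": -0.2,
})

# Nearest-timestamp merge with a 20 ms tolerance aligns the two streams.
merged = pd.merge_asof(state.sort_values("t"), gaze.sort_values("t"),
                       on="t", direction="nearest",
                       tolerance=pd.Timedelta("20ms"))
print(merged.head())
```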