The Air Force Research Laboratory has implemented and evaluated two brain-computer interfaces (BCIs) that translate the steady-state visual evoked response into a control signal for operating a physical device or computer program. In one approach, operators self-regulate the brain response; the other approach uses multiple evoked responses.
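The multiple-evoked-response approach is commonly realized by comparing narrow-band EEG power at each flicker frequency and selecting the strongest. A minimal sketch of such a frequency-tagging decoder is shown below; the function name, band width, and synthetic signal are illustrative assumptions, not details from the paper.

```python
import numpy as np

def detect_ssvep(signal, fs, candidate_freqs):
    """Return the candidate stimulus frequency with the most spectral power.

    Assumption: each selectable target flickers at a distinct frequency, so
    the strongest narrow-band response indicates where the user is looking.
    """
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    powers = []
    for f in candidate_freqs:
        band = (freqs >= f - 0.5) & (freqs <= f + 0.5)  # +/-0.5 Hz window
        powers.append(spectrum[band].sum())
    return candidate_freqs[int(np.argmax(powers))]

# Synthetic example: a 12 Hz evoked response buried in noise.
fs = 256
t = np.arange(0, 4, 1.0 / fs)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 12 * t) + 0.8 * rng.standard_normal(t.size)
print(detect_ssvep(eeg, fs, [8.0, 10.0, 12.0, 15.0]))  # → 12.0
```

In a real system the detected frequency would then be mapped to a discrete control command (e.g., a menu selection), with additional safeguards such as a minimum power threshold to avoid spurious selections.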
Supervisory control of multiple unmanned aerial vehicles (UAVs) raises many questions concerning the balance of system autonomy with human interaction for effective operator situation awareness and system performance. The reported experiment used a UAV simulation environment to evaluate two applications of autonomy levels across two primary control tasks: allocation (assignment of sensor tasks to vehicles) and routing (determining vehicles' flight plans). In one application, the autonomy level was the same across these two tasks. In the other, the autonomy levels differed, with one of the two tasks more automated than the other. Trials also involved completion of other mission-related secondary tasks as participants supervised three UAVs. The results showed that performance on both the primary tasks and many secondary tasks was better when the level of automation was the same across the two sequential primary tasks. These findings suggest that keeping the level of automation similar across closely coupled tasks reduces mode awareness problems, which can otherwise negate the intended benefits of a fine-grained application of automation. Several research issues are identified to further explore the impact of automation-level transference in supervisory control applications that apply automation across numerous tasks.
Supervisory control of multiple autonomous vehicles raises many issues concerning the balance of system autonomy with human interaction for optimal operator situation awareness and system performance. An unmanned vehicle simulation designed to manipulate the application of automation was used to evaluate participants' performance on image analysis tasks under two automation control schemes: adaptable (level of automation directly manipulated by participant throughout trials) and adaptive (level of automation adapted as a function of participants' performance on four types of tasks). The results showed that while adaptable automation increased workload, it also improved change detection, as well as operator confidence in task-related decision-making.
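The adaptive scheme described above adjusts the level of automation from the operator's measured performance rather than by direct operator choice. The following is a minimal, hypothetical sketch of such a policy; the thresholds, level range, and function name are assumptions for illustration and are not taken from the study.

```python
def adapt_automation_level(level, accuracy, low=0.6, high=0.9,
                           min_level=1, max_level=5):
    """Hypothetical adaptive-automation policy.

    Raise the automation level when task accuracy drops below `low`;
    lower it (returning control to the operator) when accuracy exceeds
    `high`; otherwise keep the current level.
    """
    if accuracy < low:
        return min(level + 1, max_level)
    if accuracy > high:
        return max(level - 1, min_level)
    return level

# Poor performance triggers more automation support.
print(adapt_automation_level(2, 0.5))  # → 3
```

By contrast, an adaptable scheme would simply expose `level` as a setting the participant changes directly during the trial.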
The Air Force Research Laboratory's Human Effectiveness Directorate (AFRL/HE) supports research addressing human factors associated with Unmanned Aerial Vehicle (UAV) operator control stations. Recent research, in collaboration with Rapid Imaging Software, Inc., has focused on determining the value of combining synthetic vision data with live camera video presented on a UAV control station display. Information is constructed from databases (e.g., terrain, cultural features, pre-mission plan) as well as from numerous information updates via networked communication with other sources (e.g., weather, intel). This information is overlaid conformally, in real time, onto the dynamic camera video image display presented to operators. Synthetic vision overlay technology is expected to improve operator situation awareness by highlighting key spatial information elements of interest directly on the video image, such as threat locations, expected target locations, landmarks, and emergency airfields. It may also help maintain an operator's situation awareness during periods of video datalink degradation or dropout and when operating in conditions of poor visibility. Additionally, this technology may serve as an intuitive means of distributed communication between geographically separated users. This paper discusses the tailoring of synthetic overlay technology for several UAV applications. Pertinent human factors issues are detailed, as well as the usability, simulation, and flight test evaluations required to determine how best to combine synthetic visual data with live camera video presented on a ground control station display and to validate that a synthetic vision system is beneficial for UAV applications.

OVERVIEW
Unmanned Aerial Vehicles (UAVs) are aircraft without the onboard presence of a pilot or crew.
Though the physical separation of the crew from the aircraft offers many promising benefits, it also presents challenges to the effective design of the UAV control station. Numerous human factors issues such as system time delays, poor crew coordination, high workload, and reduced situational awareness may negatively affect mission performance [1]. When onboard an aircraft, a pilot and crew instantaneously receive a rich supply of multi-sensory information about their surrounding environment. UAV operators, however, may be limited to a time-delayed, reduced stream of sensory feedback delivered almost exclusively through the visual channel.

Of all the information displays within military UAV control stations, the video imagery from the various cameras mounted on the UAV is particularly valuable. UAV pilots use imagery from the nose and gimbal cameras to verify a clear path for taxi/runway operations, scan for other air traffic in the area, and identify navigational landmarks and potential obstructions. Additionally, sensor operators use imagery from a gimbal-mounted camera to conduct a wide variety of intelligence, surveillance, and reconnaissance activities as well as to directly support combat operations. However, video...
This simulation study investigated factors influencing sustained performance and fatigue during operation of multiple Unmanned Aerial Systems (UAS). The study tested effects of time-on-task and automation reliability on accuracy in surveillance tasks and dependence on automation. It also investigated the role of trait and state individual difference factors. Background: Warm's resource model of vigilance has been highly influential in human factors, but further tests of its applicability to complex, real-world tasks requiring sustained attention are necessary. Multi-UAS operation differs from standard vigilance paradigms in that the operator must switch attention between multiple subtasks, with support from automation. Method: 131 participants performed surveillance tasks requiring signal discrimination and symbol counting with a multi-UAS simulation configured to impose low cognitive demands, for 2 hr. Automation reliability was manipulated between-groups. Five Factor Model personality traits were measured prior to performance. Subjective states were assessed with the Dundee Stress State Questionnaire. Results: Performance accuracy on the more demanding surveillance task showed a vigilance decrement, especially when automation reliability was low. Dependence on automation on this task declined over time. State but not trait factors predicted performance. High distress was associated with poorer performance in more demanding task conditions. Conclusions: Vigilance decrement may be an operational issue for multi-UAS surveillance missions. Warm's resource theory may require modification to incorporate changes in information processing and task strategy associated with multitasking in low-workload, fatiguing environments. Application: Interface design and operator evaluation for multi-UAS operations should address issues including motivation, stress, and sustaining attention to automation.