The following paper assesses how selected properties of a virtual reality system affect size-distance judgments in a virtual environment. Viewing condition (biocular vs. stereoscopic), image resolution, field of view, scene contrast, and target distance were manipulated while subjects attempted to match the attributes of a comparison object to those of a standard object. The general findings suggest that under more natural viewing conditions, size-distance judgments in virtual environments differ from those reported previously for physical environments, whereas under impoverished conditions performance in the two environments is similar. Findings of this kind can be expressed as design nomographs: each nomograph produces a design trade-off space that reveals ways to build a VR system that achieves a predicted level of human performance in the virtual environment. One aim of our group is to produce design trade-off nomographs of this nature.
Natural aural directional cueing in the cockpit should relieve the demands placed on the visual modality, reduce display clutter, and reduce the cognitive effort needed to process and extract meaning from coded formats. This experiment compared the effectiveness of three-dimensional (3-D) auditory cues to conventional visual and auditory methods of directing visual attention to peripheral targets. Five directional cues were evaluated: visual symbol, coded aural tone, speech cue, 3-D tone (white noise appearing to emanate from peripheral locations), and 3-D speech (speech cue appearing to emanate from peripheral locations). The results showed significant performance differences as a function of directional cue type in peripheral target task completion time, as well as in eye and head reaction time. Results such as these will help improve the application of directional sound in operational cockpits.
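The 3-D tone cue above relies on a sound appearing to emanate from a peripheral location. A minimal way to approximate that effect, sketched below, is to render a mono signal to stereo using interaural time and level differences. The head radius, the Woodworth ITD formula, and the ILD scaling are textbook approximations chosen for illustration, not the rendering method used in the study (which would more likely use measured HRTFs):

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m; average adult head radius (assumed)

def spatialize(mono, fs, azimuth_deg):
    """Render a mono signal at a given azimuth using interaural
    time and level differences (a crude, HRTF-free approximation)."""
    az = np.radians(azimuth_deg)
    # Woodworth approximation of the interaural time difference.
    itd = (HEAD_RADIUS / SPEED_OF_SOUND) * (abs(az) + abs(np.sin(az)))
    delay = int(round(itd * fs))
    # Broadband interaural level difference; 6 dB scaling is assumed.
    ild_db = 6.0 * abs(np.sin(az))
    near = mono
    far = np.concatenate([np.zeros(delay), mono])[: len(mono)]
    far = far * 10 ** (-ild_db / 20)   # far ear: delayed and attenuated
    if azimuth_deg >= 0:               # source on the right: left ear is far
        return np.stack([far, near], axis=1)
    return np.stack([near, far], axis=1)

fs = 44100
noise = np.random.default_rng(0).standard_normal(fs // 2)
stereo = spatialize(noise, fs, 60.0)   # white-noise cue 60 degrees right
```

With a positive azimuth the right channel leads and is louder, which is what gives the noise burst its apparent peripheral direction.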
The purpose of this study was to evaluate the impact of two different types of input devices on psychomotor performance in a “full” 3-D (volumetric) virtual environment. Three subjects used both a direct input device (hand-held stylus) and an indirect input device (six-degree-of-freedom, fixed-mounted force device) to accomplish aimed movement to a target located in one of eight depth planes. Directness was characterized in terms of the coincidence of action space and perception space and of natural kinematic arm-hand movements. Different instructions were used to place different demands on the coordination of movement. Aim-point performance was evaluated in terms of aiming speed, accuracy, and steadiness at the terminal point. A descriptive analysis showed consistently better performance on all measures with the direct device. A statistical analysis confirmed these trends, although significance was not always achieved owing to limited experimental power. In general, the results suggest an advantage for the direct input device in the range of 30% to 73% across target locations, depending on the type of performance measured. The data are discussed in terms of action and perception space coincidence and the coordination of the multiple degrees of freedom of external input devices.
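The aim-point measures above (movement time, terminal accuracy, steadiness) can be computed from a sampled movement trajectory. The sketch below is an illustrative reconstruction under assumed definitions (endpoint error as distance from the target, steadiness as RMS jitter in a final settle window); the function name, the 0.5 s window, and the measure definitions are assumptions, not the study's actual scoring procedure:

```python
import numpy as np

def aim_metrics(t, pos, target, settle_window=0.5):
    """Compute aimed-movement metrics from a sampled 3-D trajectory.

    t      : (n,) sample times in seconds
    pos    : (n, 3) stylus/cursor positions
    target : (3,) target location
    Returns movement time, terminal accuracy (endpoint error), and
    steadiness (RMS jitter about the mean terminal position).
    """
    movement_time = t[-1] - t[0]
    # Samples inside the final settle window define the terminal phase.
    terminal = pos[t >= t[-1] - settle_window]
    endpoint = terminal.mean(axis=0)
    accuracy = np.linalg.norm(endpoint - np.asarray(target))
    steadiness = np.sqrt(np.mean(np.sum((terminal - endpoint) ** 2, axis=1)))
    return movement_time, accuracy, steadiness

# Example: a straight reach that arrives at the target at 0.4 s and holds.
t = np.linspace(0.0, 1.0, 101)
target = np.array([0.1, 0.0, 0.3])
pos = np.outer(np.minimum(t / 0.4, 1.0), target)
mt, acc, steady = aim_metrics(t, pos, target)
```

In the noiseless example the trajectory reaches the target exactly and holds, so accuracy and steadiness are both zero; real trajectories would show nonzero jitter in the terminal phase.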
Eye-controlled switching has been proposed as a biocybernetic control approach that may increase system effectiveness while reducing pilot workload. In this experiment, six subjects selected discrete switches on the front panel of a cockpit simulator while manually tracking a target. In two eye-controlled methods, the subjects directed their gaze at the switch indicated by an auditory cue and then made a consent input (either a manual response or a verbal response). In a conventional manual condition, subjects selected the switches with their left hand. The analysis of mean switching time suggests that eye control is a feasible alternative when hands-off control is desired. Tracking performance was found to differ significantly among switching conditions, indicating the importance of quantifying the efficiency of candidate control methods in visual workload environments analogous to the intended application environment.
While the nature of eye and head displacements during target acquisitions in the horizontal plane has been studied frequently, such investigations in the vertical plane are comparatively scarce. In the experiment reported herein, the final displacements of the head, eye, and gaze were examined for target acquisitions in the vertical and horizontal planes. The subjects' task was to fixate on a central target until receiving a verbal command to fixate on one of four peripheral targets. The mean head, eye, and gaze displacements to targets in the vertical and horizontal planes were then analyzed. The MacAulay-Brown, Inc. (MacB) portions of this effort were performed in support of USAF AAMRL Contract Numbers F33615-85-C-0541 and F33615-82-C-0513, respectively.