An evaluation of earcons was carried out to see whether they are an effective means of communicating information in sound. An initial experiment showed that earcons were better than unstructured bursts of sound and that musical timbres were more effective than simple tones. A second experiment was then carried out which addressed some of the weaknesses revealed in Experiment 1 and gave a significant improvement in recognition. From the results of these experiments, some guidelines were drawn up for use in the creation of earcons. Earcons have been shown to be an effective method for communicating information in a human-computer interface.

Providing information in an auditory form could generally help solve this problem and allow visually disabled users the same facilities as the sighted. This evaluation is part of a research project looking at the best ways to integrate audio and graphical interfaces. The research aims to find the areas in an interface where the use of sound will be most beneficial and also what types of sounds are the most effective for communicating information.

One major question that must be answered when creating an auditory interface is: what sounds should be used? Brewster [2] outlines some of the different systems available. Gaver's auditory icons have been used in several systems, such as the SonicFinder [5], SharedARK [6] and ARKola [7]. These use environmental sounds that have a semantic link with the object they represent, and they have been shown to be an effective form of presenting information in sound. One other important, and as yet untested, method of presenting auditory information is the system of earcons [1, 13, 14]. Earcons are abstract, synthetic tones that can be used in structured combinations to create sound messages that represent parts of an interface. Blattner et al. define earcons as "non-verbal audio messages that are used in the computer/user interface to provide information to the user about some computer object, operation or interaction". Earcons are composed of motives: short, rhythmic sequences of pitches with variable intensity, timbre and register.
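The idea of building families of earcons from a shared motive can be illustrated with a small sketch. This is not code from any of the papers above; it is a hypothetical Python model in which a motive fixes the rhythm, and related earcons for a family of interface objects are derived by varying pitch while keeping timbre and rhythm constant:

```python
from dataclasses import dataclass

@dataclass
class Note:
    pitch: int       # MIDI note number (60 = middle C)
    duration: float  # length in beats

@dataclass
class Motive:
    notes: list      # the rhythmic pitch sequence
    timbre: str      # e.g. an instrument name
    intensity: float # 0.0 - 1.0

def transpose(motive: Motive, semitones: int) -> Motive:
    """Derive a related earcon: same rhythm and timbre, shifted register."""
    return Motive(
        [Note(n.pitch + semitones, n.duration) for n in motive.notes],
        motive.timbre,
        motive.intensity,
    )

# Hypothetical "file" family earcon: a three-note rising motive on a piano timbre.
file_motive = Motive([Note(60, 0.5), Note(64, 0.5), Note(67, 1.0)], "piano", 0.8)

# A related earcon for, say, "open file": identical rhythm, higher register.
open_file = transpose(file_motive, 7)
```

Because rhythm and timbre are preserved across the family, a listener can learn to associate the motive with the object type and the register shift with the operation, which is the structured-combination property the definition above describes.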
This paper discusses the use of gesture and non-speech audio as ways to improve the user interface of a mobile music player. These techniques allow users to operate the player without looking at its controls while on the move. Two very different evaluations of the player took place: one based on a standard usability experiment (comparing the new player to a standard design) and the other a video analysis of the player in use. Both showed significant usability improvements for the gesture/audio-based interface over a standard visual/pen-based display. The similarities and differences in the results produced by the two studies are discussed.
This paper presents a first evaluation of multimodal language-based warnings for handovers of control in autonomous cars. A set of possible handover situations varying in urgency is described. A set of multimodal, language-based warnings for these situations is then introduced. All combinations of audio, tactile and visual warnings for handovers were evaluated in terms of perceived urgency, annoyance and alerting effectiveness. Results showed clear recognition of the warning urgency in this new context, as well as low perceived annoyance overall, and higher perceived effectiveness for critical warnings. The time of transition from self-driving to manual mode in the presence of the warnings was then evaluated. Results showed quicker transitions for highly urgent warnings and poor driving performance for unimodal visual warnings. These results provide a novel set of guidelines for an effective transition of control between car and driver in an autonomous vehicle.
In this paper we discuss the design of computer-based haptic graphs for blind and visually impaired people, supported by our preliminary experimental results. Since visual impairment makes visual data-visualisation techniques inappropriate for blind people, we are developing a system which makes graphs accessible through haptic and audio media. The disparity between human haptic perception and the sensation simulated by force feedback devices is discussed, and our strategies to tackle the technical difficulties posed by the limitations of force feedback devices are explained. Based on the results of experiments conducted with both blind and sighted people, we suggest two techniques for modelling curved lines on haptic graphs: engraving and the use of texture. Integration of surface properties and auditory cues into our system is proposed to assist blind users in exploring haptic graphs.
Mobile and wearable computers present input/output problems due to limited screen space and interaction techniques. When mobile, users typically focus their visual attention on navigating their environment, making visually demanding interface designs hard to operate. This paper presents two multimodal interaction techniques designed to overcome these problems and allow truly mobile, 'eyes-free' device use. The first is a 3D audio radial pie menu that uses head gestures for selecting items. An evaluation of a range of different audio designs showed that egocentric sounds reduced task completion time and perceived annoyance, and allowed users to walk closer to their preferred walking speed. The second is a sonically enhanced 2D gesture recognition system for use on a belt-mounted PDA. An evaluation of the system with and without audio feedback showed that users' gestures were more accurate when dynamically guided by audio feedback. These novel interaction techniques demonstrate effective alternatives to visual-centric interface designs on mobile devices.
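The core of a head-gesture radial pie menu is mapping head yaw to a menu slice. The following is a minimal hypothetical sketch, not the implementation from the paper: it assumes items are spread evenly around the user (egocentrically, so slice positions follow the head's frame of reference) and snaps a yaw reading in degrees to the nearest slice:

```python
def menu_item_for_heading(heading_deg: float, n_items: int = 4) -> int:
    """Map a head-yaw reading (degrees, 0 = straight ahead) to a pie-menu
    slice index. Item 0 is centred straight ahead; items increase clockwise,
    each occupying an equal angular slice of the full circle."""
    slice_width = 360.0 / n_items
    # Normalise to [0, 360) and shift by half a slice so each item's
    # slice is centred on its nominal direction before snapping.
    return int(((heading_deg % 360.0) + slice_width / 2) // slice_width) % n_items
```

With four items, looking straight ahead (0°) selects item 0, turning to 90° selects item 1, and small deviations such as 350° or −10° still snap back to item 0, which is the tolerance a user needs when selecting while walking.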