We describe an experiment to investigate whether structured audio messages, called earcons, could provide navigational cues in a menu hierarchy. A hierarchy of 27 nodes over four levels was created, with a sound for each node. Participants had to identify their location in the hierarchy by listening to an earcon. Results showed that participants could identify their location with over 80% accuracy, indicating that earcons are a powerful method of communicating hierarchy information. Participants were also tested to see whether they could identify where previously unheard earcons would fit in the hierarchy; they could do this with over 90% accuracy. These results show that earcons are a robust and extensible method of communicating hierarchy information in sound.
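The idea behind hierarchical earcons can be sketched in a few lines: each level of the hierarchy contributes a distinct sound attribute, and a node's earcon is the sequence of motifs along the path from the root, so deeper nodes sound longer and a new node at a known branch reuses its parent's prefix. The attribute names below (timbre, rhythm, pitch, tempo) are illustrative assumptions, not the specific sound parameters used in the experiment.

```python
# A minimal sketch of hierarchical earcons. Each level of the menu
# hierarchy is assigned one sound attribute (assumed names below), and a
# node's earcon is the ordered list of motifs along its path from the root.
LEVEL_ATTRIBUTES = ["timbre", "rhythm", "pitch", "tempo"]  # one cue per level

def earcon_for(path):
    """Return the earcon for a node given its path from the root,
    e.g. path = (2, 0, 1) for a level-3 node. Each step down the tree
    appends that level's motif, so deeper nodes get longer earcons."""
    return [(LEVEL_ATTRIBUTES[level], choice)
            for level, choice in enumerate(path)]

# A listener identifies the node by recognising, in order, the motif
# contributed at each level.
print(earcon_for((2, 0, 1)))

# Extensibility: a previously unheard sibling shares its parent's prefix,
# which is why unfamiliar earcons can still be placed in the hierarchy.
assert earcon_for((2, 0, 1))[:2] == earcon_for((2, 0))
```

This prefix-sharing property is what the second part of the experiment probes: a listener who knows the per-level motifs can locate an earcon they have never heard before.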
Traditionally, computer games are played with a keyboard and a mouse or a joystick, and play relies mainly on the visual and auditory senses. Tactile or haptic user interfaces and natural human movements, such as running, are seldom used in computer games. The Lumetila project (Virtual Space - User Interfaces of the Future) aims to develop a "natural" user interface for a computer game in which players use their body movements to control the game. To create an immersive, captivating and highly usable game, development will follow a Human-Centred Design approach, in which the game is designed and evaluated with end-users at every step of the iterative design process.
This work is part of the TIDE ACCESS Project 1001, which aims to create a mobile communication device for users with speech-motor and/or language-cognitive impairments. Users will use the device to compose the messages they want to communicate and then play them via synthetic speech. Such users often rely on pictographic languages (for example, Bliss [1]) to communicate: the pictures represent words or actions and can be combined to create complex messages. Users must be able to interact with the system as quickly as possible so that they can communicate effectively. In this paper we investigate the use of non-speech sound to facilitate this communication.