Proceedings of the SIGCHI Conference on Human Factors in Computing Systems 2003
DOI: 10.1145/642611.642694

Multimodal 'eyes-free' interaction techniques for wearable devices

Abstract: Mobile and wearable computers present input/output problems due to limited screen space and interaction techniques. When mobile, users typically focus their visual attention on navigating their environment, making visually demanding interface designs hard to operate. This paper presents two multimodal interaction techniques designed to overcome these problems and allow truly mobile, 'eyes-free' device use. The first is a 3D audio radial pie menu that uses head gestures for selecting items. An evaluation of a r…
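The head-gesture selection the abstract describes can be illustrated with a minimal sketch that maps head yaw to a slice of a radial pie menu arranged around the listener in the horizontal plane. The function name, slice layout, and angle convention are illustrative assumptions, not the paper's actual implementation:

```python
def menu_item_from_yaw(yaw_deg: float, n_items: int) -> int:
    """Map a head yaw angle (degrees; 0 = straight ahead,
    positive = clockwise) to one of n_items equal slices of a
    radial pie menu laid out around the user's head.

    Assumed convention: the first slice (index 0) is centred
    directly ahead of the user.
    """
    slice_width = 360.0 / n_items
    # Shift by half a slice so slice 0 straddles 0 degrees,
    # then wrap negative angles into [0, 360).
    shifted = (yaw_deg + slice_width / 2.0) % 360.0
    return int(shifted // slice_width)
```

With four items, facing ahead selects item 0, turning the head 90° clockwise selects item 1, and turning 90° anticlockwise (yaw −90°) selects item 3.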

Cited by 187 publications (70 citation statements)
References 10 publications
“…By turning a knob and giving voice commands the user can interact with the system. Brewster et al (2003) created a mobile system based on Audio Windows by Cohen and Ludwig (1991). They used spatialized auditory icons localized in the horizontal plane either around or in front of the user's head.…”
Section: Related Work
confidence: 99%
“…Brewster et al developed an eyes-free gesture-audio interface for wearable devices [8]. The interface receives input from the user's head movements and hand gestures, and outputs 3D audio through headphones.…”
Section: Innovative Interaction Mechanisms
confidence: 99%
“…First, the prevalent use of gestures in human communications makes intuitive gesture input on computers possible. Gesture input commands can be designed so that traditional gesture meanings in human communications are carried on (e.g., [8]). Second, this type of input can provide advantages similar to those of direct manipulation on GUIs, and can reduce the memory demand for recalling commands (e.g., [15,31]).…”
Section: Innovative Interaction Mechanisms
confidence: 99%
“…Consistent with the underlying motivation for the current research, it has been observed that mobile devices must tolerate a broad range of external conditions and technology must be adaptable to changing external conditions (Satyanarayanan, 1996). Several studies have explicitly examined interaction solutions for mobile users such as systems that support eyes-free interactions by avoiding the standard graphical user interface found on most desktop systems (Brewster, Lumsden, Bell, Hall, & Tasker, 2003), a minimal attention user interface designed specifically for field workers (Pascoe, Ryan, & Morse, 2000), and speech and audio-based interactions with mobile devices (Sawhney & Schmandt, 2000). Of importance, mobile data entry and multiple demands for visual attention are often mentioned as key challenges that must be addressed as researchers seek to develop new solutions, which will allow for the more effective use of highly portable computing devices.…”
Section: Literature Review
confidence: 99%