Proceedings of the Second Annual ACM Conference on Assistive Technologies (ASSETS '96), 1996
DOI: 10.1145/228347.228350
Enhancing scanning input with non-speech sounds

Abstract: This work is part of the TIDE ACCESS Project 1001. The aim of this project is to create a mobile communication device for speech-motor and/or language-cognitive impaired users. People will use the device to create messages they want to communicate and then play those messages via synthetic speech. Such users often utilise pictographic languages (for example, Bliss [1]) to communicate. The pictures represent words or actions and can be combined to create complex messages. Users must be able to interact with the…

Cited by 12 publications (9 citation statements)
References 10 publications
“…For selecting an object, the user has to press a switch at the right time. Brewster et al. [1996] showed that added auditory feedback supported the scanning rhythm and helped users predict the right time to press the switch for selection, thus improving performance. Seifert [2002] studied feedback in gaze interaction by comparing 1) a continuously shown gaze cursor, 2) discrete feedback highlighting the target under focus, and 3) no visible feedback for the gaze position.…”
Section: Related Work
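The timed-selection mechanism described above can be reduced to a simple mapping from press time to the currently highlighted item. The sketch below is illustrative only, not taken from any of the cited papers; the function name and fixed scan period are hypothetical.

```python
# Illustrative sketch of linear scanning selection: a cursor advances through
# items at a fixed period, and a switch press selects whichever item is
# highlighted at press time. Names and parameters here are hypothetical.

def scanned_index(press_time_s: float, scan_period_s: float, n_items: int) -> int:
    """Return the index of the item highlighted when the switch is pressed."""
    if scan_period_s <= 0 or n_items <= 0:
        raise ValueError("scan period and item count must be positive")
    step = int(press_time_s // scan_period_s)  # how many advances have occurred
    return step % n_items                      # cursor wraps around the item list

# With a 0.5 s scan period over 4 items, a press at t = 1.2 s lands on item 2.
print(scanned_index(1.2, 0.5, 4))  # → 2
```

An auditory cue on each cursor advance, as studied by Brewster et al. [1996], gives the user a rhythmic reference for anticipating the next step boundary.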
“…Speech includes recorded voice or synthetic voices generated with text-to-speech applications. Associating non-speech sounds (auditory icons and earcons) with events has been shown to improve interaction with graphical human-computer interfaces (Gaver et al., 1991; Blattner et al., 1992; DiGiano et al., 1993; Brewster et al., 1996). Auditory icons are everyday sounds that can intuitively be associated with system events (Gaver et al., 1991; Gaver, 1993a,b), while earcons are abstract, musical tones that can be used in structured combinations to convey audio messages (Blattner et al., 1989; Brewster, 1998).…”
Section: Introduction
“…In [2], block-row-column scanning is augmented with auditory signals, exploiting the fact that temporal perception is more acute for auditory input than for visual input. Although preliminary results show that children enjoy the auditory input, it was not quantitatively shown to enhance performance.…”
Section: Related Work
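Block-row-column scanning, mentioned in the statement above, reduces selection from a grid to two timed presses: one to stop the row scan, one to stop the column scan. The sketch below is a minimal illustration under assumed inputs (press steps rather than timestamps); it is not code from the cited work.

```python
# Illustrative sketch of row-column scanning: the first press picks the row
# highlighted at that scan step, the second press picks the cell within it.
# Function name and grid contents are hypothetical.

def row_column_select(rows, press1_step: int, press2_step: int):
    """Pick a cell from a grid via two timed presses (row scan, then column scan)."""
    row = rows[press1_step % len(rows)]  # row highlighted at the first press
    return row[press2_step % len(row)]   # cell highlighted at the second press

grid = [["a", "b", "c"],
        ["d", "e", "f"],
        ["g", "h", "i"]]
print(row_column_select(grid, 1, 2))  # → f
```

Because each press must land within one scan step, pairing every step with an auditory cue (as in the work cited above) can help users, particularly children, keep the scanning rhythm.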