Proceedings of the Second Annual ACM Conference on Assistive Technologies (ASSETS '96), 1996
DOI: 10.1145/228347.228366

A generic direct-manipulation 3D-auditory environment for hierarchical navigation in non-visual interaction

Cited by 43 publications (30 citation statements), with citing publications spanning 1999–2020. References 3 publications.
“…Finally, Savidis et al [13] used a non-visual 3D audio environment to allow blind users to interact with standard GUIs. Different menu items were mapped to different locations around the user's head.…”
Section: Previous Work On Auditory and Gestural Interfaces For Mobile (mentioning)
confidence: 99%
“…This is true not only in the real world but also in virtual, computer-based environments. Evidence for this comes from experience with the HOMER UIMS approach of Savidis and Stephanidis [23], which develops dual user interfaces for integrating blind and sighted people. To achieve this, standard visualization elements such as control icons, tool menus, shortcuts, logical structures with nodes and links, hypertext, images, and animated sequences are enriched with acoustic elements or haptic interfaces, which allow the user to interact directly with objects of the model the computer uses to represent the problem being explained or presented.…”
Section: Parallel Reception Modes (mentioning)
confidence: 99%
“…The HOMER UIMS approach by Anthony Savidis and Constantine Stephanidis [22,23] develops dual user interfaces for the integration of blind and sighted. HOMER supports the integration of visual and non-visual interaction objects and their relationships.…”
Section: Related Tools For Blind Users (mentioning)
confidence: 99%
“…Our approach tries to extend these concepts by testing the hypothesis that a 3D sound navigable environment can create some mental images and serve as an aural representation of the space and surrounding entities such as the ones explored in previous studies [13,15,16,20]. Kobayashi [11] explores the idea of sound-space association to enable simultaneous speaker listening, but spatial navigation is not included.…”
Section: Introduction (mentioning)
confidence: 99%