Theoretical foundations for integrating sound in interactive interfaces: identifying temporal and spatial information conveyance principles (2009)
DOI: 10.1080/14639220701734455

Cited by 5 publications (7 citation statements). References 47 publications.
“…The inability to accurately differentiate between sound sources can lead to increases in cognitive workload and decreases in the effectiveness of task performance. Sound source localization in humans is typically described as a coordinate point with characteristic azimuth, elevation, and distance relative to the listener's head position in space [2]. Horizontal plane localization refers to the azimuth and distance components of the sound source location coordinates when the elevation of the source is equal to the head height.…”
Section: Introduction (mentioning, confidence: 99%)
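As an illustrative aside (not part of the cited text), the head-relative coordinate description above can be sketched as a simple spherical-to-Cartesian conversion. The function name and axis convention below are assumptions chosen for clarity, not anything defined in the cited work.

```python
import math

def head_relative_to_cartesian(azimuth_deg, elevation_deg, distance_m):
    """Convert a head-relative (azimuth, elevation, distance) triple to
    Cartesian coordinates (x: right, y: front, z: up), origin at the head.
    The axis convention here is an assumption for illustration."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = distance_m * math.cos(el) * math.sin(az)   # lateral offset
    y = distance_m * math.cos(el) * math.cos(az)   # forward offset
    z = distance_m * math.sin(el)                  # vertical offset
    return x, y, z

# Horizontal-plane localization: elevation equal to head height (0 degrees),
# so only azimuth and distance vary.
print(head_relative_to_cartesian(30.0, 0.0, 2.0))
```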
“…The cues that are typically used to predict the sound source distance include the sound intensity, the frequency content and the decay characteristics of the sound [1]. Of these cues, sound intensity is the primary cue for distance estimation [2]. There is strong evidence that when a sound source is relatively distant from the listener, the auditory cues that allow a listener to localize the sound are largely independent of distance [3,7].…”
Section: Introduction (mentioning, confidence: 99%)
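To make the intensity cue concrete, the following minimal sketch assumes inverse-square (spherical) spreading of a point source in a free field, which is a standard simplification rather than a claim from the cited papers; the function and reference values are hypothetical.

```python
import math

def level_at_distance(level_ref_db, dist_ref_m, dist_m):
    """Sound pressure level of a point source under inverse-square spreading:
    roughly -6 dB per doubling of distance. A free-field simplification
    used here only to illustrate intensity as a distance cue."""
    return level_ref_db - 20.0 * math.log10(dist_m / dist_ref_m)

# Example: a cue measuring 70 dB SPL at 1 m drops to about 64 dB at 2 m
# and about 58 dB at 4 m, a gradient a listener can exploit to judge distance.
for d in (1.0, 2.0, 4.0):
    print(f"{d:>4.1f} m -> {level_at_distance(70.0, 1.0, d):.1f} dB SPL")
```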
“…Any data group can be mapped to a sound group of type both. We based this decision on work done by Ahmad et al. [1], who proposed that instant-based temporal information (in our case 'instant' sounds) is used to specify a point in time, whereas interval-based temporal information ('interval' sounds) is used to indicate status or progress.…”
Section: Policy (mentioning, confidence: 99%)
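The instant/interval distinction quoted above can be sketched as a small mapping rule. The enum and function names below are hypothetical and only illustrate the policy described, under the assumption that momentary events map to instant sounds and ongoing states to interval sounds.

```python
from enum import Enum

class SoundType(Enum):
    INSTANT = "instant"    # marks a point in time (e.g., an event notification)
    INTERVAL = "interval"  # conveys status or progress over a span of time

def choose_sound_type(is_momentary_event: bool) -> SoundType:
    """Toy mapping rule in the spirit of the quoted policy: momentary data
    events get an instant sound; ongoing states get an interval sound."""
    return SoundType.INSTANT if is_momentary_event else SoundType.INTERVAL

print(choose_sound_type(True))    # SoundType.INSTANT  (e.g., a message arrives)
print(choose_sound_type(False))   # SoundType.INTERVAL (e.g., a download in progress)
```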
“…Utilizing current auditory display design principles based on perceptual and theoretical findings covered in Bregman's (1990) Auditory Scene Analysis and theoretical models that focus on integrating sound in interactive interfaces (Ahmad, Stanney, & Fouad, 2009), our aim was to develop an auditory display that improves head-up monitoring of an aircraft's position relative to a flight plan. We employed the audio integration, temporal audio, and spatial audio theoretical models defined by Ahmad et al. (2009) as guidelines for determining the cognitive performance objectives and subsequent acoustic wave attributes for each auditory cue. Auditory cues were used to guide psychomotor activity when adjustments were being made to the aircraft's control yoke to correct and maintain accurate flight navigation.…”
Section: Introduction (mentioning, confidence: 99%)
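As a hedged sketch of the kind of mapping the flight-deck citation describes (deviation from the flight plan driving acoustic wave attributes of a cue), the code below picks stereo pan and pulse rate as example attributes. These particular attributes, ranges, and the function name are assumptions for illustration, not the display design reported in the citing paper.

```python
def deviation_to_cue(lateral_dev_m, max_dev_m=200.0):
    """Map lateral deviation from the flight plan to two acoustic wave
    attributes: stereo pan (spatial audio: which side to correct toward)
    and pulse rate in Hz (temporal audio: urgency of the correction).
    Attribute choices and ranges are illustrative assumptions."""
    frac = max(-1.0, min(1.0, lateral_dev_m / max_dev_m))
    pan = -frac                              # cue panned toward the required correction
    pulse_rate_hz = 1.0 + 4.0 * abs(frac)    # faster pulses as deviation grows
    return pan, pulse_rate_hz

print(deviation_to_cue(50.0))    # small drift to the right -> slight left pan, slow pulses
print(deviation_to_cue(-180.0))  # large drift to the left -> strong right pan, fast pulses
```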