2022 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)
DOI: 10.1109/vr51125.2022.00042
Empirical Evaluation of Calibration and Long-term Carryover Effects of Reverberation on Egocentric Auditory Depth Perception in VR

Cited by 6 publications (3 citation statements)
References 43 publications
“…For example, the visual content affects the perceived audio quality [18], sound localisation (ventriloquism effect) [19][20][21][22] and distance perception of audio sources [23][24][25]. Sound localisation benefits from the presence of a visual-spatial frame [26], and visual distance perception benefits from the presence of reverberation [27]. However, the perception of reverberation may not be affected by visual room impressions in virtual environments [28].…”
Section: Visual Impact on the Perception of Spatial Audio
confidence: 99%
“…Then, [52] used HMDs to investigate the causal agents of head trauma in athletes. Also, in [53], the authors used HMDs to compare user preferences for different virtual navigation instructions that integrate with the real environment in the MR world. Lastly, [54] built a custom HMD to evaluate performance and eye fatigue for context and focal switching in AR.…”
Section: Head-Mounted Devices
confidence: 99%
“…This classification is based on whether the experiment requires any type of response or input behavior from participants. Examples of users' feedback include asking participants to report a target's color [61], [67], to fill in a questionnaire after the experiment is over [6], [68], or automatically logged responses, such as time spent on achieving pre-set goals [53], [69], or accuracy in putting virtual objects in the correct position in the environment [70], [71]. Studies that incorporate users' responses are the most common category among the types of measurement, likely because capturing direct feedback from users is a quick and easy way to obtain experimental data.…”
Section: Categorization: Environmental Factors, User or Device Measure...
confidence: 99%