2017
DOI: 10.1121/1.4988751
Using visual cues to perceptually extract sonified data in collaborative, immersive big-data display systems

Abstract: Recently, multi-modal presentation systems have gained much interest for studying big data with interactive user groups. One problem with these systems is providing a venue for both personalized and shared information. In particular, sound fields containing parallel audio streams can distract users from extracting necessary information. The way spatial information is processed in the brain allows humans to take in complicated visuals and focus on either details or the whole. However, temporal information, which can be b…

Cited by 4 publications (2 citation statements)
References 0 publications
“…Visual information extraction in the immersant space has been used to facilitate perceptual studies on other senses, such as auditory cue retrieval (Lee, Chabot, and Braasch 2017) and the effects of human-scale presentation of architectural renderings on design judgments (Elder 2017). Results in the former experiment discuss the importance of auditory localization for the speed and accuracy of extraction, while results in the latter indicate discrepancies in judgments between traditional architectural and human-scale presentations of designs (Chabot et al. 2018).…”
Section: Extended Realities With Panoramic Imagery
confidence: 99%
“…W. Lee, S. Chabot, and J. Braasch put forward that research in the field of big data mainly focuses on visual representation and information extraction and pays little attention to sound. The aim of their study was to evaluate the most efficient method of extracting visual data using auditory stimulus perception in an immersive environment [1]. Muslem A. and Abbas M. explored the effects of peer-supported immersive multimedia learning strategies on reading and oral production skills.…”
Section: Introduction
confidence: 99%