Recent achievements in hearing aid development, such as visually guided hearing aids, make it increasingly important to study movement behavior in everyday situations in order to develop test methods and evaluate hearing aid performance. In this work, audiovisual virtual environments (VEs) were designed for communication conditions in a living room, a lecture hall, a cafeteria, a train station, and a street environment. Movement behavior (head movement, gaze direction, and torso rotation) and electroencephalography signals were measured in these VEs in the laboratory for 22 younger and 19 older normal-hearing participants. These data establish a reference for future studies that will investigate the movement behavior of hearing-impaired listeners and hearing aid users for comparison. Questionnaires were used to evaluate the subjective experience in the VEs. A test–retest comparison showed that the measured movement behavior is reproducible and that the measures used in this study are reliable. Moreover, evaluation of the questionnaires indicated that the VEs are sufficiently realistic: the participants rated the acoustic realism of the VEs positively, and although they rated the visual realism lower, they felt to some extent present and involved in the VEs. Analysis of the movement data showed that movement behavior depends on the VE and the participant's age and is predictable in multitalker conversations and for moving distractors. The VEs and a database of the collected data are publicly available.
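To make the reliability claim concrete, here is a minimal sketch of how a test–retest comparison of movement measures can be computed. It assumes one summary score per participant and session; head-yaw dispersion is used as a stand-in measure, since the abstract does not specify the study's actual measures or statistics:

```python
import numpy as np

def head_yaw_dispersion(yaw_deg):
    """Summary score for one trial: spread of head yaw, in degrees."""
    return np.std(yaw_deg)

def test_retest_reliability(test_scores, retest_scores):
    """Pearson correlation of per-participant scores across two sessions."""
    return np.corrcoef(test_scores, retest_scores)[0, 1]

# Toy data: 20 participants, each measured in two sessions.
rng = np.random.default_rng(0)
trait = rng.uniform(5, 30, size=20)         # stable per-participant behavior
test = trait + rng.normal(0, 2, size=20)    # session 1 with measurement noise
retest = trait + rng.normal(0, 2, size=20)  # session 2 with measurement noise
print(f"test-retest r = {test_retest_reliability(test, retest):.2f}")
```

In practice an intraclass correlation coefficient would often be preferred over a plain Pearson correlation, since it also penalizes systematic shifts between sessions.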
Recent studies of hearing aid benefits indicate that head-movement behavior influences performance. To assess these effects systematically, movement behavior must be measured in realistic communication conditions. For this purpose, the use of virtual audiovisual environments with animated characters as visual stimuli has been proposed. It is unclear, however, how these animations influence the head- and eye-movement behavior of subjects. Here, two listening tasks were carried out with a group of 14 young normal-hearing subjects to investigate the influence of visual cues on head- and eye-movement behavior; on combined localization and speech-intelligibility task performance; and on perceived speech intelligibility, perceived listening effort, and the general impression of the audiovisual environments. Animated characters with different lip-syncing and gaze patterns were compared to an audio-only condition and to a video of real persons. Results show that movement behavior, task performance, and perception were all influenced by visual cues. The movement behavior of young normal-hearing listeners in animation conditions with lip-syncing was similar to that in the video condition. These results in young normal-hearing listeners are a first step towards using the animated characters to assess the influence of head-movement behavior on hearing aid performance.
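One way such condition effects on movement behavior can be quantified is sketched below: a single hypothetical summary metric (the fraction of time gaze stays within a tolerance of the active talker) is computed for simulated gaze traces in three visual conditions. The condition names, tolerance, and data are illustrative assumptions, not the study's actual analysis:

```python
import numpy as np

def on_talker_ratio(gaze_yaw_deg, talker_yaw_deg, tol_deg=10.0):
    """Fraction of samples in which gaze lies within tol_deg of the talker."""
    return np.mean(np.abs(gaze_yaw_deg - talker_yaw_deg) <= tol_deg)

# Toy gaze traces (1000 samples each) around a talker at +30 deg azimuth.
rng = np.random.default_rng(1)
talker = np.full(1000, 30.0)
conditions = {
    "audio_only": talker + rng.normal(0, 25, 1000),  # diffuse gaze
    "animated_lipsync": talker + rng.normal(0, 8, 1000),
    "video": talker + rng.normal(0, 7, 1000),
}
for name, gaze in conditions.items():
    print(f"{name:17s} on-talker ratio = {on_talker_ratio(gaze, talker):.2f}")
```

Comparable per-condition summaries (head-yaw range, number of gaze switches, and so on) allow one to quantify how closely behavior with animated characters matches behavior with real video.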
Head movements can improve sound-localization performance and speech intelligibility in acoustic environments with spatially distributed sources. However, they can also affect the performance of hearing aid algorithms: adaptive algorithms have to adjust to changes in the acoustic scene caused by head movement (the so-called maladaptation effect), and directional algorithms may no longer point in the optimal direction after the head has turned away (the so-called misalignment effect). In this article, we investigated the mechanisms behind these maladaptation and misalignment effects for a set of six standard hearing aid algorithms, using acoustic simulations based on pre-recorded databases so that the effects could be studied under controlled conditions. Experiment 1 investigated the maladaptation effect by analyzing hearing aid benefit after simulated rotational head movement in simple anechoic noise scenarios. The effects of movement parameters (start angle and peak velocity), noise-scenario complexity, and adaptation time were studied, as well as the recovery time of the algorithms. A significant maladaptation effect was found only in the least realistic scenario, an anechoic scene with a single noise source. Experiment 2 investigated the effects of maladaptation and misalignment using previously recorded natural head movements in acoustic scenes resembling everyday situations. In line with the results of Experiment 1, no maladaptation effect was found in these more realistic acoustic scenes. However, a significant effect of misalignment on the performance of directional algorithms was found, demonstrating the need to take head movement into account when evaluating directional hearing aid algorithms.
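The misalignment effect admits a simple back-of-the-envelope illustration: a head-worn directional microphone has a fixed look direction relative to the head, so turning the head away from the target attenuates the target itself. The sketch below uses a generic first-order cardioid pattern as a stand-in; none of the six algorithms from the article is modeled here:

```python
import numpy as np

def cardioid_gain_db(azimuth_deg):
    """Gain in dB of a first-order cardioid; 0 dB in the look direction."""
    amplitude = 0.5 * (1.0 + np.cos(np.radians(azimuth_deg)))
    return 20.0 * np.log10(np.maximum(amplitude, 1e-6))

# The target stays fixed in space, but after a head turn of `offset` degrees
# it sits at `offset` degrees relative to the aid's look direction.
for offset in (0, 15, 30, 45, 60, 90):
    print(f"head turned {offset:2d} deg -> target attenuated by "
          f"{-cardioid_gain_db(offset):4.1f} dB")
```

Under this toy pattern a 60-degree head turn already costs the target about 2.5 dB, in line with the article's finding that misalignment measurably degrades the performance of directional algorithms.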