2019
DOI: 10.1038/s41598-019-44267-3

Interactions between egocentric and allocentric spatial coding of sounds revealed by a multisensory learning paradigm

Abstract: Although sound position is initially head-centred (egocentric coordinates), our brain can also represent sounds relative to one another (allocentric coordinates). Whether reference frames for spatial hearing are independent or interact remained largely unexplored. Here we developed a new allocentric spatial-hearing training and tested whether it can improve egocentric sound-localisation performance in normal-hearing adults listening with one ear plugged. Two groups of participants (N = 15 each) performed an eg…

Cited by 16 publications (28 citation statements) | References 59 publications
“…In addition, because physical sound sources are present, these approaches face the problem of controlling the contribution of visual cues to sound localization, as any visual prior will contribute to the interpretation of auditory cues (Jackson & Morton, 1984; Makous & Middlebrooks, 1990). Participants are therefore blindfolded, sometimes from the moment they enter the experimental room (e.g., Ahrens et al., 2019), or are instructed to close their eyes at specific moments during the task (e.g., ), or face speakers hidden behind a fabric panel of some sort (e.g., Rabini et al., 2019).…”
Section: Sound Localization in 3D
Confidence: 99%
“…Finally, following previous work (Rabini et al., 2019; Steadman et al., 2019; Valzolgher, Campus, Rabini, Gori & Pavani, under review), we examined whether re-learning across blocks relates to the magnitude of the immediate ear-plug effect. To this aim, we correlated across participants the ear-plug effect (measured as the difference in absolute localisation error between B and M1; positive values indicate a larger plug effect) and re-learning (measured as the difference in absolute localisation error between M1 and M4; positive values indicate larger re-learning).…”
Section: Re-learning
Confidence: 99%
“…In addition, they change on a daily basis, due to the diversity of listening contexts to which we are all exposed (Majdak, Goupell & Laback, 2010). To cope with these continuous changes in auditory cues, the brain remains capable of updating sound-space correspondences throughout life (e.g., Carlile, Balachandar & Kelly, 2014; Keating & King, 2015; Rabini, Altobelli & Pavani, 2019; Strelnikov, Rosito & Barone, 2011; Van Wanrooij, John & Opstal, 2005). In the present work, we examined reaching to sounds in virtual reality as a multisensory-motor strategy for re-learning sound localisation in adulthood.…”
Section: Introduction
Confidence: 99%