Objective: Given its impact on auditory perception, attention, and memory, listening effort (LE) is a significant aspect of the daily hearing experience of cochlear implant (CI) recipients. Reduced spectral and temporal information in an acoustic signal can make listening more difficult; it is therefore important to understand the relationship between LE and the spectral and temporal auditory processing capacities of CI recipients. Study Design, Setting, and Patients: This study used spectral ripple discrimination and the temporal modulation transfer function to evaluate 20 prelingually deafened, early-implanted CI recipients. LE was assessed with a dual-task paradigm combining a speech perception in noise test (primary task) and a digit recall task (DRT; secondary task). To assess the effect of acoustic hearing, contralateral aided acoustic hearing thresholds between 125 Hz and 8 kHz were also obtained. Correlation coefficients were computed to examine the relationships among the study variables, and the Mann-Whitney U test was used to compare unilateral and bimodal users. Results: LE correlated significantly with spectral ripple discrimination (r = 0.56; p = 0.011) and with thresholds at 125 Hz (r = 0.51; p = 0.020), 250 Hz (r = 0.48; p = 0.030), 500 Hz (r = 0.45; p = 0.045), 1,000 Hz (r = 0.51; p = 0.023), 2,000 Hz (r = 0.48; p = 0.031), and 4,000 Hz (r = 0.48; p = 0.031), whereas no statistically significant correlations were observed between LE and the temporal modulation transfer function at the four frequencies tested. There was no statistically significant difference between unilateral and bimodal CI recipients (p > 0.05). Conclusion: Owing to the improved signal-to-noise ratio in the auditory environment, CI users with better spectral resolution and acoustic hearing experience reduced LE. In contrast, temporal auditory processing, as measured by temporal modulation detection, does not contribute to LE.
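The abstract names its analyses (correlation coefficients and the Mann-Whitney U test) but not the software or the exact coefficient. The following is a minimal Python sketch of that kind of analysis using scipy, assuming a Spearman correlation (an assumption; the abstract only reports r values) and synthetic placeholder data; all variable names and numbers are illustrative, not from the study.

```python
# Illustrative sketch of the reported analysis: correlation between LE and a
# spectral measure, plus a unilateral vs. bimodal group comparison.
# Data below are randomly generated placeholders, NOT study data.
import numpy as np
from scipy.stats import spearmanr, mannwhitneyu

rng = np.random.default_rng(0)
n = 20  # 20 CI recipients, as in the study

listening_effort = rng.normal(size=n)                    # dual-task LE score (placeholder)
spectral_ripple = rng.normal(size=n)                     # spectral ripple discrimination (placeholder)
group = rng.choice(["unilateral", "bimodal"], size=n)    # device configuration (placeholder)

# Correlation between LE and spectral ripple discrimination
r, p = spearmanr(listening_effort, spectral_ripple)
print(f"LE vs. spectral ripple: r = {r:.2f}, p = {p:.3f}")

# Unilateral vs. bimodal comparison of LE (non-parametric)
u_stat, p_group = mannwhitneyu(
    listening_effort[group == "unilateral"],
    listening_effort[group == "bimodal"],
    alternative="two-sided",
)
print(f"Mann-Whitney U = {u_stat:.1f}, p = {p_group:.3f}")
```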
Objective: In social interaction, emotions are often conveyed through the visual and auditory modes together. We aimed to investigate the ability of school-aged children with cochlear implants (CIs) and healthy controls to recognize facial and/or auditory emotions. Methods: All participants were asked to identify facial emotions in Ekman and Friesen's pictures, then auditory emotions, and finally video-based dynamic, synchronous facial and auditory emotions. Results: Mean accuracy rates for recognizing anger (p = 0.025), surprise (p = 0.029), and neutral (p = 0.029) faces were significantly lower in children with CIs than in healthy controls. Children with CIs were also significantly worse than healthy controls at recognizing all auditory emotions except fear (p = 0.067). Mean accuracy rates for recognizing the video-based auditory/facial emotions of surprise (p = 0.031) and neutral (p = 0.029) were significantly lower in children with CIs. Conclusion: Children with hearing loss were poorer than healthy children at recognizing surprise, anger, and neutral facial emotions; however, they performed similarly in recognizing anger when both stimuli were presented synchronously, which may have a positive effect on social behaviors. Including emotion recognition training in rehabilitation programs appears beneficial.
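For illustration only, the sketch below shows how per-emotion recognition accuracy could be compared between two groups. The abstract does not name its statistical test, so a Mann-Whitney U comparison is assumed here, and all data and group sizes are synthetic placeholders rather than study values.

```python
# Hypothetical comparison of per-emotion recognition accuracy between a CI
# group and controls; test choice and data are assumptions, not from the study.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)
emotions = ["anger", "surprise", "neutral", "fear", "happiness", "sadness"]

# Placeholder per-child accuracy scores (proportion correct), not real data.
ci_group = {e: rng.uniform(0.4, 0.9, size=30) for e in emotions}
controls = {e: rng.uniform(0.6, 1.0, size=30) for e in emotions}

for emotion in emotions:
    u, p = mannwhitneyu(ci_group[emotion], controls[emotion], alternative="two-sided")
    print(f"{emotion:>9}: CI mean = {ci_group[emotion].mean():.2f}, "
          f"control mean = {controls[emotion].mean():.2f}, p = {p:.3f}")
```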