Purpose: Prior research has extensively documented challenges in recognizing verbal and nonverbal emotion among older individuals compared with younger counterparts, but the nature of these age-related changes remains unclear. The present study investigated how older and younger adults comprehend four basic emotions (anger, happiness, neutrality, and sadness) conveyed through verbal (semantic) and nonverbal (facial and prosodic) channels.
Method: A total of 73 older adults (43 women, M age = 70.18 years) and 74 younger adults (37 women, M age = 22.01 years) took part in a fixed-choice emotion recognition test in which emotions were presented visually via facial expressions or auditorily through prosody or semantics.
Results: The results confirmed age-related decline in emotion recognition across all channels except the identification of happy facial expressions. The two age groups also showed both commonalities and differences in their channel preferences: both groups displayed dominance of visual facial cues over auditory emotional signals, but in auditory emotion perception older adults preferred semantics whereas younger adults preferred prosody. Notably, the visual and semantic dominance effects observed in older adults were less pronounced for sadness and anger than for the other emotions. Older adults' emotion recognition difficulties and shifts in channel preference were correlated with their general cognitive abilities.
Conclusion: Together, the findings underscore that age-related difficulties in emotion perception and alterations in channel dominance, which vary by emotional category, are closely tied to overall cognitive functioning.
Supplemental Material: https://doi.org/10.23641/asha.27307251
Background/Objectives: Emotional prosody, the intonation and rhythm of speech that convey emotion, is vital to spoken communication because it adds essential context and nuance to the words being said. This study explored how listeners automatically process emotional prosody in speech, focusing on differing neural responses across prosodic categories and on potential sex differences.
Methods: The pilot data came from 11 male and 11 female adult participants (ages 18–28). In a multi-feature oddball paradigm, participants heard sequences of non-repeating English words spoken with emotional (angry, happy, sad) or neutral prosody while watching a silent movie.
Results: Both mismatch negativity (MMN) and P3a components were observed, indicating automatic perceptual grouping and neural sensitivity to emotional variation in speech. Women showed a stronger MMN to angry than to sad prosody, whereas men showed a stronger MMN to angry than to happy prosody. Happy prosody elicited the strongest P3a, but only in men.
Conclusions: The findings challenge the notion that all facets of emotion processing are biased toward female superiority. These results from 22 young adult native English speakers should nevertheless be interpreted with caution; data from a larger sample are needed to test their generalizability. Combined with findings from children and older adults, these preliminary data underscore the need to examine the mechanisms of emotional speech processing, accounting for category and sex differences across the lifespan from a longitudinal perspective.
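To make the paradigm concrete, below is a minimal sketch of how a multi-feature oddball stimulus sequence might be generated. The category roles (neutral prosody as the frequent standard; angry, happy, and sad prosody as deviants) follow the abstract, but the alternation scheme, block structure, and all parameter choices are illustrative assumptions, not the study's actual procedure.

```python
import random

# Illustrative sketch only: in a multi-feature oddball design, standards and
# deviants alternate (S D S D ...), with the deviant category varying from
# trial to trial. Neutral-as-standard and these three deviant categories are
# taken from the abstract; everything else here is an assumption.
DEVIANTS = ["angry", "happy", "sad"]


def build_sequence(n_blocks: int, seed: int = 0) -> list[str]:
    """Build an alternating standard/deviant sequence of prosody labels.

    Deviant categories are shuffled within each block, avoiding an
    immediate repeat of the same category across block boundaries.
    """
    rng = random.Random(seed)
    sequence: list[str] = []
    prev_deviant = None
    for _ in range(n_blocks):
        block = DEVIANTS[:]
        rng.shuffle(block)
        while block[0] == prev_deviant:  # no back-to-back deviant category
            rng.shuffle(block)
        for deviant in block:
            sequence += ["neutral", deviant]  # a standard precedes each deviant
        prev_deviant = block[-1]
    return sequence


if __name__ == "__main__":
    # e.g. ['neutral', 'sad', 'neutral', 'angry', 'neutral', 'happy', ...]
    print(build_sequence(2))
```

Each label would then be mapped to a spoken, non-repeating English word carrying that prosody; MMN and P3a are measured as responses to the deviant trials relative to the standards.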