Although falling is a significant problem for older persons, little is understood about its underlying causes. Spatial cognition and balance maintenance rely on the efficient integration of information across the main senses. We investigated general multisensory efficiency in older persons with a history of falls compared to age- and sensory-acuity-matched controls and younger adults using a sound-induced flash illusion. Older fallers were as susceptible to the illusion as age-matched non-fallers or younger adults at a short stimulus onset asynchrony (SOA) of 70 ms between the auditory and visual stimuli. Both older adult groups were more susceptible to the illusion at longer SOAs than younger adults. However, with increasing delays between the visual and auditory stimuli, older fallers did not show a decline in the frequency at which the illusion was experienced, even with delays of up to 270 ms. We argue that this relatively higher susceptibility to the illusion reflects inefficient audio-visual processing in the central nervous system and has important implications for the diagnosis and rehabilitation of falling in older persons.
A common practice in youth rugby union is to group players based on (bi)annual age with fixed cut-off dates. The overrepresentation of players born early in the selection year and the underrepresentation of players born toward the end of the selection year are termed relative age effects (RAEs). The aim of this study was to examine RAEs at entry into professional and international rugby union pathways in England, and to compare each youth cohort with its respective senior cohort: U15 Regional Academy Player (n = 1,114) vs. Senior Professional Player (n = 281) and U16–23 England Academy Player (n = 849) vs. Senior International Player (n = 48). Chi-square (χ²) analysis compared birth quarter (BQ) distributions against expected distributions. Odds ratios and 95% confidence intervals compared the likelihood of a BQ being selected. Findings revealed a significant overrepresentation of relatively older players compared with their relatively younger peers within both youth cohorts (P < 0.001; BQ1 = 42.5% vs. BQ4 = 9.6%; BQ1 = 36.5% vs. BQ4 = 15.2%). In comparison, there was no significant difference in the BQ distributions within either senior cohort. Further, BQ4s were 3.86 and 3.9 times more likely than BQ1s and BQ2s, respectively, to achieve senior professional and international levels. It is suggested that relatively younger players may have a greater likelihood of achieving expertise following entry into a rugby union talent pathway because they benefit from more competitive play against relatively older counterparts during their development (e.g., reversal effects; the underdog hypothesis). Moreover, possible solutions (e.g., age and anthropometric banding; playing-up and playing-down) are discussed to encourage practitioners and policy makers to create the most appropriate learning environment for every player.
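The χ² comparison of birth-quarter distributions described in this abstract can be sketched as follows. Only BQ1 = 42.5% and BQ4 = 9.6% of the U15 cohort (n = 1,114) are reported above, so the BQ2 and BQ3 counts below are illustrative assumptions; this is a minimal illustration of the test against a uniform baseline, not the authors' analysis code.

```python
# Hypothetical birth-quarter (BQ) counts for the U15 cohort (n = 1,114).
# BQ1 (~42.5%) and BQ4 (~9.6%) follow the abstract; BQ2/BQ3 are assumed.
observed = [473, 310, 224, 107]
n = sum(observed)

# Expected counts under a uniform birth-quarter distribution.
expected = [n / 4] * 4

# Pearson chi-square statistic with df = 3.
chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# The critical value for chi-square(df=3) at P < 0.001 is ~16.27,
# so a statistic above that rejects the uniform-distribution null.
significant = chi2 > 16.27
```

With these illustrative counts the statistic is far above the P < 0.001 critical value, mirroring the significant BQ1-over-BQ4 skew the study reports in the youth cohorts.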
Recent studies suggest that multisensory integration is enhanced in older adults, but it is not known whether this enhancement is solely driven by perceptual processes or is also affected by cognitive processes. Using the “McGurk illusion,” in Experiment 1 we found that audio-visual integration of incongruent audio-visual words was higher in older adults than in younger adults, although recognition of audio-only or visual-only presented words was the same across groups. In Experiment 2 we tested recall of sentences within which an incongruent audio-visual speech word was embedded. The overall semantic meaning of the sentence was compatible with either one of the unisensory components of the target word and/or with the illusory percept. Older participants recalled more illusory audio-visual words in sentences than younger adults; however, there was no differential effect of word compatibility on recall for the two groups. Our findings suggest that the relatively high susceptibility to the audio-visual speech illusion in older participants is due more to perceptual than to cognitive processing.
Self-rated health is not simply the absence of physical illness but is also strongly influenced by the degree of functional impairment, personality factors and the level of stress experienced by the older person.
Previous studies have found that perception in older people benefits from multisensory over unisensory information. As normal speech recognition is affected by both the auditory input and the visual lip movements of the speaker, we investigated the efficiency of audio-visual integration in an older population by manipulating the relative reliability of the auditory and visual information in speech. We also investigated the role of the semantic context of the sentence to assess whether audio-visual integration is affected by top-down semantic processing. We presented participants with audio-visual sentences in which the visual component was either blurred or not blurred. We found a greater cost in recall performance for semantically meaningless speech in the audio-visual 'blur' compared to the audio-visual 'no blur' condition, and this effect was specific to the older group. Our findings have implications for understanding how aging affects efficient multisensory integration for the perception of speech and suggest that multisensory inputs may benefit speech perception in older adults when the semantic content of the speech is unpredictable.