Auditory perceptual simulation (APS) during silent reading refers to situations in which the reader actively simulates the voice of a character or other person depicted in a text. In three eye-tracking experiments, APS effects were investigated as people read utterances attributed to a native English speaker, a non-native English speaker, or no speaker at all. APS effects were measured via online eye movements and offline comprehension probes. Results demonstrated that inducing APS during silent reading produced observable differences in reading speed when readers simulated the speech of faster compared to slower speakers, and compared to silent reading without APS. Social attitude survey results indicated that readers' attitudes towards native and non-native speech did not consistently influence APS-related effects. APS of both native and non-native speech increased reading speed, facilitated deeper, less "good-enough" sentence processing, and improved comprehension compared to normal silent reading.
Translational applications of cognitive science depend on having predictive models at the individual, or idiographic, level. However, idiographic model parameters, such as working memory capacity, often need to be estimated from specific tasks, making them dependent on task-specific assumptions. Here, we explore the possibility that idiographic parameters reflect an individual's biology and can be identified from task-free neuroimaging measures. To test this hypothesis, we correlated a reliable behavioral trait, the individual rate of forgetting in long-term memory, with a readily available task-free neuroimaging measure, the resting-state EEG spectrum. Using an established, adaptive fact-learning procedure, the rate of forgetting for verbal and visual materials was measured in a sample of 50 undergraduates from whom we also collected eyes-closed resting-state EEG data. Statistical analyses revealed that the individual rates of forgetting were significantly correlated across verbal and visual materials. Importantly, both rates correlated with resting-state power levels in the low (13-15 Hz) and upper (15-17 Hz) portions of the beta frequency band. These correlations were particularly strong for visuospatial materials, were distributed over multiple fronto-parietal locations, and remained significant even after a correction for multiple comparisons (False Discovery Rate) and robust correlation methods were applied. These results suggest that computational models could be individually tailored for prediction using idiographic parameter values derived from inexpensive, task-free imaging recordings.
Translational applications of cognitive science depend on having predictive models at the individual, or idiographic, level. However, idiographic model parameters, such as working memory capacity, often need to be estimated from specific tasks, making them dependent on task-specific assumptions. Here, we explore the possibility that idiographic parameters reflect an individual's biology and can be identified from task-free neuroimaging measures. To test this hypothesis, we correlated a reliable behavioral trait, the individual rate of forgetting in long-term memory, with a readily available task-free neuroimaging measure, the resting-state EEG spectrum. Using an established, adaptive fact-learning procedure, the rate of forgetting for verbal and visual materials was measured in a sample of 50 undergraduates from whom we also collected eyes-closed resting-state EEG data. A statistical analysis revealed that the individual rates of forgetting were significantly correlated across verbal and visual materials, in agreement with previous results. Importantly, both rates correlated with power levels in the alpha (8-13 Hz) and low beta (13-15 Hz) frequency bands, with the correlation between the verbal rate of forgetting and low beta power over the right parietal site remaining significant even when accounting for multiple comparisons. The results suggest that computational models could be individually tailored for prediction using idiographic parameter values derived from inexpensive, task-free imaging recordings.