In this study from the DyAdd project, implicit learning was investigated through two paradigms in adults (18–55 years) with dyslexia (n = 36) or attention deficit/hyperactivity disorder (ADHD, n = 22) and in controls (n = 35). In the serial reaction time (SRT) task, there were no group differences in learning; however, participants with ADHD exhibited faster reaction times (RTs) than the other groups. In the artificial grammar learning (AGL) task, the groups did not differ from each other in their learning (i.e., grammaticality accuracy or similarity choices). Further, all three groups were sensitive to fragment overlap between learning- and test-phase items (i.e., similarity choices were above chance). Grammaticality performance of control participants was above chance, whereas that of participants with dyslexia and participants with ADHD did not differ from chance, indicating impaired grammaticality learning in these groups. Although the main indices of AGL performance, grammaticality accuracy and similarity choices, did not correlate with the neuropsychological variables reflecting dyslexia-related characteristics (phonological processing, reading, spelling, arithmetic), ADHD-related characteristics (executive functions, attention), or intelligence, explicit knowledge of the AGL grammar (i.e., the ability to freely generate grammatical strings) correlated positively with measures of phonological processing and reading. Further, SRT reaction times correlated positively with full-scale intelligence quotient (FIQ). We conclude that, in AGL, difficulties in learning the underlying rule structure (as measured by grammaticality) are associated with dyslexia and ADHD. However, learning in AGL is not related to the defining neuropsychological features of dyslexia or ADHD; instead, the resulting explicit knowledge relates to characteristics of dyslexia.
Latencies of sensory neurons vary depending on stimulus variables such as intensity, contrast, distance, and adaptation. Therefore, different parts of an object, as well as simultaneous environmental events, can often elicit non-simultaneous neural representations. Nevertheless, despite these neural timing discrepancies, our actions and object perceptions are usually veridical. Recent results suggest that this temporal veridicality is assisted by so-called simultaneity constancy, which actively compensates for neural timing asynchronies. We studied whether a corresponding compensation by simultaneity constancy could be learned in natural interaction with the environment, without explicit feedback. Brief stimuli, whose objective simultaneity or non-simultaneity was judged, consisted of flashes, clicks, or touches, and their cross-modal combinations. The stimuli were presented as two concurrent trains. Twenty-eight adult participants practised unimodal (visual, auditory, and tactile) and cross-modal (audiovisual, audiotactile, and visuotactile) simultaneity judgement tasks in eight sessions, two sessions per week. Effects of practice were tested 7 months later. All tasks showed improved judgements of simultaneity, and these improvements were long-lasting. This simultaneity learning did not affect relative temporal resolution (Weber fraction). Transfer of learning between the practised tasks was minimal, suggesting that simultaneity learning mechanisms are not centralised but modality-specific. Our results suggest that natural perceptual learning can generate simultaneity-constancy-like phenomena in a well-differentiated and long-lasting manner, and concomitantly in several sensory systems. Hebbian learning can explain how experience with environmental simultaneity and non-simultaneity develops the veridicality of perceived synchrony.
Involuntary switching of attention to distracting sounds was studied by measuring the effects of these events on auditory discrimination performance and event-related brain potentials (ERPs) in 6- to 11-year-old boys with attention deficit/hyperactivity disorder (ADHD) and comorbid oppositional defiant disorder (ODD) and in age-matched controls. The children were instructed to differentiate between two animal calls by pressing one response button to one call (e.g., a dog bark) and another button to the other (e.g., a cat mew). These task-relevant sounds were presented from one of two loudspeakers in front of the child, and there were occasional task-irrelevant changes in sound location, that is, in the loudspeaker used. In addition, novel sounds unrelated to the task (e.g., the sound of a hammer, rain, or a car horn) were presented from a loudspeaker behind the child. In the ADHD group, but not in the control group, the percentage of correct responses was lower for target sounds preceded by a novel sound than for targets not preceded by such a sound. In both groups, a biphasic positive P3a response to the novel sounds was observed in the ERPs. The later part of the P3a appeared to continue longer over the frontal scalp areas in the ADHD group than in the controls, presumably because the reorienting negativity (RON) ERP response following the P3a was smaller in the ADHD group than in the control group. This suggests that the children with ADHD had problems reorienting their attention to the current task after a distracting novel sound, leading to deteriorated performance in that task. The present study also indicates that children with ADHD and comorbid ODD show the same kind of distractibility as found in previous studies of children with ADHD without systematic comorbid ODD.