Objective: First-degree relatives of persons with an autism spectrum disorder (ASD) are at increased risk for ASD-related characteristics. Because little is known about the early expression of these characteristics, this study characterizes the non-ASD outcomes of 3-year-old high-risk (HR) siblings of children with ASD. Method: Two groups of children without ASD participated: 507 HR siblings and 324 low-risk (LR) control subjects (no known relatives with ASD). Children were enrolled at a mean age of 8 months, and outcomes were assessed at 3 years. Outcome measures were Autism Diagnostic Observation Schedule (ADOS) calibrated severity scores and Mullen Verbal and Non-Verbal Developmental Quotients (DQ). Results: At 3 years, HR siblings without an ASD outcome exhibited higher mean ADOS severity scores and lower verbal and non-verbal DQs than LR controls. HR siblings were over-represented (21% HR versus 7% LR) in latent classes characterized by elevated ADOS severity and/or low to low-average DQs. The remaining HR siblings without ASD outcomes (79%) belonged to classes in which they were not differentially represented relative to LR controls. Conclusions: Even after a previously identified 18.7% of HR siblings with ASD outcomes had been removed from all analyses, the remaining HR siblings exhibited higher mean levels of ASD severity and lower levels of developmental functioning than LR children. However, the latent class membership of four-fifths of the HR siblings did not differ significantly from that of LR control subjects; the remaining one-fifth belonged to classes characterized by higher ASD severity and/or lower levels of developmental functioning. This empirically derived characterization of an early-emerging pattern of difficulties in a minority of 3-year-old HR siblings underscores the importance of developmental surveillance and early intervention for these children.
Objective: Numerous studies have identified abnormal gaze in individuals with autism, yet only a limited number of findings have been replicated, the magnitude of effects is unclear, and the pattern of gaze differences across stimuli remains poorly understood. To address these gaps, we conducted a comprehensive meta-analysis of autism eye-tracking studies. Method: PubMed and manual searches of 1,132 publications were used to identify studies comparing looking behavior toward social and/or nonsocial stimuli between individuals with autism and controls. Sample characteristics, eye-tracking methods, stimulus features, and regions of interest (ROIs) were coded for each comparison within each study. Multivariate mixed-effects meta-regression analyses examined the impact of study methodology, stimulus features, and ROI on effect sizes derived from comparisons using gaze fixation metrics. Results: The search yielded 122 independent studies with 1,155 comparisons. Estimated effect sizes tended to be small to medium but varied substantially across stimuli and ROIs. Overall, nonsocial ROIs yielded larger effect sizes than social ROIs; however, eye and whole-face regions from stimuli depicting human interaction produced the largest effects (Hedges' g = 0.47 and 0.50, respectively). Studies with weaker designs or reporting yielded larger effects, but key effects remained significant and medium-sized even in high-rigor designs. Conclusion: Individuals with autism show a reliable pattern of gaze abnormalities that suggests a basic problem with selecting socially relevant versus irrelevant information for attention, one that persists across age and worsens during perception of human interactions. Aggregating gaze abnormalities across stimuli and ROIs could yield clinically useful risk assessment and quantitative, objective outcome measures.
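For reference, Hedges' g (reported above) is a standardized mean difference with a small-sample bias correction; the sketch below gives the standard textbook definition rather than anything specific to this meta-analysis, with group means, standard deviations, and sample sizes denoted X̄, s, and n:

$$
g \;=\; J \cdot \frac{\bar{X}_1 - \bar{X}_2}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} \;=\; \sqrt{\frac{(n_1 - 1)\,s_1^2 + (n_2 - 1)\,s_2^2}{n_1 + n_2 - 2}},
\qquad
J \;\approx\; 1 - \frac{3}{4(n_1 + n_2) - 9}
$$

On this scale, g ≈ 0.5 corresponds to a between-group difference of roughly half a pooled standard deviation, which is conventionally read as a medium effect.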
Emotion recognition was investigated in typically developing individuals and individuals with autism. Experiment 1 tested children (5 to 7 years, n = 37) with brief video displays of facial expressions that varied in subtlety; children with autism performed worse than the control children. In Experiment 2, three age groups (8 to 12 years, n = 49; 13 to 17 years, n = 49; and adults, n = 45) were tested on the same stimuli. Whereas performance among control individuals was best in the adult group, the performance of individuals with autism was similar across all age groups. Results are discussed with respect to the underlying cognitive processes that may affect the development of emotion recognition in individuals with autism.
A multiple-habituation paradigm was used to determine whether 10- to 12-month-old infants could discriminate between visual arrays that differed only in numerosity (2 vs. 3, 3 vs. 4, or 4 vs. 5 items). Ninety-six infants were tested in one of two conditions. In the heterogeneous condition, infants were habituated to a series of slides in which only the number of items remained invariant, while item type (e.g., dogs, houses), size, and position varied on each slide. In the homogeneous condition, both item type (chicks) and number remained invariant, while the size and position of the stimuli varied. Infants in both conditions were then tested with slides containing either N + 1 or N - 1 items. The results demonstrated that, regardless of condition (homogeneous or heterogeneous), infants were able to discriminate between 2 and 3 items but unable to discriminate between 4 and 5 items. For the 3 versus 4 discrimination, a condition × sex interaction indicated that females discriminated between the arrays in the homogeneous condition, whereas males did so in the heterogeneous condition. Since the subjects in this study were preverbal infants, the results suggest that early counting skills are preceded by a more perceptual awareness of numerosity.
The primary purpose of this study was to investigate whether preverbal infants, when presented with exemplars of an artificially constructed category, would abstract a prototypical representation of the category, and if so, whether this representation was formed by "counting" or by "averaging" the features that varied among category members. Two experiments are reported. In Experiment 1, a set of stimuli was developed and tested for which it was demonstrated that adult subjects would readily abstract either a modal or an average prototypical representation; the type of representation abstracted depended on the discriminability of the feature values. In Experiment 2, 10-month-old infants were tested using a habituation paradigm with the stimuli developed in the first experiment. The results indicated that the infants were also able to abstract the featural information that varied among the exemplars of the category, and that they formed an internal representation of the category by averaging feature values. Thus, the results imply that infants are able to constructively process visual information and hence take a more active role in category formation than had previously been believed.