Autism is a developmental disorder evident from infancy, yet its clinical identification requires expert diagnostic training. New evidence indicates that disruption to motor timing and integration may underpin the disorder, providing a potential new computational marker for its early identification. In this study, we employed smart tablet computers with touch-sensitive screens and embedded inertial movement sensors to record the movement kinematics and gesture forces made by 37 children aged 3–6 years with autism and 45 age- and gender-matched typically developing children. Machine learning analysis of the children’s motor patterns identified autism with up to 93% accuracy. Analysis revealed that these patterns consisted of greater forces at contact, a different distribution of forces within a gesture, and faster, larger gesture kinematics with more distal use of space. These data support the notion that disruption to movement is a core feature of autism, and demonstrate that autism can be computationally assessed through fun, smart device gameplay.
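A minimal sketch of the kind of supervised-learning pipeline the abstract describes (classifying children from tablet gesture features) is shown below. The feature definitions, data, and model choice here are illustrative assumptions, not the authors' actual method or dataset.

```python
# Illustrative sketch only: a generic classifier trained on hypothetical
# gesture-derived features (e.g. mean contact force, within-gesture force
# distribution, gesture speed, gesture extent). Not the authors' pipeline.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)

# Hypothetical feature matrix: one row per child, one column per gesture feature.
n_children, n_features = 82, 4          # 37 autistic + 45 typically developing
X = rng.normal(size=(n_children, n_features))
y = np.array([1] * 37 + [0] * 45)       # 1 = autism, 0 = typical development

clf = RandomForestClassifier(n_estimators=200, random_state=0)
cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
scores = cross_val_score(clf, X, y, cv=cv, scoring="accuracy")
print(f"Cross-validated accuracy: {scores.mean():.2f} ± {scores.std():.2f}")
```

With real gesture features in place of the random placeholders, the same cross-validation loop would yield the kind of classification accuracy the abstract reports.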
Recently, Windey, Gevers, and Cleeremans (2013) proposed a level of processing (LoP) hypothesis claiming that the transition from unconscious to conscious perception is influenced by the level of processing imposed by task requirements. Here, we carried out two experiments to test the LoP hypothesis. In both, participants were asked to classify briefly presented pairs of letters as same or different, based either on the letters' physical features (a low-level task), or on a semantic rule (a high-level task). Stimulus awareness was measured by means of the four-point Perceptual Awareness Scale (PAS). The results showed that low or moderate stimulus visibility was reported more frequently in the low-level task than in the high-level task, suggesting that the transition from unconscious to conscious perception is more gradual in the former than in the latter. Therefore, although alternative interpretations remain possible, the results of the present study fully support the LoP hypothesis.
This article discusses how the analysis of interactions between action and awareness allows us to better understand the mechanisms of visual awareness. We argue that action is one of several factors that influence visual awareness and we provide a number of examples. We also discuss the possible mechanisms that underlie these influences on both the cognitive and the neural levels. We propose that action affects visual awareness for the following reasons: (1) it serves as additional information in the process of evidence accumulation; (2) it restricts the number of alternatives in the decisional process; (3) it enables error detection and performance monitoring; and (4) it triggers attentional mechanisms that modify stimulus perception. We also discuss the possible neuronal mechanisms of the aforementioned effects, including feedback-dependent prefrontal cortex modulation of the activity of visual areas, error-based modulation, interhemispheric inhibition of motor cortices, and attentional modulation of visual cortex activity triggered by motor processing.