As in many areas of science, infant research suffers from low statistical power. The problem is compounded in infant research by the difficulty of recruiting and testing large numbers of infant participants. As illustrated by this special section, researchers have been searching for solutions and have focused on getting the most out of infant data. We illustrate one solution, showing how power in visual preference tasks can be increased by obtaining more data from each infant. We discuss issues of power and present work examining how, under some circumstances, power is increased by increasing the precision of measurement. We report the results of a series of simulations, based on a sample of visual preference task data collected from three infant laboratories, showing how more powerful research designs can be achieved by including more trials per infant. Implications for infant procedures in general are discussed.
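The logic of the abstract above can be sketched with a minimal Monte Carlo power simulation. This is an illustrative sketch, not the authors' actual simulation code: it assumes each infant's observed preference score is the mean of noisy trial-level scores, omits between-infant variance for brevity, and uses a one-tailed z criterion. All function and parameter names are hypothetical.

```python
import random
import statistics

def simulate_power(n_infants, n_trials, true_pref=0.55,
                   trial_sd=0.15, n_sims=500, seed=1):
    """Estimate power to detect a preference above 0.5.

    Each infant's score is the mean of `n_trials` noisy trial-level
    preferences, so more trials per infant shrink measurement error
    and raise the precision of each infant's score.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_sims):
        scores = []
        for _ in range(n_infants):
            trials = [rng.gauss(true_pref, trial_sd) for _ in range(n_trials)]
            scores.append(sum(trials) / n_trials)
        mean = statistics.mean(scores)
        se = statistics.stdev(scores) / n_infants ** 0.5
        if (mean - 0.5) / se > 1.645:  # one-tailed test at alpha = .05
            hits += 1
    return hits / n_sims
```

Running this with, say, 16 infants shows the pattern the abstract describes: holding sample size fixed, estimated power rises as the number of trials per infant grows, because trial averaging reduces measurement noise in each infant's score.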
Many aspects of infant development are assessed using infant looking times to visual and audiovisual stimuli. In this article, we describe a stand-alone software package that allows simultaneous stimulus presentation to infants and recording of their looking times via a keypress by a human observer. The software was developed to run both on 64-bit Intel-based Macs running OS X 10.10 (Yosemite) or later and on 64-bit Windows 7 and 10. It can present a variety of visual and/or auditory stimuli; is customizable with respect to how trials are initiated, how trial lengths are defined, and the phases of the experiment; and can be used to record looking times online or after the fact, as well as to assess the reliability of coding. The software is freely available at http://habit.ucdavis.edu.
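The core of keypress-based looking-time coding, as described above, can be sketched as a small event accumulator. This is a hypothetical illustration of the general technique, not the Habit software's actual interface; the class and method names are assumptions.

```python
from typing import Optional

class LookTimer:
    """Accumulates total looking time within a trial from observer
    key-down / key-up events (illustrative sketch, not Habit's API)."""

    def __init__(self) -> None:
        self.total_ms: float = 0.0
        self._look_start: Optional[float] = None

    def key_down(self, t_ms: float) -> None:
        # Observer presses the key when the infant starts looking.
        if self._look_start is None:
            self._look_start = t_ms

    def key_up(self, t_ms: float) -> None:
        # Observer releases the key when the infant looks away;
        # the elapsed interval is added to the running total.
        if self._look_start is not None:
            self.total_ms += t_ms - self._look_start
            self._look_start = None
```

The same event stream can be recorded with timestamps and replayed later, which is how after-the-fact coding and reliability checks between two coders become possible.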
Domain-general theories of autism rest on evidence that the disorder impacts not only social communication skills but also nonsocial functions such as memory. Yet recognition memory deficits have been inconsistently documented, especially for stimuli other than faces and sentences. Here we tested school-age children with high-functioning autism (ASD), and IQ- and age-matched comparison children, on a visual long-term memory task involving more than 100 photographs of objects, faces, cats, houses, and abstract stimuli. Children viewed each photograph for 2 s. After a 10-min filled delay, we assessed recognition memory for object category as well as for specific exemplars. Data supported the presence of a high-capacity and high-precision visual memory in children with ASD. Both category memory and exemplar memory accuracies were above 90% for categories for which a single exemplar had been encoded. When more exemplars per category were encoded, category memory improved, but exemplar memory declined. An exception was face memory, which remained highly accurate even after many faces had been encoded. Our study provided no evidence that visual memory in general, and face memory in particular, is impaired in children with ASD.
We investigated limitations in young infants’ visual short‐term memory (VSTM). We used a one‐shot change detection task to ask whether 4‐ and 8.5‐month‐old infants (N = 59) automatically encode fixated items in VSTM. Each trial consisted of the following sequence: first, a brief (500 ms) presentation of a sample array of two items; next, a brief (300 ms) delay period with a blank screen; and finally, a test array (2,000 ms) identical to the sample array except that the color of one of the two items was changed. In Experiment 1, we induced infants to fixate one item by rotating it during the sample (the other item remained stationary). In Experiment 2, none of the items rotated. In both experiments, 4‐month‐old infants looked equally at the fixated item whether or not it changed color, providing no evidence that they had encoded the fixated item in VSTM. In contrast, 8.5‐month‐old infants in Experiment 1 preferred the fixated item when it changed color from sample to test. Thus, 4‐month‐old infants do not appear to automatically encode fixated items in VSTM.
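The trial structure described in this abstract can be summarized as a simple sequence of timed phases. This is a sketch for clarity only; the phase names are illustrative, and the durations are taken directly from the text above.

```python
# One-shot change detection trial, as described in the abstract:
# sample array -> blank delay -> test array with one color changed.
TRIAL_PHASES = [
    ("sample", 500),   # two-item sample array, ms
    ("delay", 300),    # blank screen, ms
    ("test", 2000),    # same array, one item's color changed, ms
]

def total_trial_ms(phases):
    """Total duration of a single trial in milliseconds."""
    return sum(duration for _, duration in phases)
```

Summing the phases gives a 2,800 ms trial, which is why the paradigm is called "one-shot": each trial offers only a single brief encoding opportunity.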