We examined the maturation of speech-sound-related indices of auditory event-related brain potentials (ERPs). ERPs were elicited by syllables and their nonphonetic correlates in children and adults. Compared with syllables, nonphonetic stimuli elicited larger N1 and P2 peaks in adults and a larger P1 peak in children. Because the nonphonetic stimuli were more perceptually salient, this N1 effect was consistent with the known sensitivity of N1 to sound-onset features. Based on its stimulus dependence and independent component structure, the children's P1 appeared to contain overlapping P2-like activity. In both groups, syllables elicited larger N2/N4 peaks, which might reflect processing of sound content features, more extensive for speech than for nonspeech sounds. Therefore, sound detection mechanisms (N1, P2) are still developing during mid-childhood, whereas sound content processing (N2, N4) is largely mature; in both children and adults, speech sounds are processed more extensively than nonspeech sounds at 200-400 ms poststimulus.
Objective: Event-related brain potentials (ERPs) may provide tools for examining normal and abnormal language development. To clarify the functional significance of auditory ERPs, we examined ERP indices of spectral differences in speech and non-speech sounds.

Methods: Three Spectral Items (BA, DA, GA) were presented as three Stimulus Types: syllables, non-phonetic correlates, and consonant-vowel transitions (CVTs). Fourteen 7-10-year-old children and 14 adults were presented with equiprobable Spectral Item sequences blocked by Stimulus Type.

Results: The Spectral Item effect appeared as amplitude variations in the P1, P2, N2, and N4 peaks. The P2 was sensitive to all Stimulus Types in both groups. In adults, the P1 was also sensitive to transitions, while the N4 was sensitive to syllables. In children, only the 50-ms CVT stimuli elicited N2 and N4 spectral effects. Non-phonetic stimuli elicited larger N1-P2 amplitudes, while speech stimuli elicited larger N2-N4 amplitudes.

Conclusions: Auditory feature processing is reflected by the P1-P2 and N2-N4 peaks and matures earlier than supra-sensory integrative mechanisms, which are reflected by the N1-P2 peaks. The auditory P2 appears to pertain to both processing types.

Significance: These results delineate an orderly processing organization whereby direct feature mapping occurs earlier in processing and, in part, serves sound detection, whereas relational mapping occurs later in processing and serves sound identification.