Learning and processing natural language requires the ability to track syntactic relationships between words and phrases in a sentence, which are often separated by intervening material. These nonadjacent dependencies can be studied using artificial grammar learning paradigms and structured sequence processing tasks. These approaches have been used to demonstrate that human adults, infants, and some nonhuman animals are able to detect and learn dependencies between nonadjacent elements within a sequence. However, learning nonadjacent dependencies appears to be more cognitively demanding than detecting dependencies between adjacent elements, and it occurs only in certain circumstances. In this review, we discuss different types of nonadjacent dependencies in language and in artificial grammar learning experiments, and how these differences might affect learning. We summarize the types of perceptual cues that facilitate learning by highlighting the relationship between dependent elements, effectively bringing them closer together physically, attentionally, or perceptually. Finally, we review artificial grammar learning experiments in human adults, infants, and nonhuman animals, and discuss how the similarities and differences observed across these groups can provide insights into how language is learned across development and how these language‐related abilities might have evolved.
From the early stages of life, learning the regularities associated with specific objects is crucial for making sense of experience. Through filial imprinting, young precocial birds quickly learn the features of their social partners by mere exposure. It is unclear, though, to what extent chicks can extract abstract patterns from the visual and acoustic stimuli present in the imprinting object, and how they combine them. To investigate this issue, we exposed chicks (Gallus gallus) to three days of visual and acoustic imprinting, using either patterns with two identical items or patterns with two different items, presented visually, acoustically, or in both modalities. Next, chicks were given a choice between the familiar and the unfamiliar pattern, presented in either the multimodal, visual, or acoustic modality. The responses to the novel stimuli were affected by the chicks' imprinting experience, and the effect was stronger for chicks imprinted with multimodal patterns than for the other groups. Interestingly, males and females adopted different strategies, with males more attracted to unfamiliar patterns and females more attracted to familiar patterns. Our data show that chicks can generalize abstract patterns by mere exposure through filial imprinting and that multimodal stimulation is more effective than unimodal stimulation for pattern learning.
The ability to abstract a regularity that underlies strings of sounds is a core mechanism of the language faculty but might not be specific to language learning or even to humans. It is unclear whether and to what extent nonhuman animals possess the ability to abstract regularities defining the relation among arbitrary auditory items in a string and to generalize this abstraction to strings of acoustically novel items. In this study we tested these abilities in a songbird (zebra finch) and a parrot species (budgerigar). Subjects were trained in a go/no-go design to discriminate between two sets of sound strings arranged in an XYX or an XXY structure. After this discrimination was acquired, each subject was tested with test strings that were structurally identical to the training strings but consisted of either new combinations of known elements or of novel elements belonging to other element categories. Both species learned to discriminate between the two stimulus sets. However, their responses to the test strings were strikingly different. Zebra finches categorized test stimuli with previously heard elements by the ordinal position that these elements occupied in the training strings, independent of string structure. In contrast, the budgerigars categorized both novel combinations of familiar elements and strings consisting of novel element types by their underlying structure. They thus abstracted the relation among items in the XYX and XXY structures, an ability similar to that shown by human infants and indicating a level of abstraction comparable to analogical reasoning.

Keywords: artificial grammar learning | rule learning | auditory perception | songbirds | parrots

One of the critical features of language learning is the ability to abstract the grammatical structure from spoken language. Such abstraction allows humans to learn about regularities in their native language and to generalize these regularities to novel input.
This ability is examined in a standardized way in artificial grammar learning experiments, in which humans are exposed to strings of meaningless sounds (e.g., arbitrary speech syllables) organized according to a specific grammatical structure. Several studies have shown that the ability to abstract the underlying structure from such stimuli is present in young infants (1-5) in both the acoustic and the visual domain (6-8). This domain generality and its presence at a very early age have given rise to the notion that this cognitive ability may have preceded language evolution and served as a basis for present-day linguistic complexity. If so, this raises the question of whether the ability is confined to humans or can also be found in nonhuman animals. In this context, comparative studies on nonhuman animals are needed to reveal the level of abstraction they are able to achieve in artificial grammar learning tasks. This information might provide hypotheses about how and why the more complex human grammatical competences have arisen. The current study addresses whether two bird species, the...
Variation in pitch, amplitude, and rhythm adds crucial paralinguistic information to human speech. Such prosodic cues can reveal information about the meaning or emphasis of a sentence or the emotional state of the speaker. To examine the hypothesis that sensitivity to prosodic cues is language independent and not human specific, we tested prosody perception in a controlled experiment with zebra finches. Using a go/no-go procedure, subjects were trained to discriminate between speech syllables arranged in XYXY patterns with prosodic stress on the first syllable and XXYY patterns with prosodic stress on the final syllable. To systematically determine the salience of the various prosodic cues (pitch, duration, and amplitude) to the zebra finches, the birds were subjected to five tests with different combinations of these cues. The zebra finches generalized the prosodic pattern to sequences consisting of new syllables and relied on prosodic features over structural ones to discriminate between stimuli. This strong sensitivity to the prosodic pattern was maintained when only a single prosodic cue was available, and a change in pitch was treated as more salient than changes in the other prosodic features. These results show that zebra finches are sensitive to the same prosodic cues known to affect human speech perception.