Understanding the human capacity to acquire language is an outstanding scientific challenge. Somehow our language capacities arise from the way the human brain processes, develops and learns in interaction with its environment. To set the stage, we begin with a summary of what is known about the neural organization of language and of what our artificial grammar learning (AGL) studies have revealed. We then review the Chomsky hierarchy in the context of the theory of computation and formal learning theory. Finally, we outline a neurobiological model of language acquisition and processing based on an adaptive, recurrent, spiking network architecture. This architecture implements an asynchronous, event-driven, parallel system for recursive processing. We conclude that the brain represents grammars (or, more precisely, the parser/generator) in its connectivity, and that its capacity for syntax is based on neurobiological infrastructure for structured sequence processing. The acquisition of this capacity is accounted for in an adaptive dynamical systems framework. Artificial language learning (ALL) paradigms might be used to study the acquisition process within such a framework, as well as the processing properties of the underlying neurobiological infrastructure. However, the interpretation of ALL results needs to be combined with, and constrained by, theoretical models and empirical studies of natural language processing. Given that the faculty of language is captured by classical computational models to a significant extent, and that these can be embedded in dynamic network architectures, there is hope that significant progress can be made in understanding the neurobiology of the language faculty.

Keywords: implicit artificial grammar learning; fMRI; repeated transcranial magnetic stimulation; language-related genes; CNTNAP2; spiking neural networks
INTRODUCTION

Recent years have seen a renewed interest in using artificial grammar learning (AGL) as a window onto the organization of the language system. It has been exploited not only in cross-species comparisons but also in studies on the neural architecture for language. Our focus is on the role AGL can play in unravelling the neural basis of human language. For this purpose, its role is relatively limited and mainly restricted to modelling aspects of structured sequence learning and processing, uncontaminated by the semantic and phonological sources of information that co-determine the production and comprehension of natural language. Before going into detail about the neurobiology of syntax and the role of AGL research, we outline what we consider the major conclusions from research on the neurobiology of language: