In contrast to simple structures in animal vocal behavior, hierarchical structures such as center-embedded sentences manifest the core computational faculty of human language. Previous artificial grammar learning studies found that the left pars opercularis (LPO) subserves the processing of hierarchical structures. However, it is not clear whether this area is activated by the structural complexity per se or by the increased memory load entailed in processing hierarchical structures. To dissociate the effect of structural complexity from the effect of memory cost, we conducted a functional magnetic resonance imaging study of German sentence processing with a 2-way factorial design tapping structural complexity (with/without hierarchical structure, i.e., center-embedding of clauses) and working memory load (long/short distance between syntactically dependent elements, i.e., subject nouns and their respective verbs). Functional imaging data revealed that the processes for structure and memory operate separately but cooperatively in the left inferior frontal gyrus: activities in the LPO increased as a function of structural complexity, whereas activities in the left inferior frontal sulcus (LIFS) were modulated by the distance over which the syntactic information had to be transferred. Diffusion tensor imaging showed that these 2 regions are interconnected through white matter fibers. Moreover, functional coupling between the 2 regions was found to increase during the processing of complex, hierarchically structured sentences. These results suggest a neuroanatomical segregation of syntax-related aspects, represented in the LPO, from memory-related aspects, reflected in the LIFS, which are, however, highly interconnected functionally and anatomically.

DTI | fMRI | hierarchical structure

Language appears to be a trait specific to humans, at least in its core computational component, that is, grammar.
Defining language as a sequence of symbols, Chomsky (1) proposed a hierarchy of grammars as language production mechanisms with increasing generative power. The lowest-level grammar is finite state grammar (FSG). FSG can be fully specified by transition probabilities between a finite number of states (e.g., words), but it is not powerful enough to generate the structures of natural human languages. Phrase structure grammar (PSG) has more generative power than FSG. A key difference between FSG and PSG is that only PSG can generate the sequence A^n B^n, where A and B denote symbols and n the number of repetitions. The ability to process the sequence A^n B^n is crucial for the processing of center-embedded sentences, such as "The man the boy the dog bit greeted is my friend," where the subjects (i.e., the man, the boy, and the dog) are A-symbols and the verbs (bit, greeted, and is) are B-symbols. Surprisingly, tests on monkeys (2) and on songbirds (3) showed that whereas songbirds can process A^n B^n sequences, monkeys cannot. However, even if the birds could correctly discriminate A^n B^n sequences from A^n B^m (4 > n, m > 0, n ≠ m), ...
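The counting property that separates FSG from PSG can be illustrated with a minimal Python sketch (an illustrative example, not part of the original study): a finite-state pattern can only verify the overall shape A…B…, whereas recognizing A^n B^n additionally requires comparing the counts of A's and B's, which is beyond finite-state power.

```python
import re

def fsg_accepts(s):
    # A finite-state pattern checks only the *shape* A...B...:
    # it accepts any run of A's followed by any run of B's,
    # with no way to require that the two runs have equal length.
    return re.fullmatch(r"A+B+", s) is not None

def psg_accepts(s):
    # Recognizing A^n B^n needs counting (one level above finite state):
    # match the shape, then require exactly as many B's as A's.
    m = re.fullmatch(r"(A+)(B+)", s)
    return m is not None and len(m.group(1)) == len(m.group(2))

print(fsg_accepts("AABBB"))   # True  - shape matches, counts ignored
print(psg_accepts("AABBB"))   # False - 2 A's vs. 3 B's (an A^n B^m string)
print(psg_accepts("AAABBB"))  # True  - n = m = 3
```

Strings like "AABBB" are exactly the A^n B^m foils (n ≠ m) used in the animal discrimination studies: a finite-state matcher cannot tell them apart from well-formed A^n B^n strings.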