Sign languages used by deaf communities around the world possess the same structural and organizational properties as spoken languages: in particular, they are richly expressive and also tightly grammatically constrained. They therefore offer the opportunity to investigate the extent to which the neural organization for language is modality independent, as well as to identify ways in which modality influences this organization. The fact that sign languages share the visual-manual modality with a nonlinguistic symbolic communicative system, gesture, further allows us to investigate where the boundaries lie between language and symbolic communication more generally. In the present study, we had three goals: to investigate the neural processing of linguistic structure in American Sign Language (using verbs of motion classifier constructions, which may lie at the boundary between language and gesture); to determine whether we could dissociate the brain systems involved in deriving meaning from symbolic communication (including both language and gesture) from those specifically engaged by linguistically structured content (sign language); and to assess whether sign language experience influences the neural systems used for understanding nonlinguistic gesture. The results demonstrated that even sign language constructions that appear on the surface to be similar to gesture are processed within the left-lateralized frontal-temporal network used for spoken languages, supporting claims that these constructions are linguistically structured. Moreover, although nonsigners engage regions involved in human action perception to process communicative, symbolic gestures, signers instead engage parts of the language-processing network, demonstrating an influence of experience on the perception of nonlinguistic stimuli.

Sign languages such as American Sign Language (ASL) are natural human languages with linguistic structure.
Signed and spoken languages also largely share the same neural substrates, including the left-hemisphere dominance revealed by brain injury and neuroimaging. At the same time, sign languages provide a unique opportunity to explore the boundaries of what, exactly, "language" is. Speech-accompanying gesture is universal (1), yet such gestures are not language: they do not have a set of structural components or combinatorial rules and cannot be used on their own to reliably convey information. Thus, gesture and sign language are qualitatively different, yet both convey symbolic meaning via the hands. Comparing them can help identify the boundaries between language and nonlinguistic symbolic communication.

Despite this apparently clear distinction between sign language and gesture, some researchers have emphasized their apparent similarities. One construction that has been a focus of contention is the "classifier construction" (also called a "verb of motion"). In ASL, a verb of motion (e.g., moving in a circle) will include a root expressing the motion event, morphemes marking the manner and direction of motion (e.g., forward o...