Despite immense variability across languages, people can learn to understand any human language, spoken or signed. What neural mechanisms allow people to comprehend language across sensory modalities? When people listen to speech, electrophysiological oscillations in auditory cortex entrain to slow (<8 Hz) fluctuations in the acoustic envelope. Entrainment to the speech envelope may reflect mechanisms specialized for auditory perception. Alternatively, flexible entrainment may be a general-purpose cortical mechanism that optimizes sensitivity to rhythmic information regardless of modality. Here, we test these proposals by examining cortical coherence to visual information in sign language. First, we develop a metric to quantify visual change over time. We find quasiperiodic fluctuations in sign language, characterized by lower frequencies than fluctuations in speech. Next, we test for entrainment of neural oscillations to visual change in sign language, using electroencephalography (EEG) in fluent signers of American Sign Language (ASL) as they watch videos in ASL. We find significant cortical entrainment to visual oscillations in sign language <5 Hz, peaking at ∼1 Hz. Coherence to sign is strongest over occipital and parietal cortex, in contrast to speech, where coherence is strongest over auditory cortex. Nonsigners also show coherence to sign language, but entrainment at frontal sites is reduced relative to fluent signers. These results demonstrate that flexible cortical entrainment to language does not depend on neural processes that are specific to auditory speech perception. Low-frequency oscillatory entrainment may reflect a general cortical mechanism that maximizes sensitivity to informational peaks in time-varying signals.

sign language | cortical entrainment | oscillations | EEG

Languages differ dramatically from one another, yet people can learn to understand any natural language. What neural mechanisms allow humans to understand the vast diversity of languages and to distinguish linguistic signal from noise? One mechanism that has been implicated in language comprehension is neural entrainment to the volume envelope of speech. The volume envelope of speech fluctuates at low frequencies (<8 Hz), decreasing at boundaries between syllables, words, and phrases. When people listen to speech, neural oscillations in the delta (1-4 Hz) and theta (4-8 Hz) bands become entrained to these fluctuations in volume (1-4).

Entrainment to the volume envelope may represent an active neural mechanism to boost perceptual sensitivity to rhythmic stimuli (2, 5-7). Although entrainment is partly driven by bottom-up features of the stimulus (8-10), it also depends on top-down signals to auditory cortex from other brain areas. Auditory entrainment is strengthened when people see congruent visual and auditory information (11, 12) and is modulated by attention (13) and by top-down signals from frontal cortex (4, 14).

Cortical entrainment is proposed to play a key role in speech comprehension, such as segmenting out sylla...