Categorical judgments of otherwise identical phonemes are biased toward hearing words (i.e., the "Ganong effect"), suggesting that lexical context influences the perception of even basic speech primitives. Lexical biasing could manifest via late-stage, post-perceptual mechanisms related to decision processes or, alternatively, via top-down linguistic inference that acts on early perceptual coding. Here, we exploited the temporal sensitivity of EEG to resolve the spatiotemporal dynamics of these context-related influences on speech categorization. Listeners rapidly classified sounds from a /gi/ - /ki/ gradient presented in opposing word-nonword contexts (GIFT-kift vs. giss-KISS), designed to bias perception toward lexical items. Phonetic perception shifted in the direction of words, establishing a robust Ganong effect behaviorally. ERPs revealed a neural analog of lexical biasing emerging within ~200 ms. Source analyses uncovered a distributed neural network supporting the Ganong effect, including middle temporal gyrus (MTG), inferior parietal lobe (IPL), and middle frontal cortex. Yet, among Ganong-sensitive regions, only left MTG and IPL predicted behavioral susceptibility to lexical influence. Our findings confirm that lexical status rapidly constrains sub-lexical categorical representations for speech within several hundred milliseconds, but likely does so outside the purview of canonical "auditory-linguistic" brain areas.