Lexical retrieval requires selecting and retrieving the most appropriate word from the lexicon to express a desired concept. Prior studies investigating the neuroanatomic underpinnings of lexical retrieval have used lesion models that rely on stereotyped vascular distributions, functional neuroimaging methods that lack causal certainty, or awake brain mapping that is typically limited to narrow cortical exposures. Further, few studies have probed lexical retrieval with tasks other than picture naming, and when non-picture-naming lexical retrieval tasks have been applied, both convergent and divergent models have emerged. Given this controversy, we tested the hypothesis that the cortical and subcortical brain regions specifically involved in lexical retrieval in response to visual and auditory stimuli represent overlapping neural systems. Fifty-three patients with dysnomic aphasia due to dominant-hemisphere brain tumors performed four language tasks: picture naming, auditory naming, text reading, and describing line drawings with correct syntax. A subset of participants also underwent the Quick Aphasia Battery, which provides a validated measure of lexical retrieval via its word-finding subtest. Generalized linear modeling and principal components analysis revealed multicollinearity among picture naming, auditory naming, and word finding, implying redundancy among these linguistic measures. Support vector regression lesion-symptom mapping across participants was used to model accuracy on each of the four language tasks. Picture naming and auditory naming survived cluster-level correction. Specifically, lesions within overlapping clusters of 8,333 voxels and 21,512 voxels in the left lateral prefrontal cortex (PFC) were predictive of impaired picture naming and auditory naming, respectively. These data indicate a convergence of heteromodal lexical retrieval within the PFC.
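
The sketch below illustrates, on synthetic data, the two analysis ideas named above: a principal components check for multicollinearity among the behavioral measures, and support vector regression lesion-symptom mapping with a linear kernel whose voxelwise weights are thresholded by permutation. It is an illustrative sketch only, assuming scikit-learn; the toy lesion matrix, variable names, kernel and regularization choices, and the small permutation count are assumptions and do not reproduce the authors' actual pipeline or correction procedure.

```python
# Illustrative sketch (synthetic data); not the authors' actual analysis pipeline.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n_patients, n_voxels = 53, 500           # toy dimensions; real lesion maps have ~10^5 voxels

# Synthetic binary lesion maps (1 = voxel lesioned) and three correlated behavioral scores.
lesions = rng.binomial(1, 0.1, size=(n_patients, n_voxels)).astype(float)
latent = lesions[:, :50].mean(axis=1)    # shared "lexical retrieval" signal carried by 50 voxels
picture_naming = 1 - latent + 0.05 * rng.standard_normal(n_patients)
auditory_naming = 1 - latent + 0.05 * rng.standard_normal(n_patients)
word_finding = 1 - latent + 0.05 * rng.standard_normal(n_patients)
scores = np.column_stack([picture_naming, auditory_naming, word_finding])

# Multicollinearity check via PCA: if one component explains most of the variance,
# the three measures are largely redundant.
pca = PCA().fit(scores)
print("variance explained per component:", pca.explained_variance_ratio_.round(3))

# SVR lesion-symptom mapping for one task: lesion status at every voxel predicts
# task accuracy; the fitted linear-kernel weights give a voxelwise importance map.
svr = SVR(kernel="linear", C=1.0).fit(lesions, picture_naming)
weights = svr.coef_.ravel()

# Crude permutation test: refit with shuffled scores to build a null distribution
# of the maximum absolute weight (family-wise control at the voxel level).
n_perm = 100                             # far fewer permutations than a real analysis would use
null_max = np.empty(n_perm)
for i in range(n_perm):
    w = SVR(kernel="linear", C=1.0).fit(lesions, rng.permutation(picture_naming)).coef_.ravel()
    null_max[i] = np.abs(w).max()
threshold = np.quantile(null_max, 0.95)
print("voxels surviving permutation threshold:", int((np.abs(weights) > threshold).sum()))
```

A real application would replace the synthetic matrix with registered patient lesion maps, use thousands of permutations, and apply cluster-level rather than voxelwise correction, but the structure of the computation is the same.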