Current theories of lexical recognition are based mostly on studies of spoken languages or their written forms; much less is known about lexical recognition in sign languages. This study examines the neural correlates of sign recognition by investigating the effects of lexical frequency, length, phonological neighborhood density, and iconicity during Chinese Sign Language comprehension. Twenty-two deaf signers viewed sign videos that varied in these four lexical properties and decided whether each referred to an animal, while event-related potential (ERP) responses were recorded. Data were analyzed with linear mixed-effects models, with the lexical variables treated as continuous measures. Frequency modulated ERP amplitude as early as around 200 ms and again in the late N400 time window. Sign length evoked effects throughout processing, starting from 200 ms and persisting into the last epoch. Neighborhood density effects were likewise observed early, around 200 ms, and later on the N400 and the late positive complex (LPC). Iconicity produced robust effects on N400 and LPC amplitude. Lexical frequency, length, and neighborhood density thus influence the neural dynamics of sign recognition much as they do for spoken words, while iconicity confers a processing advantage through closer form-meaning mappings. These results indicate that lexical recognition engages mechanisms that are universal across the signed and spoken modalities but is also regulated by modality-specific properties such as the pervasive iconicity of sign languages.
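The analysis described above can be illustrated with a minimal sketch of a linear mixed-effects model in which trial-level ERP amplitude is regressed on continuous lexical predictors with a by-signer random intercept. This is a hypothetical reconstruction using statsmodels and simulated data; the column names, effect sizes, and model structure are illustrative assumptions, not the study's actual specification.

```python
# Hypothetical sketch of an LMM on ERP amplitude with continuous lexical
# predictors (frequency, length, density, iconicity) and random intercepts
# per signer. Data are simulated; only the modeling pattern is the point.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subjects, n_items = 22, 40  # 22 signers, as in the study; 40 items assumed

df = pd.DataFrame({
    "subject": np.repeat(np.arange(n_subjects), n_items),
    "frequency": rng.normal(size=n_subjects * n_items),
    "length": rng.normal(size=n_subjects * n_items),
    "density": rng.normal(size=n_subjects * n_items),
    "iconicity": rng.normal(size=n_subjects * n_items),
})
# Simulated mean N400-window amplitude with a built-in iconicity effect
df["amplitude"] = -2.0 + 0.5 * df["iconicity"] + rng.normal(
    scale=1.0, size=len(df)
)

# Fixed effects for the four lexical variables; random intercept by subject
model = smf.mixedlm(
    "amplitude ~ frequency + length + density + iconicity",
    data=df,
    groups=df["subject"],
)
result = model.fit()
iconicity_coef = result.params["iconicity"]
print(f"estimated iconicity effect: {iconicity_coef:.3f}")
```

A fuller analysis would typically also include crossed random effects for items and random slopes where they converge; the single random-intercept form above is kept deliberately simple.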