Hand gestures are an essential way for humans to convey information and express intuitive intention. Their high distinctiveness, substantial flexibility, and robust information transmission have made hand gesture recognition (HGR) a research hotspot in human–human, human–computer, and human–machine interaction. Noninvasive, on‐body sensors can monitor, track, and recognize hand gestures for applications such as sign language recognition, rehabilitation, myoelectric control of prosthetic hands, and human–machine interfaces (HMIs). This article systematically reviews recent achievements in noninvasive upper‐limb sensing techniques for HGR, multimodal sensor fusion for gaining additional user information, and wearable gesture recognition algorithms for more reliable and robust performance. Research challenges, progress, and emerging opportunities for sensor‐based HGR systems are also analyzed to provide perspectives for future research.