Hand gesture recognition based on micro-Doppler (MD) radar has garnered considerable attention from researchers as a potential method for human–computer interaction (HCI). However, two significant challenges remain: the pronounced variation in MD map size and the large amount of redundant information contained in the MD maps. Here, a multi-scale graph-based hand gesture recognition framework is proposed. First, an MD graph representation method is developed that adapts to arbitrarily sized frames and maps the key features in a sparse manner. Then, the multi-scale information in the MD graph is fully extracted for hand gesture recognition. Experimental results show that the proposed framework achieves a state-of-the-art accuracy of 96.73% on the four-class radar hand gesture dataset while removing up to 99% of the redundant information in the MD maps. The framework requires only a small amount of memory yet retains strong hand gesture recognition capability, demonstrating its high potential in the HCI field.
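As an illustrative sketch only (the paper's exact construction is not described in this abstract), one plausible way to obtain a sparse, size-independent graph from an MD map is to keep only the highest-energy time–frequency bins as nodes and connect each node to its nearest neighbours in normalised coordinates. All names and parameters below (`md_map_to_graph`, `energy_keep_ratio`, `k_neighbors`) are assumptions introduced for illustration, not the authors' implementation.

```python
import numpy as np


def md_map_to_graph(md_map, energy_keep_ratio=0.01, k_neighbors=8):
    """Hypothetical sketch: convert a micro-Doppler (time x Doppler) map of
    any size into a sparse graph by keeping only the strongest bins.

    md_map            : 2-D array of MD magnitudes, arbitrary shape.
    energy_keep_ratio : fraction of bins retained as graph nodes
                        (~1% mirrors the ~99% redundancy reduction quoted
                        in the abstract; this mapping is an assumption).
    k_neighbors       : each node is linked to its k nearest nodes in
                        normalised (time, Doppler) coordinates.
    Returns (node_features, edge_index).
    """
    n_time, n_doppler = md_map.shape
    n_keep = max(1, int(energy_keep_ratio * md_map.size))

    # Select the highest-energy time-frequency bins as graph nodes.
    flat_idx = np.argsort(md_map, axis=None)[-n_keep:]
    t_idx, d_idx = np.unravel_index(flat_idx, md_map.shape)

    # Node features: normalised position + magnitude, so the graph does not
    # depend on the original frame size.
    coords = np.stack([t_idx / n_time, d_idx / n_doppler], axis=1)
    feats = np.concatenate([coords, md_map[t_idx, d_idx][:, None]], axis=1)

    # k-nearest-neighbour edges in the normalised coordinate space.
    dists = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    np.fill_diagonal(dists, np.inf)
    k = min(k_neighbors, n_keep - 1)
    nn = np.argsort(dists, axis=1)[:, :k]
    src = np.repeat(np.arange(n_keep), k)
    dst = nn.reshape(-1)
    edge_index = np.stack([src, dst])

    return feats, edge_index


# Example: two differently sized frames yield graphs of comparable size.
for shape in [(64, 128), (96, 256)]:
    rng = np.random.default_rng(0)
    graph_feats, edges = md_map_to_graph(np.abs(rng.normal(size=shape)))
    print(shape, "->", graph_feats.shape[0], "nodes,", edges.shape[1], "edges")
```

In the proposed framework such a graph would subsequently be processed by a multi-scale graph model for classification; the sketch above only illustrates the sparse, arbitrary-size graph construction step.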