The need for Human Activity Recognition (HAR) technologies in home environments is becoming increasingly urgent because of the aging population worldwide. Radar-based HAR typically uses micro-Doppler signatures as one of its main data representations, in conjunction with classification algorithms often inspired by deep learning methods. One limitation of this approach is the challenging classification of movements at unfavorable aspect angles (i.e., close to 90°) and of static postures in between continuous sequences of activities. To address this problem, a hierarchical processing and classification pipeline is proposed to fully exploit the information available from millimeter-wave (mm-wave) 4D imaging radars, specifically the azimuth and elevation information in conjunction with the more conventional range, Doppler, received power, and time features. The proposed pipeline uses the two complementary data representations of Point Cloud (PC) and spectrogram, and its performance is validated on an experimental dataset with 6 activities performed by 8 participants. The results show good performance of the proposed pipeline compared with alternative baseline approaches from the literature, and the effects of key parameters such as the amount of training data, signal-to-noise ratio, and virtual aperture size are investigated. A leave-one-subject-out test is also applied to study the impact of body characteristics on the generalizability of the trained classifiers.
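As a minimal illustration of the leave-one-subject-out protocol mentioned above, the sketch below uses scikit-learn's LeaveOneGroupOut splitter, where each participant is treated as a group that is held out in turn. The feature matrix, labels, participant IDs, and the RandomForestClassifier stand-in are placeholder assumptions for illustration only, not the features or classifiers of the proposed pipeline.

```python
import numpy as np
from sklearn.model_selection import LeaveOneGroupOut
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score

# Hypothetical placeholders: X holds per-sample feature vectors extracted from
# the radar data, y holds activity labels (6 classes), and groups holds the
# participant ID of each sample (8 participants, as in the dataset above).
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 32))
y = rng.integers(0, 6, size=400)
groups = rng.integers(0, 8, size=400)

# Leave-one-subject-out: train on 7 participants, test on the held-out one.
logo = LeaveOneGroupOut()
scores = []
for train_idx, test_idx in logo.split(X, y, groups):
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    scores.append(accuracy_score(y[test_idx], clf.predict(X[test_idx])))

print(f"Mean leave-one-subject-out accuracy: {np.mean(scores):.3f}")
```

Averaging the per-subject accuracies in this way exposes how sensitive a trained classifier is to body characteristics it has never seen during training.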