RF sensing offers an unobtrusive, user-friendly, and privacy-preserving method for detecting accidental falls and recognizing human activities. Contemporary RF-based human activity recognition (HAR) systems generally employ a single monostatic radar to recognize human activities. However, a single monostatic radar cannot detect the motion of a target, e.g., a moving person, that is orthogonal to the radar's boresight axis. Owing to this inherent physical limitation, a single monostatic radar cannot reliably recognize human activities independently of the person's orientation. In this work, we present a complementary RF sensing approach that overcomes this limitation of existing single monostatic radar-based HAR systems and robustly recognizes orientation-independent human activities and falls. Our approach uses a distributed mmWave MIMO radar system configured as two separate monostatic radars placed orthogonally to each other in an indoor environment. The two radars illuminate the moving person from two different aspect angles and consequently produce two time-variant micro-Doppler signatures. We first compute the mean Doppler shifts (MDSs) from the micro-Doppler signatures and then extract statistical, time-domain, and frequency-domain features. We apply feature-level fusion to combine the features extracted from the two radars and use a support vector machine to classify orientation-independent human activities. To evaluate our approach, we used an orientation-independent human activity dataset collected from six volunteers, consisting of more than 1350 trials of five different activities performed in different orientations. The proposed complementary RF sensing approach achieved an overall classification accuracy ranging from 98.31% to 98.54%. It overcame the inherent limitation of a conventional single monostatic radar-based HAR system and outperformed it by 6%.
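
The following is a minimal sketch, not the authors' implementation, of the pipeline the abstract describes: per-radar feature extraction from a mean Doppler shift (MDS) signal, feature-level fusion by concatenating the two radars' feature vectors, and classification with a support vector machine. The library choices (NumPy, SciPy, scikit-learn), the specific features, the sampling rate, and the synthetic data are assumptions for illustration only.

```python
import numpy as np
from scipy.stats import skew, kurtosis
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score


def mds_features(mds: np.ndarray, fs: float) -> np.ndarray:
    """Statistical and time-/frequency-domain features of one MDS signal (assumed feature set)."""
    spectrum = np.abs(np.fft.rfft(mds))
    freqs = np.fft.rfftfreq(mds.size, d=1.0 / fs)
    centroid = np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12)
    return np.array([
        mds.mean(), mds.std(), skew(mds), kurtosis(mds),    # statistical
        np.ptp(mds), np.mean(np.abs(np.diff(mds))),         # time-domain
        centroid, spectrum.max(),                           # frequency-domain
    ])


def fuse_features(mds_radar1: np.ndarray, mds_radar2: np.ndarray, fs: float) -> np.ndarray:
    """Feature-level fusion: concatenate the feature vectors of the two orthogonal radars."""
    return np.concatenate([mds_features(mds_radar1, fs),
                           mds_features(mds_radar2, fs)])


# Synthetic stand-in for the real mmWave recordings: 60 trials, 5 activity classes.
rng = np.random.default_rng(0)
fs = 100.0  # assumed MDS sampling rate in Hz
X = np.stack([fuse_features(rng.standard_normal(256), rng.standard_normal(256), fs)
              for _ in range(60)])
y = np.repeat(np.arange(5), 12)  # five activity labels, 12 trials each

# SVM classifier on the fused features, evaluated with 5-fold cross-validation.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
print(cross_val_score(clf, X, y, cv=5).mean())
```

With real MDS signals in place of the random arrays, the same fuse-then-classify structure reproduces the feature-level fusion strategy described above; decision-level fusion would instead train one classifier per radar and combine their outputs.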