In nature, intelligent beings use information from the various senses for identification, recognition and decision-making. It is easier to identify an object or a threat by combining auditory, visual and tactile information than by relying on only one of the senses. Combining separate sources of information, which include different types of data, provides the variety needed for cognition and improved situational awareness. This is the rationale behind sensor fusion [1].

For deployment in real-world healthcare settings, whether in private homes, hospitals or care homes, activity-monitoring systems based on a single sensor may not deliver the robust performance required. Radar sensing is an emerging approach in the field of assisted living applications [2,3], but its limitations in identifying precise movements, especially at unfavourable aspect angles, together with its narrow detection area, mean that further co-operative sensing methods are required for robust detection of movements and falls. Sensor fusion is, therefore, one method of mitigating this issue. This chapter will explore different sensor and fusion topologies, using active radar sensing in conjunction with other sensing technologies to support assisted living and healthcare applications [2]. First, the sensors and their outputs will be described, together with the specifics of the signal processing for each sensor and the machine learning used for classification. The results of applying these methods to the assisted living scenario will then be presented.

This chapter will give insight into activity classification with radar and additional sensing technologies, in particular wearable inertial and magnetic sensors, focusing on the key information fusion approaches and the main improvements achieved, using experimental data for validation.