Automatic analysis of body movement to identify the physical activity of patients confined to bed rest is crucial for treatment and rehabilitation purposes. Existing methods of physical activity analysis have mostly focused on detecting primitive motion/non-motion states in unimodal video data captured by either an RGB, depth, or thermal sensor. In this paper, we propose a multimodal vision-based approach to classify the body motion of a person lying on a bed. We mimicked a realistic 'patient on bed' scenario by recording multimodal video data from healthy volunteers in a hospital room at a neurorehabilitation center. We first defined a taxonomy of possible physical activities based on observations of patients with acquired brain injuries. We then investigated different motion analysis and machine learning approaches to classify physical activities automatically. A multimodal database comprising RGB, depth, and thermal videos was collected and annotated with eight predefined physical activities. Experimental results show that we can achieve moderately high accuracy (77.68%) in classifying physical activities by tracking body motion with an optical flow-based approach. To the best of our knowledge, this is the first multimodal RGBDT video analysis for such an application.