Wearable devices enable remote, long-term, and unobtrusive monitoring of patients in their everyday living and working environments. Remote health monitoring often involves tracking both physical and cardiac activity (exertion) in order to establish correlations between the two. With recent advances in sensor technologies and machine learning, the accuracy with which these activities can be recognized has been steadily improving. In this paper, we apply Convolutional Neural Networks (CNNs) to measurements taken with wireless electrocardiogram (ECG) and inertial sensors for Human Activity Recognition (HAR). Experimental results confirm that our approach recognizes a wide range of everyday activities with a high degree of accuracy: activities such as Jumping, Running, and Sitting are recognized with accuracy exceeding 99%, while Bending Over, Walking, Standing Up, and Climbing Stairs are recognized with accuracy exceeding 90%. Overall, the results suggest that the combined use of inertial sensors and ECG yields better recognition accuracy. In addition, the paper examines the contributions of individual sensors and whether, and to what extent, their placement affects recognition accuracy.
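To illustrate the kind of operation a CNN applies to such multichannel sensor data, the sketch below shows a single 1-D convolutional layer over a window of fused ECG and inertial signals. It is a minimal NumPy illustration, not the paper's actual model; the channel count, window length, filter shapes, and random weights are all hypothetical placeholders for learned parameters.

```python
import numpy as np

def conv1d(window, kernels, bias):
    """Valid 1-D convolution over a multichannel sensor window.

    window : (channels, timesteps) array, e.g. 3 accelerometer axes + 1 ECG lead
    kernels: (filters, channels, width) array standing in for learned filters
    bias   : (filters,) array
    Returns a (filters, timesteps - width + 1) feature map after ReLU.
    """
    f, c, w = kernels.shape
    t = window.shape[1] - w + 1
    out = np.empty((f, t))
    for i in range(t):
        patch = window[:, i:i + w]  # (channels, width) slice of the signal
        # Correlate every filter with the current patch in one step
        out[:, i] = np.tensordot(kernels, patch, axes=([1, 2], [0, 1])) + bias
    return np.maximum(out, 0.0)  # ReLU activation

# Hypothetical example: a 4-channel window (3 inertial axes + 1 ECG lead),
# 128 timesteps, 8 filters of width 5 -- all sizes are illustrative.
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 128))
k = rng.standard_normal((8, 4, 5))
b = np.zeros(8)
features = conv1d(x, k, b)
print(features.shape)  # (8, 124)
```

In a full HAR pipeline, several such layers would feed pooling and fully connected layers that map the feature maps to activity classes.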