Accurate classification of human activities is particularly important for the remote monitoring of patients undergoing orthopedic treatment and therapy for various injuries. Previously, we presented a Local Energy-based Shape Histogram (LESH) approach that considers the energy expenditure of various activities and differentiates among them on the basis of energy level. Although that approach recognized most activities effectively, its recognition accuracy for activities such as walking forward, walking forward in a left circle, and walking forward in a right circle was significantly low. In this paper, we present a Convolutional Neural Network (CNN) based approach that recognizes human activities with consistently high accuracy. Experiments are conducted on the Wearable Action Recognition Database (WARD) dataset. The results demonstrate that the CNN-based approach not only achieves high recognition accuracy across all activities but also performs extremely well on activities requiring frequent inter-posture transitions. A further contribution of this work is that the CNN-based approach can identify a single combination of sensors that distinguishes thirteen different activities with high accuracy.
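To make the idea of a CNN over wearable-sensor windows concrete, the following is a minimal illustrative sketch, not the paper's actual architecture: a single 1-D convolution plus global max pooling mapped to thirteen activity classes. The channel count, window length, kernel size, and filter count are all assumptions chosen for illustration, and the weights are random rather than trained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed shapes for illustration: a WARD-style window of 15 sensor
# channels (e.g. multiple body-worn 3-axis sensors) over 64 time steps.
CHANNELS, WINDOW = 15, 64
N_FILTERS, KERNEL = 8, 5
N_CLASSES = 13  # thirteen activities, as in the WARD experiments

def conv1d(x, w, b):
    """Valid 1-D convolution over the time axis.
    x: (channels, time), w: (filters, channels, kernel), b: (filters,)"""
    out_len = x.shape[1] - w.shape[2] + 1
    out = np.empty((w.shape[0], out_len))
    for f in range(w.shape[0]):
        for t in range(out_len):
            out[f, t] = np.sum(x[:, t:t + w.shape[2]] * w[f]) + b[f]
    return out

def forward(x, params):
    """Conv + ReLU, global max pooling over time, dense softmax head."""
    w1, b1, w2, b2 = params
    h = np.maximum(conv1d(x, w1, b1), 0.0)   # conv + ReLU
    h = h.max(axis=1)                        # global max pool over time
    logits = h @ w2 + b2                     # dense layer to 13 classes
    e = np.exp(logits - logits.max())
    return e / e.sum()                       # softmax probabilities

params = (
    rng.normal(0.0, 0.1, (N_FILTERS, CHANNELS, KERNEL)),
    np.zeros(N_FILTERS),
    rng.normal(0.0, 0.1, (N_FILTERS, N_CLASSES)),
    np.zeros(N_CLASSES),
)
window = rng.normal(size=(CHANNELS, WINDOW))  # stand-in for one sensor window
probs = forward(window, params)
print(probs.shape)  # one probability per activity class
```

In practice such a model would be trained end-to-end on labeled sensor windows; this sketch only shows how raw multi-channel time-series data flows through a convolutional classifier.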