A convolutional neural network (CNN) is a widely used class of artificial neural network (ANN) for computer vision, applied mostly in pattern recognition systems. Prominent applications of CNNs include medical image analysis, image classification, object recognition in videos, recommender systems, financial time series analysis, natural language processing, and human–computer interfaces. Moreover, with advances in computing power, the availability of large quantities of labeled data, and improved algorithms, CNNs are now used in almost every area of study. One of the main uses of wearable technology and CNNs in medical surveillance is human activity recognition (HAR), which requires continuous tracking of everyday activities. This paper provides a comprehensive study of the application of CNNs to HAR classification tasks. We trace their evolution, from early precursors to current state-of-the-art deep learning (DL) systems. We describe the working principles of CNNs for HAR tasks, and a CNN-based model is presented to perform the classification of human activities. The proposed technique interprets sequences of sensor inputs using a multi-layered CNN that captures the temporal and spatial characteristics of human activities. The publicly available WISDM dataset for HAR was used to perform this study. The study employs a two-dimensional CNN to build the classification model, and a recent version of Python was used for the implementation. The proposed model achieves an accuracy of 97.20% for HAR in this experiment, which exceeds previously reported state-of-the-art results.
The findings of the study imply that using DL methods for activity recognition can substantially improve accuracy and broaden the range of applications in which HAR can be deployed successfully. We also outline future research trends in the field of HAR in this article.
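To make the data flow concrete, the sketch below shows the two preprocessing and modeling ideas the abstract names: segmenting a tri-axial accelerometer stream (as in WISDM-style data) into fixed-length windows, and applying a 2-D convolution across the time and axis dimensions of one window. The window length, stride, and kernel size here are illustrative assumptions, not the paper's actual hyperparameters, and the convolution is a bare NumPy implementation of what a single CNN layer computes before its nonlinearity.

```python
import numpy as np

def segment(signal, window=90, stride=45):
    """Slice a (T, 3) accelerometer stream into (N, window, 3) windows.
    Window length and stride are illustrative, not the paper's settings."""
    starts = range(0, len(signal) - window + 1, stride)
    return np.stack([signal[s:s + window] for s in starts])

def conv2d_valid(window2d, kernel):
    """Plain 'valid'-mode 2-D cross-correlation, i.e. what one CNN
    convolutional layer computes (before bias and activation)."""
    wh, ww = window2d.shape
    kh, kw = kernel.shape
    out = np.empty((wh - kh + 1, ww - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(window2d[i:i + kh, j:j + kw] * kernel)
    return out

# Synthetic stand-in for a WISDM recording: 1000 samples of x/y/z.
rng = np.random.default_rng(0)
stream = rng.normal(size=(1000, 3))

windows = segment(stream)            # (21, 90, 3)
kernel = rng.normal(size=(5, 3))     # spans 5 time steps and all 3 axes
feature_map = conv2d_valid(windows[0], kernel)
print(windows.shape, feature_map.shape)  # (21, 90, 3) (86, 1)
```

In a full model such a kernel is learned, many kernels are stacked per layer, and the resulting feature maps are pooled and passed to dense layers for activity classification; deep learning frameworks perform this same computation far more efficiently.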