Human activity recognition (HAR) based on sensor data is a significant problem in pervasive computing. In recent years, deep learning has become the dominant approach in this field due to its high accuracy. However, a model trained on data from other users often fails to accurately recognize the activities of a new individual, and this drop in recognition accuracy limits activity recognition in practice. At present, there is little research on transferring deep learning models in this field. To the best of our knowledge, this is the first empirical study of deep transfer learning between users when the target user's data are unlabeled. We compared several widely used algorithms and found that the Maximum Mean Discrepancy (MMD) method is the most suitable for HAR. We also studied the distribution of features generated from sensor data, improved the existing method from the perspective of feature distribution by adding a center loss, and obtained better results. The observations and insights in this study deepen the understanding of transfer learning in the activity recognition field and provide guidance for further research.
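The abstract does not give implementation details, but the core idea of MMD-based adaptation with unlabeled target data can be illustrated with a minimal sketch. The snippet below, written in PyTorch with a Gaussian kernel, computes an empirical MMD penalty between source and target feature batches from a shared encoder; the feature dimensions, batch sizes, and kernel bandwidth are illustrative assumptions, not values reported in the paper.

```python
import torch

def gaussian_kernel(x, y, sigma=1.0):
    # Pairwise squared Euclidean distances between two feature batches
    dist = torch.cdist(x, y) ** 2
    return torch.exp(-dist / (2 * sigma ** 2))

def mmd_loss(source_feats, target_feats, sigma=1.0):
    # Empirical squared MMD between source and target feature distributions
    k_ss = gaussian_kernel(source_feats, source_feats, sigma).mean()
    k_tt = gaussian_kernel(target_feats, target_feats, sigma).mean()
    k_st = gaussian_kernel(source_feats, target_feats, sigma).mean()
    return k_ss + k_tt - 2 * k_st

# Example: features produced by a shared encoder for labeled source windows
# and unlabeled target windows (shapes are hypothetical).
source = torch.randn(32, 64)   # 32 source samples, 64-dim features
target = torch.randn(32, 64)   # 32 target samples, 64-dim features
print(mmd_loss(source, target))
```

In a full training loop, this penalty would typically be added to the source-domain classification loss (and, per the abstract, a center loss on the source features) with weighting coefficients chosen on a validation set.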
Activity recognition has become a popular research branch in the field of pervasive computing in recent years. A large body of experiments shows that sensor-based activity data are characterized by variety, volume, and velocity. Deep learning, together with its various models, is one of the most effective ways of working on activity data. Nevertheless, there is no clear understanding of why it performs so well or how to make it more effective. To address this problem, we first applied a convolutional neural network to the Human Activity Recognition Using Smartphones Data Set. Second, we visualized the sensor-based activity features extracted by the neural network. We then analyzed these visualizations in depth, explored the relationship between activities and features, and examined how the network identifies activities based on these features. After that, we extracted the significant features related to the activities and fed them into a DNN-based fusion model, which improved the classification rate to 96.1%. To our knowledge, this is the first work to visualize abstract sensor-based activity data features. Based on the results, the method proposed in the paper promises accurate classification for sensor-based activity recognition.
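As a rough illustration of the pipeline described above (a CNN over inertial sensor windows followed by inspection of the learned feature space), the sketch below defines a small 1D convolutional encoder and projects its features to 2D with t-SNE. The network architecture, window shape, and the use of t-SNE are assumptions for illustration; the paper's actual model and visualization method may differ.

```python
import torch
import torch.nn as nn
from sklearn.manifold import TSNE

# Hypothetical 1D CNN over windows of 9-channel inertial signals (128 timesteps),
# loosely following the smartphone HAR setup described above.
class ConvEncoder(nn.Module):
    def __init__(self, in_channels=9, n_classes=6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(in_channels, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):
        feats = self.features(x).squeeze(-1)   # (batch, 64) activity features
        return feats, self.classifier(feats)

# Project the learned feature space to 2D for visualization.
model = ConvEncoder()
windows = torch.randn(200, 9, 128)             # dummy sensor windows
with torch.no_grad():
    feats, _ = model(windows)
embedded = TSNE(n_components=2, init="pca").fit_transform(feats.numpy())
print(embedded.shape)                          # (200, 2) points ready for plotting
```

Coloring the 2D points by activity label is a common way to inspect whether the extracted features separate the activity classes, which is the kind of analysis the abstract refers to.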