Deep Convolutional Neural Networks (DCNNs) are currently popular in human activity recognition (HAR) applications. However, many of these research results cannot be applied in practice to modern sensor-based AI games on portable devices (e.g., smartphones, VR/AR headsets). DCNNs are typically resource-intensive and too large to be deployed on such devices, which limits their practical use for complex activity detection. Moreover, since portable devices lack high-performance Graphics Processing Units (GPUs), the Action Game (ACT) experience sees little improvement. Furthermore, to handle multi-sensor collaboration, previous HAR models typically treated the representations from different sensor signal sources equally, even though distinct types of activities call for different fusion strategies. In this paper, a novel scheme is proposed for training 2-bit Convolutional Neural Networks with weights and activations constrained to {-0.5, 0, 0.5}, while taking into account the correlation between the different sensor signal sources and the activity types. This model, which we refer to as DFTerNet, aims at more reliable inference and better trade-offs for practical applications. It is known that quantizing weights and activations can substantially reduce memory size and replace floating-point and matrix operations with more efficient bitwise operations, yielding much faster computation and lower power consumption. Our basic idea is to apply quantization of weights and activations directly to pre-trained filter banks and to adopt dynamic fusion strategies for different activity types. Experiments demonstrate that the dynamic fusion strategy can exceed baseline model performance by up to ∼5% on activity recognition datasets such as OPPORTUNITY and PAMAP2. Using the proposed quantization method, we achieve performance close to that of the full-precision counterpart. These results were also verified on the UniMiB-SHAR dataset. In addition, the proposed method achieves ∼9× acceleration on CPUs and ∼11× memory saving.
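As a rough sketch of the kind of ternary quantization described above, the snippet below maps a pre-trained filter bank to the set {-0.5, 0, 0.5} using a simple symmetric threshold. The threshold value and the `ternarize` helper are illustrative assumptions, not the exact DFTerNet quantization rule.

```python
import numpy as np

def ternarize(w, threshold=0.05):
    """Quantize a weight tensor to the set {-0.5, 0, 0.5}.

    The fixed threshold is an assumption for illustration; the paper's
    scheme derives its quantization from the pre-trained filter banks.
    """
    q = np.zeros_like(w)
    q[w > threshold] = 0.5    # strong positive weights
    q[w < -threshold] = -0.5  # strong negative weights
    return q                  # values near zero are pruned to 0

# Toy usage: quantize a pre-trained 3x3 convolution filter.
w = np.random.randn(3, 3) * 0.1
print(ternarize(w))
```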
With advances in geo-positioning technologies and geo-location services, a rapidly growing volume of spatio-temporal data is being collected in applications such as location-aware devices and wireless communication, in which an object is described by its spatial location and its timestamp. Consequently, the study of spatio-temporal search, which exploits both the geo-location and the temporal information of the data, has attracted significant attention from research organizations and commercial communities. This work studies the problem of spatio-temporal k-nearest neighbors search (STkNNS), which is fundamental among spatio-temporal queries. Based on HBase, a novel index structure called the Hybrid Spatio-Temporal HBase Index (HSTI for short) is proposed; it is carefully designed to take both spatial and temporal information into consideration and thereby effectively reduce the search space. Based on HSTI, an efficient algorithm is developed for spatio-temporal k-nearest neighbors search. Comprehensive experiments on real and synthetic data clearly show that HSTI is three to five times faster than the state-of-the-art technique.
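To illustrate the general idea behind a hybrid spatio-temporal key, the sketch below builds an HBase rowkey from a coarse Z-order spatial cell followed by a time bucket, so that objects close in space and time share a key prefix and a kNN search can be answered with short range scans. The `hybrid_rowkey` helper, grid resolution, and bucket size are assumptions for illustration only, not the actual HSTI layout.

```python
from datetime import datetime, timezone

def hybrid_rowkey(lat, lon, ts, spatial_bits=16, time_bucket_s=3600):
    """Build a rowkey that combines a coarse spatial cell with a time bucket.

    Objects that are close in space and time share a key prefix, so
    candidate retrieval becomes a small number of HBase range scans.
    """
    # Coarse grid cell via Z-order (Morton) interleaving of lon and lat.
    x = int((lon + 180.0) / 360.0 * (1 << spatial_bits))
    y = int((lat + 90.0) / 180.0 * (1 << spatial_bits))
    cell = 0
    for i in range(spatial_bits):
        cell |= ((x >> i) & 1) << (2 * i) | ((y >> i) & 1) << (2 * i + 1)
    width = 2 * spatial_bits // 4        # hex digits needed for the cell id
    bucket = int(ts.timestamp()) // time_bucket_s
    return f"{cell:0{width}x}-{bucket:010d}"

# Toy usage: key for an object observed in Manhattan at noon UTC.
print(hybrid_rowkey(40.75, -73.99, datetime(2018, 6, 1, 12, tzinfo=timezone.utc)))
```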