Kernel adaptive filtering (KAF) is an effective nonlinear learning framework that has been widely used in time series prediction. Traditional KAF is based on the stochastic gradient descent (SGD) method, which suffers from slow convergence and limited filtering accuracy. Hence, a kernel conjugate gradient (KCG) algorithm has been proposed that offers low computational complexity while achieving performance comparable to KAF algorithms such as kernel recursive least squares (KRLS). However, the robustness of KCG to outliers remains unsatisfactory. Meanwhile, correntropy, a local similarity measure defined in kernel space, can suppress large outliers in robust signal processing. Building on correntropy, mixture correntropy uses a mixture of two Gaussian functions as its kernel to further improve learning performance. Accordingly, this article proposes a novel KCG algorithm, named the kernel mixture correntropy conjugate gradient (KMCCG), built on the mixture correntropy criterion (MCC). The proposed algorithm has lower computational complexity and achieves better performance in non-Gaussian noise environments. To control the growth of the radial basis function (RBF) network in this algorithm, we also use a simple sparsification criterion based on the angle between elements in the reproducing kernel Hilbert space (RKHS). Prediction results on a synthetic chaotic time series and a real benchmark dataset show that the proposed algorithm achieves better accuracy at lower computational cost than the compared methods. In addition, the proposed algorithm is successfully applied to the practical task of malware prediction in the field of malware analysis. The results demonstrate that the proposed algorithm not only has a short training time but also achieves high prediction accuracy.
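The two ingredients named above, the mixture correntropy measure and the angle-based sparsification rule, can be summarized in a short sketch. This is a minimal illustration, not the paper's implementation: the mixing weight, kernel bandwidths, and admission threshold below are hypothetical defaults.

```python
import numpy as np

def gaussian_kernel(x, y, sigma):
    """Gaussian (RBF) kernel between two input vectors."""
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return np.exp(-np.dot(diff, diff) / (2.0 * sigma ** 2))

def mixture_correntropy(e, alpha=0.5, sigma1=1.0, sigma2=4.0):
    """Mixture correntropy of a prediction error: a convex combination
    of two Gaussian functions with different bandwidths (parameters are
    illustrative, not the paper's settings)."""
    g1 = np.exp(-e ** 2 / (2.0 * sigma1 ** 2))
    g2 = np.exp(-e ** 2 / (2.0 * sigma2 ** 2))
    return alpha * g1 + (1.0 - alpha) * g2

def is_novel(x_new, dictionary, kernel_sigma, angle_threshold=0.95):
    """Angle-based sparsification in the RKHS.

    For the normalized Gaussian kernel, kappa(x, x) = 1, so the cosine of
    the angle between phi(x_new) and a stored center phi(c_j) is simply
    kappa(x_new, c_j).  The new sample is admitted to the dictionary only
    if it is not nearly parallel to any stored center.
    """
    if not dictionary:
        return True
    cosines = [gaussian_kernel(x_new, c, kernel_sigma) for c in dictionary]
    return max(cosines) < angle_threshold
```

Because the Gaussian kernel is already normalized, the angle test reduces to a single kernel evaluation per stored center, which is what keeps this sparsification rule cheap.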
Recent years have witnessed the rapid development of data-driven machine learning methods, which have achieved highly effective results in communication systems. Kernel learning is a typical nonlinear learning method in the machine learning community. This article proposes two novel correntropy-based kernel learning algorithms to improve the accuracy of indoor positioning in WiFi-based wireless networks. Correntropy, a measure of local similarity defined in kernel space, can be used in robust signal processing to suppress large outliers. By combining the maximum mixture correntropy criterion (MMCC) with online vector quantization (VQ), we develop a learning algorithm named the quantized kernel MMCC (QKMMCC), which retains the robustness of correntropy while VQ curbs the growth of the memory structure and reduces the computational cost. Furthermore, to exploit redundant information and further improve learning accuracy, an enhanced variant, QKMMCC_BG, is proposed on the basis of the bilateral gradient (BG) technique. Simulation results show that, compared with similar approaches, our algorithms achieve better computational performance. In addition, the proposed algorithms are applied to indoor positioning in WiFi-based wireless networks. The experimental results show that the kernel learning algorithms effectively improve positioning accuracy: the average positioning errors of the two algorithms are 0.86 m and 0.76 m, respectively, which further verifies their effectiveness.
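As a rough illustration of how online VQ can be combined with a mixture-correntropy objective in a kernel filter, the following QKLMS-style sketch applies a mixture-correntropy-weighted error in the coefficient update and merges nearby inputs into existing centers. The class name, parameter values, and exact update form are illustrative assumptions, not the QKMMCC recursion from the paper.

```python
import numpy as np

class QuantizedMixtureCorrentropyFilter:
    """QKLMS-style online kernel filter sketched under a mixture
    correntropy objective with online vector quantization (VQ)."""

    def __init__(self, step_size=0.5, sigma_kernel=1.0,
                 alpha=0.5, sigma1=1.0, sigma2=4.0, quant_radius=0.3):
        self.eta = step_size
        self.sigma_kernel = sigma_kernel
        self.alpha, self.sigma1, self.sigma2 = alpha, sigma1, sigma2
        self.quant_radius = quant_radius
        self.centers, self.coeffs = [], []

    def _kernel(self, x, c):
        diff = np.asarray(x, float) - np.asarray(c, float)
        return np.exp(-np.dot(diff, diff) / (2.0 * self.sigma_kernel ** 2))

    def predict(self, x):
        return sum(a * self._kernel(x, c)
                   for a, c in zip(self.coeffs, self.centers))

    def update(self, x, d):
        """One online step on input x with desired output d."""
        e = d - self.predict(x)
        # Error weighting induced by the mixture correntropy objective:
        # large errors (outliers) receive an exponentially small weight.
        w = (self.alpha / self.sigma1 ** 2
             * np.exp(-e ** 2 / (2 * self.sigma1 ** 2))
             + (1 - self.alpha) / self.sigma2 ** 2
             * np.exp(-e ** 2 / (2 * self.sigma2 ** 2)))
        gain = self.eta * w * e

        if not self.centers:
            self.centers.append(np.asarray(x, float))
            self.coeffs.append(gain)
            return e
        # Online VQ: merge into the nearest center if it is close enough,
        # which bounds the size of the growing RBF network.
        dists = [np.linalg.norm(np.asarray(x, float) - c) for c in self.centers]
        j = int(np.argmin(dists))
        if dists[j] <= self.quant_radius:
            self.coeffs[j] += gain
        else:
            self.centers.append(np.asarray(x, float))
            self.coeffs.append(gain)
        return e
```

The quantization radius trades memory and computation against resolution: a larger radius keeps the center set small, while a radius of zero recovers an unquantized, fully growing network.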
This paper proposes a multi-target localization algorithm based on compressed sensing. The multi-target localization problem is recast as a compressed sensing recovery problem. The algorithm greatly reduces the communication load of the wireless network by offloading most of the computation to a central server. The method makes full use of prior information about the signal and its support set, and combines Kalman filtering with Bayesian compressed sensing to improve localization accuracy and noise immunity. Simulation results show that the proposed method achieves better noise immunity, robustness, and localization accuracy than traditional localization methods.
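A minimal sketch of the grid-based sparse formulation behind this kind of compressed-sensing localization is shown below. It uses orthogonal matching pursuit as a stand-in recovery step rather than the Kalman-filter-plus-Bayesian-CS combination described above, and the measurement model, dimensions, and noise level are illustrative assumptions.

```python
import numpy as np

def omp(Phi, y, sparsity):
    """Orthogonal matching pursuit: recover a K-sparse vector theta
    such that y is approximately Phi @ theta."""
    residual = y.copy()
    support = []
    theta = np.zeros(Phi.shape[1])
    for _ in range(sparsity):
        # Pick the column most correlated with the current residual.
        j = int(np.argmax(np.abs(Phi.T @ residual)))
        if j not in support:
            support.append(j)
        # Least-squares fit on the current support.
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    theta[support] = coef
    return theta

# Toy setup: M sensors observe signals from targets on an N-point grid.
rng = np.random.default_rng(0)
M, N, K = 20, 100, 2                      # sensors, grid points, targets
Psi = rng.standard_normal((M, N))         # assumed measurement/propagation model
theta_true = np.zeros(N)
theta_true[rng.choice(N, K, replace=False)] = 1.0   # targets occupy K grid cells
y = Psi @ theta_true + 0.01 * rng.standard_normal(M)

theta_hat = omp(Psi, y, K)
print("estimated target cells:", np.flatnonzero(theta_hat > 0.5))
```

The sparsity of the target-indicator vector is what lets far fewer measurements than grid points suffice, which is the source of the communication savings mentioned above.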
This paper proposes a method to reduce the impact of time delay and packet loss, both of which seriously degrade the serviceability of remotely operated mechanical systems. The stability of a nonlinear bilateral teleoperation system with force feedback and a similar structural configuration is discussed, and the conditions for system stability are studied. A self-adaptive algorithm is designed based on the relations among the sampling time, the delay time, and the varying data-transfer process in the nonlinear discrete teleoperation system. Experiments confirm that the method effectively reduces data error and increases accuracy while preserving the stability of the system.