The use of large numbers of power amplifiers introduces severe nonlinear distortion into 5G communication systems, which conventional digital predistortion (DPD) struggles to linearize. To address this challenge, we propose a digitally assisted nonlinearity suppression architecture for multiple-antenna arrays. In this architecture, an auxiliary array generates a nonlinear cancelation signal that cancels the distortion in the main beam at the receiving end. We develop a two-step method to model the nonlinear signal and combine it with a neural network to complete the modeling. In addition, we introduce a novel method for estimating and tracking the channel to update the model coefficients. To verify the proposed methods, we design a test scenario and compare the performance of our architecture with that of conventional DPD. Our experiments demonstrate that the proposed methods improve the adjacent channel leakage ratio (ACLR) by 12 dB while also reducing in-band distortion. Moreover, our architecture outperforms conventional DPD even under stronger distortion.
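The abstract does not give the authors' model, but the kind of power-amplifier nonlinearity it targets is commonly captured by a memory-polynomial model fitted by least squares. The sketch below is a minimal, hypothetical illustration of that standard baseline (not the paper's two-step neural-network method): a toy PA with a third-order term and one tap of memory is identified from its input/output samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy complex-baseband PA input (hypothetical signal, unit average power).
x = (rng.standard_normal(2000) + 1j * rng.standard_normal(2000)) / np.sqrt(2)

# Assumed toy PA: mild 3rd-order nonlinearity plus one-tap memory.
y = x + 0.1 * x * np.abs(x) ** 2 + 0.05 * np.roll(x, 1)

def mp_basis(x, K=3, M=2):
    """Memory-polynomial regressors x[n-m] * |x[n-m]|^(k-1) for odd k."""
    cols = []
    for m in range(M):
        xm = np.roll(x, m)           # delayed copy (circular, for simplicity)
        for k in range(1, K + 1, 2): # odd orders 1, 3, ...
            cols.append(xm * np.abs(xm) ** (k - 1))
    return np.stack(cols, axis=1)

# Least-squares fit of the memory-polynomial coefficients.
Phi = mp_basis(x)
coeffs, *_ = np.linalg.lstsq(Phi, y, rcond=None)
y_hat = Phi @ coeffs

# Normalized mean-square modeling error in dB.
nmse = 10 * np.log10(np.sum(np.abs(y - y_hat) ** 2) / np.sum(np.abs(y) ** 2))
print(f"model NMSE: {nmse:.1f} dB")
```

Because the toy PA lies exactly in the model class here, the fit is essentially exact; a real amplifier would leave a residual, which is what richer models such as the paper's neural-network approach aim to shrink.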
In recent years, industrial Internet of Things (IIoT) scenarios have shown an increasing demand for high-precision positioning, and cellular-based positioning plays a very important role in this application area. A cellular-based positioning system can be categorized as an asymmetric processing system: multiple base stations transmit positioning reference signals to one device, and the device derives its coordinates from measurements of these signals using certain positioning algorithms. Classical positioning algorithms face great challenges in satisfying the stringent precision requirements. Meanwhile, artificial intelligence (AI)-based positioning solutions have drawn great attention due to their strong ability to improve positioning accuracy in IIoT scenarios. For AI-based algorithms, generalization capability, i.e., the ability to adapt to different environments, is an important metric; however, little literature investigates it. In this article, we tackle this issue by considering typical features of IIoT scenarios, including clutter distribution, network synchronization error, and receiver timing error. The impact of these features on generalization capability is first evaluated, and then the feasibility of popular generalization-improvement solutions, i.e., optimized training data sets and fine-tuning, is tested under different cases. Finally, directions to further guarantee generalization capability are presented. The results of this article provide useful experience for developing AI positioning models for realistic IIoT scenarios.