2015
DOI: 10.1049/el.2015.0276
Digital predistortion method combining memory polynomial and feed‐forward neural network

Abstract: A baseband digital predistortion (DPD) technique based on a feedforward neural network (FFNN) is presented. The process of memory polynomial (MP) DPD is time consuming because of the large number of mathematical calculations. The FFNN is adopted to realise the mathematical calculations in MP DPD with direct learning architecture (DLA). The training samples of the FFNN are derived from MP DPD with DLA. It guarantees the accuracy of imitating the MP DPD. Although the training of the FFNN is time consuming, the t…
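The abstract refers to the memory polynomial model that the FFNN is trained to imitate. As a hedged illustration (not the authors' implementation, and with illustrative parameter names), a standard MP predistorter computes y[n] = Σ_k Σ_m a_{k,m} · x[n−m] · |x[n−m]|^k; a minimal NumPy sketch:

```python
import numpy as np

def memory_polynomial(x, coeffs):
    """Evaluate y[n] = sum_k sum_m coeffs[k, m] * x[n-m] * |x[n-m]|**k.

    x      : complex baseband samples, shape (N,)
    coeffs : coefficients, shape (K, M) for nonlinearity order K
             and memory depth M (illustrative names).
    """
    K, M = coeffs.shape
    N = len(x)
    y = np.zeros(N, dtype=complex)
    for m in range(M):
        # x delayed by m samples, zero-padded at the start
        xm = np.concatenate([np.zeros(m, dtype=complex), x[: N - m]])
        for k in range(K):
            y += coeffs[k, m] * xm * np.abs(xm) ** k
    return y
```

With K = 1, M = 1 and a unit coefficient the model reduces to the identity, which is a quick sanity check.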

Cited by 9 publications (3 citation statements)
References 10 publications
“…It is a technique based on a program that learns from data and that works similarly to the way the human brain works. The common neural networks are feedforward neural networks (FFNNs), 17 convolutional neural networks (CNNs), 18 recurrent neural networks (RNNs), 19 deep belief networks (DBNs), 20 generative adversarial networks (GANs), 21 and spiking neural networks (SNNs). 22 This study uses a back propagation (BP) neural network, which is one of the FFNNs.…”
Section: Composition Of Neural Network Prediction Model
confidence: 99%
“…We refer to our architecture as an Attention Guided Memory Polynomial Neural Network (AGMPNN). Efforts have previously been made to combine the robustness and efficiency of the MPM with NN-assisted solutions in different settings: these were typically trained in a closed-loop manner and revolved around modelling the MPM by a very small multilayer perceptron [8], or cascading a NN DPD model together with the MPM to boost its predistortion capability. Our solution is adapted for the ILA and is an end-to-end DPD module.…”
Section: Neural Attention Assisted DPD
confidence: 99%
“…A feature that unifies all these architectures is that the NN forms part of the DPD datapath; therefore, the corresponding calculations have to be performed at the clock rate of the baseband signal. Given that most NN architectures are considerably larger and more complex than polynomial models, this represents a drawback, especially in terms of power consumption, and attempts have been made to alleviate this issue by sparsifying the network connections or reducing the sampling rate of the DPD calculations [14][15][16][17][18][19][20][21][22][23][24]. In this paper, we propose a new approach that employs an artificial NN for estimating the polynomial coefficients.…”
Section: Introduction
confidence: 99%
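The citing work above proposes using a NN to estimate the polynomial coefficients. The classical baseline it would replace is a linear least-squares fit of the MP coefficients from measured input/output data; a hedged sketch of that baseline (helper names are illustrative, not from the cited paper):

```python
import numpy as np

def mp_basis(x, K, M):
    """Regression matrix whose columns are x[n-m] * |x[n-m]|**k
    for k < K, m < M (illustrative helper)."""
    N = len(x)
    cols = []
    for m in range(M):
        xm = np.concatenate([np.zeros(m, dtype=complex), x[: N - m]])
        for k in range(K):
            cols.append(xm * np.abs(xm) ** k)
    return np.stack(cols, axis=1)          # shape (N, K*M)

def estimate_coeffs(x, y, K, M):
    """Least-squares fit of MP coefficients mapping x -> y."""
    Phi = mp_basis(x, K, M)
    a, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return a.reshape(M, K).T               # back to (K, M) layout
```

Because the model is linear in its coefficients, the fit is a single `lstsq` solve; the NN-based approach in the quote targets the same coefficients without forming and solving this regression explicitly.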