2023
DOI: 10.1109/tmtt.2022.3232895
Bridging the Gap Between Artificial Neural Networks and Kernel Regressions for Vector-Valued Problems in Microwave Applications

Abstract: Thanks to their convex formulation, kernel regressions have shown improved accuracy with respect to artificial neural network (ANN) structures in regression problems where only a reduced set of training samples is available. However, despite the above interesting features, kernel regressions are inherently less flexible than ANN structures, since their implementations are usually limited to scalar-output regression problems. This article presents a vector-valued (multioutput) formulation of the kernel ridge regr…

Cited by 8 publications (4 citation statements)
References 40 publications
“…An alternative modeling approach consists of applying a vector-valued formulation of the KRR, which allows one to directly account for the vector-valued nature of the regression problem at hand [6], [7], [17]. Specifically, the vector-valued KRR can be directly adopted to train a single vector-valued surrogate model M KRR : X → Y, which, according to the representer theorem for vector-valued regression problems presented in [44], writes…”
Section: Vector-Valued KRR
confidence: 99%
See 2 more Smart Citations
“…An alternative modeling approach consists of applying a vector-valued formulation of the KRR, which allows to directly account for the vector-valued nature of the regression problem at hand [6], [7], [17]. Specifically, the vector-valued KRR can be directly adopted to train a single vector-valued surrogate model M KRR : X → Y, which according to the represented theorem for vector-valued regression problems presented in [44], writes…”
Section: Vector-valued Krrmentioning
confidence: 99%
“…Many of the first successful approaches were based on polynomial chaos expansion (for an overview of this family of algorithms, see [3]). These were followed by other machine-learning-inspired solutions, falling mainly into the categories of kernel machine regression (e.g., [4], [5], [6], [7], [8], [9]) and artificial neural networks (ANNs) (e.g., [10], [11], [12], [13], [14]). This article focuses on kernel machine regression techniques, such as the support vector machine (SVM) regression [15], the least-squares support vector machine (LS-SVM) regression [16], and the more recent vector-valued kernel ridge regression (KRR) [6], [7], [17], which have been proven particularly effective for various microelectronics and radio-frequency applications [4], [5], [8].…”
Section: Introduction
confidence: 99%
“…A slow AUX turn-on often leads to a significant gain drop in both the back-off and the Doherty region [6,18,21]. In conclusion, the DPA design is affected by several competing constraints, calling for advanced optimization methods, most recently including neural networks [22][23][24] and digital control [25,26].…”
Section: Introduction
confidence: 99%