2022
DOI: 10.1142/s0219691322500485
Optimality of the rescaled pure greedy learning algorithms

Abstract: We propose the Rescaled Pure Greedy Learning Algorithm (RPGLA) for solving the kernel-based regression problem. The computational complexity of the RPGLA is lower than that of the Orthogonal Greedy Learning Algorithm (OGLA) and the Relaxed Greedy Learning Algorithm (RGLA). We obtain convergence rates of the RPGLA for continuous kernels. When the kernel is infinitely smooth, we derive a convergence rate that can be arbitrarily close to the best rate [Formula: see text] under a mild assumption on the regression function.
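The abstract contains no pseudocode, so the following is a minimal, hypothetical Python sketch of one way a rescaled pure greedy iteration could look in the kernel setting: the dictionary consists of the kernel sections K(x_i, ·), selection picks the element with the largest empirical inner product with the residual, the update is a one-dimensional step along that element, and the whole iterate is then rescaled by a single scalar fit to the data. The function names (`rpgla`, `gaussian_kernel`), the Gaussian kernel choice, and the exact form of the empirical inner products and rescaling are assumptions for illustration, not the paper's formulation.

```python
import numpy as np


def gaussian_kernel(x, y, sigma=1.0):
    # Gaussian kernel on 1-D inputs; an infinitely smooth kernel, the
    # regime in which the abstract's near-best rate is claimed.
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2.0 * sigma ** 2))


def rpgla(X, y, n_iter=50, sigma=1.0):
    """Hypothetical sketch of a rescaled pure greedy learning iteration.

    Dictionary: the kernel sections K(x_i, .) restricted to the sample,
    i.e. the columns of the Gram matrix G. All details below are
    illustrative assumptions, not the paper's exact algorithm.
    """
    m = len(y)
    G = gaussian_kernel(X, X, sigma)      # Gram matrix, G[i, j] = K(x_i, x_j)
    coef = np.zeros(m)                    # dictionary coefficients of the estimator
    fhat = np.zeros(m)                    # current estimator evaluated on the sample
    for _ in range(n_iter):
        r = y - fhat                      # empirical residual
        scores = np.abs(G @ r) / m        # empirical inner products <r, K(x_j, .)>
        j = int(np.argmax(scores))        # pure greedy selection
        g = G[:, j]
        # One-dimensional (pure greedy) step along the selected element.
        step = (r @ g) / (g @ g)
        cand = fhat + step * g
        cand_coef = coef.copy()
        cand_coef[j] += step
        # Rescaling: multiply the whole iterate by the scalar that best
        # fits y, i.e. project y onto span{cand}.
        denom = cand @ cand
        s = (y @ cand) / denom if denom > 0 else 0.0
        fhat = s * cand
        coef = s * cand_coef
    return coef


# Usage sketch on synthetic data; predictions at new points use the
# fitted coefficients against the training inputs.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, 100)
y = np.sin(3.0 * X) + 0.1 * rng.standard_normal(100)
coef = rpgla(X, y, n_iter=30)
X_new = np.linspace(-1.0, 1.0, 5)
pred = gaussian_kernel(X_new, X) @ coef   # estimator sum_j coef[j] K(X[j], x)
```

The rescaling step is what distinguishes this family from the plain pure greedy algorithm: instead of keeping the raw greedy iterate, the entire current approximant is multiplied by one optimally chosen scalar, which is cheap compared with the full orthogonal projection used by orthogonal greedy methods.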

Cited by 3 publications (2 citation statements)
References 33 publications
“…and N = 1, then the VWRPGA degenerates into the RPGA. In [5], the authors applied the RPGA to kernel-based regression. They defined the Rescaled Pure Greedy Learning Algorithm (RPGLA) and studied its efficiency.…”
Section: Discussion
Confidence: 99%
“…Approximation by a sparse linear combination of elements from a fixed redundant family is widely used because of its concise representations and computational efficiency. It has been applied broadly in signal processing, image compression, machine learning, and PDE solvers (see [1][2][3][4][5][6][7][8][9][10]). Among others, simultaneous sparse approximation has been utilized in signal vector processing and multi-task learning (see [11][12][13][14]).…”
Section: Introduction
Confidence: 99%