2017
DOI: 10.1142/s0219530516500202

Regularized learning schemes in feature Banach spaces

Abstract: This paper proposes a unified framework for the investigation of constrained learning theory in reflexive Banach spaces of features via regularized empirical risk minimization. The focus is placed on Tikhonov-like regularization with totally convex functions. This broad class of regularizers provides a flexible model for various priors on the features, including in particular hard constraints and powers of Banach norms. In such context, the main results establish a new general form of the representer theorem …
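The easiest special case of the framework to make concrete is Tikhonov regularization with a squared Hilbert norm, i.e. ordinary ridge regression; the paper's general theory replaces this with powers of Banach norms and other totally convex regularizers. A minimal numerical sketch of the Hilbert special case only (not the paper's algorithm; all names are illustrative):

```python
import numpy as np

# Regularized empirical risk minimization with squared loss and a
# squared-norm (Tikhonov) penalty -- the Hilbert-space special case:
#   min_w  (1/n) ||X w - y||^2 + lam ||w||^2
# Closed form: w = (X^T X / n + lam I)^{-1} (X^T y / n)

def tikhonov_erm(X, y, lam):
    n, d = X.shape
    A = X.T @ X / n + lam * np.eye(d)
    b = X.T @ y / n
    return np.linalg.solve(A, b)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = np.array([1.0, -2.0, 0.0, 0.5, 3.0])
y = X @ w_true + 0.01 * rng.normal(size=200)
w_hat = tikhonov_erm(X, y, lam=1e-6)   # near-OLS for tiny lam
```

With a small `lam` and low noise, `w_hat` recovers `w_true` up to noise level; larger `lam` trades variance for bias.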

Cited by 17 publications (21 citation statements)
References 68 publications (134 reference statements)
“…Concerning statistical properties (such as consistency, learning rates and generalization upper bounds for the risk) of Banach-valued learning algorithms, these were also investigated in [27], [51], [49], [16], albeit only in the case of independent training data.…”
Section: Discussion
confidence: 99%
“…Following [16], [57], as a direction for future work, one can investigate the geometrical properties of the underlying Banach space norm so as to ensure the possibility of learning in the normed space on the one hand, and to satisfy the smoothness assumption A1 on the other. In such a situation, the concentration results presented in this paper will apply and have the potential to provide a pivotal tool in the analysis of such schemes for weakly dependent data.…”
Section: Discussion
confidence: 99%
“…Variational regularization methods with nonquadratic functionals such as total variation or ℓ1-norms have evolved into a standard tool in inverse problems [10,34], image processing [13], compressed sensing [12], and recently related fields such as learning theory [15]. The popularity of such approaches stems from superior structural properties compared to other regularization approaches.…”
Section: Introduction
confidence: 99%
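As a concrete instance of the nonquadratic regularizers this statement mentions, ℓ1-regularized least squares can be solved by the standard ISTA proximal-gradient iteration. A minimal sketch, not taken from the cited works; all names are illustrative:

```python
import numpy as np

# l1-regularized least squares via ISTA (proximal gradient):
#   min_w  (1/2) ||X w - y||^2 + lam ||w||_1

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(X, y, lam, n_iter=500):
    L = np.linalg.norm(X, 2) ** 2        # Lipschitz constant of the gradient
    w = np.zeros(X.shape[1])
    for _ in range(n_iter):
        grad = X.T @ (X @ w - y)         # gradient of the smooth part
        w = soft_threshold(w - grad / L, lam / L)
    return w

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 20))
w_true = np.zeros(20)
w_true[:3] = [2.0, -1.5, 1.0]            # sparse ground truth
y = X @ w_true + 0.01 * rng.normal(size=100)
w_hat = ista(X, y, lam=0.1)
```

The soft-thresholding step is what produces the sparse solutions ("superior structural properties") that the quoted passage alludes to.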
“…This type of kernel describes Banach spaces of analytic functions and includes generalizations of the exponential and polynomial kernels as well as, in the complex case, generalizations of the Szegö and Bergman kernels. […] solve the dual problem as well as to compute the solution of the primal (infinite-dimensional) problem. This is what is known as the kernel trick, and it is what makes support vector regression effective and so popular in applications [21]. Learning in Banach spaces of functions is an emerging area of research which in principle permits learning problems with more general types of norms than Hilbert norms [5,10,27]. The main motivation for this generalization comes from the need for more effective sparse representations of data or for feature selection.…”
confidence: 99%
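The kernel trick mentioned above can be illustrated with kernel ridge regression (used here instead of support vector regression because it has a closed form): the primal problem lives in a possibly infinite-dimensional feature space, yet fitting and prediction only ever evaluate the kernel on finitely many data points. A minimal sketch with an exponential kernel; all names are illustrative:

```python
import numpy as np

# Kernel ridge regression: the solution of the infinite-dimensional primal
# problem is a finite kernel expansion over the training points.

def exp_kernel(A, B, gamma=1.0):
    # Exponential kernel k(x, z) = exp(gamma * <x, z>), positive definite
    return np.exp(gamma * (A @ B.T))

def fit_kernel_ridge(X, y, lam, kernel):
    K = kernel(X, X)                     # n x n Gram matrix
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return alpha                         # dual coefficients

def predict(alpha, X_train, X_new, kernel):
    return kernel(X_new, X_train) @ alpha

rng = np.random.default_rng(2)
X = rng.uniform(-1, 1, size=(80, 2))
y = np.exp(X[:, 0]) - X[:, 1] ** 2       # smooth target in the kernel's space
alpha = fit_kernel_ridge(X, y, lam=1e-6, kernel=exp_kernel)
y_hat = predict(alpha, X, X, kernel=exp_kernel)
```

Only `kernel(·, ·)` evaluations appear; the (infinite) feature map induced by the power series of `exp` is never instantiated.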