Proceedings of the Sixth Annual Conference on Computational Learning Theory - COLT '93 1993
DOI: 10.1145/168304.168357
Rate of approximation results motivated by robust neural network learning

Cited by 50 publications (31 citation statements)
References 12 publications
“…We now restate in terms of G-variation the Maurey-Jones-Barron theorem [24], [10], [2] and its extension to L^p-spaces [7]. Theorem 5.1.…”
Section: Rates Of Decrease Of Infima With Increasing Complexity Of Ad…
mentioning
confidence: 99%
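For context, the Hilbert-space rate that the excerpts above restate in terms of G-variation is commonly given as follows. This is a sketch of the standard formulation, not necessarily the exact statement of Theorem 5.1 in the cited work:

```latex
% Maurey-Jones-Barron rate (Hilbert-space case), commonly stated as:
% if f lies in the closed convex hull of a bounded set G in a Hilbert
% space, with \sup_{g \in G} \|g\| \le b, then for every n there is a
% convex combination f_n of n elements of G with
\[
  \|f - f_n\|^2 \;\le\; \frac{b^2 - \|f\|^2}{n},
\]
% i.e., the approximation error decreases at rate O(1/\sqrt{n})
% independently of the dimension of the space.
```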
“…The next theorem is a reformulation in terms of G-variation of the estimates derived for Hilbert spaces by Maurey, Jones, and Barron and of an extension of these estimates to L^p-spaces, p ∈ (1, ∞), derived by Darken et al. [28, Theorem 5]. For the proof, see the appendix.…”
mentioning
confidence: 92%
“…Also, a new branch of nonlinear approximation theory investigating approximation capabilities of neural networks was developed [11,12,21,28,38,45,50,51,52,53,54,55,56,57,58]. In a series of papers [3,5,8,9,10,64,65,66,80,81], a new method of approximate optimization was developed, called in [81] the extended Ritz method.…”
mentioning
confidence: 99%
“…In robust estimation, the advantages of norms different from quadratic are well known. This motivated the work in [37], [14], which established good rates of approximation when errors are measured in several L^p norms, as well as limitations of the "greedy" or "incremental" technique suggested by Jones in his study of neural nets and projection-pursuit estimation. The work in [37], [14] is based on ideas from the theory of stochastic processes on function spaces and on techniques related to moduli of smoothness in Banach spaces.…”
Section: Approximation Rates
mentioning
confidence: 97%
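The "greedy" or "incremental" technique mentioned in the excerpt above builds an n-term convex combination one element at a time, at each step mixing the current approximant with a single dictionary element. The sketch below is an illustrative implementation in that spirit (the function name `greedy_approximation` and the exhaustive search over the dictionary are my own choices, not the exact algorithm from [37], [14]):

```python
import numpy as np

def greedy_approximation(f, dictionary, n_terms):
    """Incrementally approximate f by convex updates
    f_k = (1 - a) f_{k-1} + a g, choosing at each step the
    dictionary element g and step size a that minimise the
    residual norm. Illustrative sketch of a Jones-style
    greedy/incremental scheme, not the cited algorithm."""
    fk = np.zeros_like(f, dtype=float)
    for _ in range(n_terms):
        best_err, best_g, best_a = np.inf, None, 0.0
        for g in dictionary:
            # Closed-form minimiser of ||f - ((1-a) fk + a g)||^2 in a,
            # clamped to [0, 1] so the combination stays convex.
            d = g - fk
            denom = d @ d
            a = (d @ (f - fk)) / denom if denom > 0 else 0.0
            a = min(max(a, 0.0), 1.0)
            cand = (1 - a) * fk + a * g
            err = np.linalg.norm(f - cand)
            if err < best_err:
                best_err, best_g, best_a = err, g, a
        fk = (1 - best_a) * fk + best_a * best_g
    return fk
```

Because the step size a = 0 (keeping the current approximant) is always admissible, the residual is non-increasing in the number of terms; for targets in the convex hull of the dictionary one observes the O(1/√n) decay predicted by the Hilbert-space estimates.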