Proceedings of 1994 Workshop on Information Theory and Statistics
DOI: 10.1109/wits.1994.513898
Minimum complexity regression estimation with weakly dependent observations

Cited by 28 publications (71 citation statements)
References 22 publications
“…This implies that $E|W_{m,n}| < \infty$; hence we now have from Lemma A.6 of Modha & Masry (1996) and from (48) that…”
Section: Proof (citation type: mentioning; confidence: 84%)
“…Implicitly, while establishing the rates of convergence results in Theorem 2.1 and Corollary 2.1, we assumed that these least-squares estimators can indeed be computed. Such an assumption is the very basis for applying Vapnik's empirical risk minimization theory to neural networks, and has been widely used in the literature dealing with rates of convergence results for neural networks and other models; see, for example, Barron (1994), Barron, Birgé, & Massart (1996), Breiman (1993), Haussler (1992), Kearns (1997), Lugosi & Nobel (1995), Lugosi & Zeger (1996), McCaffrey & Gallant (1994), Modha & Masry (1996, 1998), Vapnik (1982, 1995), and White (1989).…”
Section: Remark 2.3 (Computational Complexity of Nonlinear Least-Squares …) (citation type: mentioning; confidence: 99%)
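
The remark quoted above turns on a computational point: rate-of-convergence results for neural-network regression assume that the nonlinear least-squares (empirical risk minimizing) estimator can actually be computed, even though the underlying minimization is nonconvex. The sketch below is purely illustrative and is not taken from the cited papers; the single-hidden-layer sigmoid network, the AR(1) data-generating process (a stand-in for "weakly dependent observations"), and the crude numerical gradient descent are all assumptions chosen only to make that computational step concrete.

```python
# Illustrative sketch (not the cited authors' algorithm): empirical risk
# minimization by nonlinear least squares for a single-hidden-layer network.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def predict(params, x, k):
    """f(x) = sum_j c_j * sigmoid(a_j * x + b_j) + c0 (hypothetical model)."""
    a, b, c, c0 = params[:k], params[k:2 * k], params[2 * k:3 * k], params[3 * k]
    return sigmoid(np.outer(x, a) + b) @ c + c0

def empirical_risk(params, x, y, k):
    """Average squared error -- the quantity the least-squares estimator minimizes."""
    r = predict(params, x, k) - y
    return np.mean(r ** 2)

# Synthetic weakly dependent data: an AR(1) input driving a noisy target.
n, k = 500, 8
x = np.zeros(n)
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + rng.normal(scale=0.5)
y = np.sin(2.0 * x) + 0.1 * rng.normal(size=n)

# Crude finite-difference gradient descent on the empirical risk.  Solving this
# nonconvex problem is exactly the step the quoted remark says is assumed solvable.
params = 0.1 * rng.normal(size=3 * k + 1)
lr, eps = 0.05, 1e-6
for step in range(1000):
    base = empirical_risk(params, x, y, k)
    grad = np.zeros_like(params)
    for i in range(params.size):
        bumped = params.copy()
        bumped[i] += eps
        grad[i] = (empirical_risk(bumped, x, y, k) - base) / eps
    params -= lr * grad

print("final empirical risk:", empirical_risk(params, x, y, k))
```

In practice one would hand this minimization to an off-the-shelf nonlinear optimizer; the point of the sketch is simply that the estimator whose convergence rate is being analyzed is defined through a nonconvex optimization, which is why its computability has to be assumed rather than taken for granted.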