2018
DOI: 10.1002/sta4.186

Maximum regularized likelihood estimators: A general prediction theory and applications

Abstract: Maximum regularized likelihood estimators (MRLEs) are arguably the most established class of estimators in high-dimensional statistics. In this paper, we derive guarantees for MRLEs in the Kullback-Leibler divergence, a general measure of prediction accuracy. We assume only that the densities have a convex parametrization and that the regularization is definite and positive homogeneous. The results thus apply to a very large variety of models and estimators, such as tensor regression and graphical models with co…
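As a concrete illustration of the setting described in the abstract, the sketch below fits one member of the MRLE class: an l1-penalized logistic regression, whose negative log-likelihood is convex and whose regularizer is positive homogeneous. The data-generating setup, the regularization level lam, and the step-size rule are assumptions made only for this example; the paper's general theory is not tied to this particular model.

```python
# Minimal sketch of a maximum regularized likelihood estimator (MRLE):
# l1-penalized logistic regression fitted by proximal gradient descent.
# All names and settings are illustrative assumptions, not the paper's
# construction; the point is only that the objective combines a convex
# negative log-likelihood with a positive homogeneous regularizer.
import numpy as np

rng = np.random.default_rng(0)
n, p = 200, 50
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:5] = 1.0
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

lam = 0.05                                             # regularization level (assumed, not tuned)
step = 1.0 / (0.25 * np.linalg.norm(X, 2) ** 2 / n)    # crude Lipschitz-based step size

def neg_loglik_grad(beta):
    """Gradient of the average logistic negative log-likelihood."""
    probs = 1.0 / (1.0 + np.exp(-X @ beta))
    return X.T @ (probs - y) / n

beta = np.zeros(p)
for _ in range(500):
    # gradient step on the smooth likelihood part ...
    z = beta - step * neg_loglik_grad(beta)
    # ... followed by soft-thresholding, the proximal map of lam * ||.||_1
    beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)

print("nonzero coefficients:", np.flatnonzero(np.abs(beta) > 1e-6))
```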

Citations: cited by 9 publications (5 citation statements)
References: 55 publications
“…The prediction consistency theory has been well-established for non-parametric and high-dimensional statistics; see Zhuang and Lederer (2018) for the recent development of general regularized maximum likelihood estimators. However, their works mainly aim for non-parametric or high-dimensional models and do not cover the semi-parametric case as studied in our paper.…”
Section: Results (citation type: mentioning; confidence: 99%)
“…The prediction consistency theory has been well-established for estimators in non-parametric and high-dimensional statistics; see Zhuang and Lederer (2018) for the recent development for general regularized maximum likelihood estimators. However, their works specifically aim at non-parametric or high-dimensional models; they do not cover the semi-parametric case as studied in our paper.…”
Section: Results (citation type: mentioning; confidence: 99%)
“…on its off-diagonal. And while there is much theory on the properties of the graphical lasso (Rothman et al., 2008; Ravikumar et al., 2008; Jankova and van de Geer, 2018), as well as general theory for graphical models (Zhuang and Lederer, 2018), there is no theory that includes the practical choice of r.…”
Section: Brief Review of Gaussian Graphical Models (citation type: mentioning; confidence: 99%)
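To make the quoted point concrete, the sketch below fits a graphical lasso with a hand-picked regularization level using scikit-learn's GraphicalLasso. The dimension, sample size, and the value alpha=0.1 are illustrative assumptions; picking such a value ad hoc is exactly the practical tuning-parameter choice the quoted passage says existing theory does not cover.

```python
# Minimal sketch of the graphical lasso with a hand-picked regularization
# level. The sparse precision matrix, sample size, and alpha=0.1 are
# assumptions made for illustration only.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(1)

# Sample from a 5-dimensional Gaussian with a sparse precision matrix.
precision = np.eye(5)
precision[0, 1] = precision[1, 0] = 0.4
cov = np.linalg.inv(precision)
X = rng.multivariate_normal(np.zeros(5), cov, size=500)

model = GraphicalLasso(alpha=0.1).fit(X)   # alpha = regularization strength
print(np.round(model.precision_, 2))       # estimated sparse precision matrix
```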