2011
DOI: 10.1214/10-aos853
Approximation by log-concave distributions, with applications to regression

Abstract: We study the approximation of arbitrary distributions P on d-dimensional space by distributions with log-concave density. Approximation means minimizing a Kullback-Leibler-type functional. We show that such an approximation exists if and only if P has finite first moments and is not supported by some hyperplane. Furthermore we show that this approximation depends continuously on P with respect to Mallows distance D1(·, ·). This result implies consistency of the maximum likelihood estimator of a log-concave den…
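The continuity result above is stated with respect to the Mallows distance D1. As an illustration not taken from the paper, for two samples of equal size the empirical D1 (Wasserstein-1) distance reduces to the mean absolute difference of the sorted values; a minimal sketch:

```python
import numpy as np

def mallows_d1(x, y):
    """Empirical Mallows (Wasserstein-1) distance between two
    equal-size samples: mean absolute gap between order statistics."""
    x = np.sort(np.asarray(x, dtype=float))
    y = np.sort(np.asarray(y, dtype=float))
    assert x.shape == y.shape, "samples must have equal size"
    return float(np.mean(np.abs(x - y)))

x = np.array([0.0, 1.0, 2.0, 3.0])
print(mallows_d1(x, x))        # identical samples: distance 0
print(mallows_d1(x, x + 0.5))  # a shift by c gives distance |c|
```

The function name `mallows_d1` is ours; the paper works with the population version of this distance, of which the above is the natural empirical analogue.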

Help me understand this report
View preprint versions

Search citation statements

Order By: Relevance

Cited by 93 publications (155 citation statements)
References 23 publications
“…For log-concave densities on ℝ it has been explored in more detail by Dümbgen and Rufibach [2009] and Balabdaoui, Rufibach and Wellner [2009], and recent results for estimation of log-concave densities on ℝ^d have been obtained by Cule and Samworth [2010], Cule, Samworth and Stewart [2010], and Dümbgen, Samworth and Schuhmacher [2011]. Cule, Samworth and Stewart [2010] formulate the computation of the maximum likelihood estimator of a multidimensional log-concave density as a non-differentiable convex optimization problem and propose an algorithm that combines techniques of computational geometry with Shor's r-algorithm to produce a sequence that converges to the estimator.…”
Section: Some Open Problems and Further Connections With Log-concamentioning
confidence: 99%
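The statement above casts log-concave maximum likelihood as an optimization problem. As a hedged toy stand-in (not the nonparametric estimator of Cule, Samworth and Stewart, and nothing like Shor's r-algorithm), one can maximize the average log-likelihood over a small log-concave parametric family, here the Gaussians, by plain gradient descent; the optimum recovers the sample mean and the MLE standard deviation:

```python
import numpy as np

rng = np.random.default_rng(0)
sample = rng.normal(loc=2.0, scale=1.5, size=500)

# Negative average Gaussian log-likelihood, parameterized by
# (mu, log_s) so that the optimization is unconstrained.
def neg_loglik(mu, log_s):
    s2 = np.exp(2 * log_s)
    return 0.5 * np.log(2 * np.pi) + log_s + np.mean((sample - mu) ** 2) / (2 * s2)

def grad(mu, log_s):
    s2 = np.exp(2 * log_s)
    d = sample - mu
    return -np.mean(d) / s2, 1.0 - np.mean(d ** 2) / s2

mu, log_s = 0.0, 0.0
for _ in range(2000):           # fixed-step gradient descent
    dmu, dlogs = grad(mu, log_s)
    mu -= 0.1 * dmu
    log_s -= 0.1 * dlogs

mu_hat, s_hat = mu, np.exp(log_s)
print(mu_hat, s_hat)  # close to sample.mean() and sample.std()
```

The actual multivariate estimator optimizes over all piecewise-linear concave log-densities supported on the convex hull of the data, which this parametric sketch does not attempt.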
“…Specifically, the epi-splines computed from $\{(P^n_{p,m})\}_{n=1}^{\infty}$ tend to a point in the Kullback-Leibler projection, relative to the soft information constraint set, of the true density on the class of epi-splines under consideration. We refer to [24, 14, 44] for related results on model misspecification. The second part shows that if the true density is not excluded by the soft information, then $\{(P^n_{p,m})\}_{n=1}^{\infty}$ eventually yields the true density, or possibly a closely related one that deviates at most on $m$.…”
Section: Theorem (Consistency)mentioning
confidence: 99%
“…Log-concave densities have many favorable properties, as described by Balabdaoui et al. (2009). To estimate the log-density $\phi(x)$, Dümbgen et al. (2011) proposed an estimator maximizing a log-likelihood-type functional
$$L(\phi, Q) = \int \phi \, dQ - \int \exp\{\phi(x)\}\, dx + 1,$$
where $Q \in \mathcal{Q}$, $\mathcal{Q}$ is the family of all $d$-dimensional distributions, $\phi \in \Phi$, and $\Phi$ is the family of all concave functions. For linear regression with log-concave error density, Dümbgen et al. (2011) proposed an estimator maximizing
$$\hat L(\phi, \beta, Q) = \frac{1}{n}\sum_{i=1}^{n} \phi\bigl(y_i - x_i^T \beta\bigr) - \int \exp\{\phi(x)\}\, dx + 1.$$…”
Section: Introductionmentioning
confidence: 99%
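The functional quoted above is easy to evaluate numerically. A minimal sketch (ours, not the cited authors' code) computes $L(\phi, Q_n)$ for the empirical distribution $Q_n$ and a candidate concave $\phi$; when $\exp\{\phi\}$ integrates to one, the last two terms cancel and $L$ reduces to the average log-likelihood:

```python
import numpy as np

def functional_L(phi, sample, grid):
    """L(phi, Q_n) = ∫phi dQ_n − ∫exp{phi(x)}dx + 1, with Q_n the
    empirical distribution and the integral taken on a uniform grid."""
    term1 = np.mean(phi(sample))              # ∫ phi dQ_n
    dx = grid[1] - grid[0]
    term2 = np.sum(np.exp(phi(grid))) * dx    # ∫ exp{phi} dx (Riemann sum)
    return term1 - term2 + 1.0

# Candidate: the concave log-density of N(0, 1), a log-concave density.
phi = lambda x: -0.5 * np.log(2 * np.pi) - 0.5 * x ** 2

rng = np.random.default_rng(1)
sample = rng.normal(size=1000)
grid = np.linspace(-10.0, 10.0, 20001)

val = functional_L(phi, sample, grid)
print(val)  # ≈ average log-likelihood, since ∫exp{phi} = 1 here
```

Maximizing this functional over all concave $\phi$ (rather than evaluating it at a fixed candidate, as here) is the nontrivial step addressed by the algorithms discussed in the citation statements above.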