2009
DOI: 10.1016/j.jmva.2008.03.011

Parametric estimation and tests through divergences and the duality technique

Abstract: We introduce estimation and test procedures through divergence optimization for discrete or continuous parametric models. This approach is based on a new dual representation for divergences. We treat point estimation and tests for simple and composite hypotheses, extending the maximum likelihood technique. Another view of the maximum likelihood approach, for estimation and tests, is given. We prove existence and consistency of the proposed estimates. The limit laws of the estimates and test stat…

Cited by 79 publications (108 citation statements)
References 24 publications

“…For example, the choice of relative entropy has been investigated by DiCiccio and Romano [16], Jing and Wood [27] and led to "Entropy econometrics" in the econometric field (see Golan et al [22]). Related results may be found in the probabilistic literature about divergence or the method of entropy in mean (see Broniatowski and Kéziou [11], Csiszár [14], Gamboa and Gassiat [21], Léonard [29], Liese and Vajda [30]). Some generalizations of the empirical likelihood method have also been obtained by using Cressie-Read discrepancies (see Baggerly [2], Corcoran [12]) and led to some econometric extensions known as "generalized empirical likelihood" (see Newey and Smith [33]), even if the "likelihood" properties and in particular the Bartlett-correctability in these cases are lost (see Jing and Wood [27]).…”
Section: Introduction (mentioning)
confidence: 83%
“…For general ϕ*-discrepancies, the following duality representation is a consequence of results of Borwein and Lewis [10] on convex functional integrals (see also Broniatowski and Kéziou [11], Léonard [29]). We denote by λ^T the transposed vector of λ.…”
Section: Notation: ϕ*-Discrepancies and Convex Duality (mentioning)
confidence: 99%
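
For context, the dual (Fenchel-Legendre) representation alluded to here takes, in its standard form, the shape below (our notation, supplied for the reader, not a quotation from the paper):

\[
D_\phi(Q, P) \;=\; \sup_{f \in \mathcal{F}} \left\{ \int f \, dQ - \int \phi^{*}(f) \, dP \right\},
\qquad
\phi^{*}(t) := \sup_{x} \{ t x - \phi(x) \},
\]

where \(\phi^{*}\) is the convex conjugate of \(\phi\) and \(\mathcal{F}\) is a suitably rich class of measurable functions. This is the representation that allows \(D_\phi\) to be estimated by plugging the empirical measure in for \(Q\), without any partitioning or smoothing.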
“…We have substituted for the variational problem Φ(Ω) := inf(Φ(ω), ω ∈ Ω) a much simpler Monte Carlo approximation, defined by (3). Notice further that we do not need to identify the set of points ω in Ω which minimize Φ; indeed, such points may not even exist.…”
Section: The Scope of This Paper (mentioning)
confidence: 99%
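
As an illustration of this kind of Monte Carlo approximation of an infimum, here is a minimal sketch (our own example; the objective, the sampler over Ω, and the sample size are hypothetical placeholders, not taken from the paper):

    import random

    def monte_carlo_inf(phi, sample_omega, n=10_000):
        # Approximate inf{phi(omega) : omega in Omega} by the minimum of
        # phi over n points drawn independently at random from Omega.
        return min(phi(sample_omega()) for _ in range(n))

    # Example: approximate inf{(x - 1)^2 : x in [0, 2]}; the true infimum is 0.
    approx = monte_carlo_inf(lambda x: (x - 1.0) ** 2,
                             lambda: random.uniform(0.0, 2.0))
    print(approx)  # close to 0 for large n

Note that the sketch returns only an approximate value of the infimum, never a minimizing point, matching the remark that minimizers need not even exist.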
“…The modified Kullback-Leibler divergence (KL_m-divergence) is sometimes called Burg relative entropy. It is frequently used in statistics and leads to efficient methods in statistical estimation and testing problems; in fact, the celebrated "maximum likelihood" method can be seen as an optimization problem for the KL_m-divergence between the discrete or continuous parametric model and the empirical measure associated with the data; see [26] and [3]. On the other hand, the recent "empirical likelihood" method can also be seen as an optimization problem for the KL_m-divergence between some set of measures satisfying linear constraints and the empirical measure associated with the data; see [30] and the references therein, [18] and [4].…”
Section: Divergences (mentioning)
confidence: 99%
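
To make the quoted connection concrete, here is the standard form of the modified Kullback-Leibler divergence and its link to maximum likelihood (our notation, supplied for the reader rather than quoted from the paper):

\[
KL_m(Q, P) \;=\; -\int \log\!\left(\frac{dQ}{dP}\right) dP,
\]

which is the ϕ-divergence associated with \(\phi(x) = -\log x + x - 1\). Taking \(Q = P_\theta\) (the model, with density \(p_\theta\)) and \(P = P_n\) (the empirical measure of a sample \(X_1, \dots, X_n\)), minimizing \(\theta \mapsto KL_m(P_\theta, P_n)\) through its dual representation formally reduces to maximizing the log-likelihood \(\frac{1}{n}\sum_{i=1}^n \log p_\theta(X_i)\), i.e. to computing the maximum likelihood estimator.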
“…The duality technique has been used by Broniatowski (2003) in order to estimate the Kullback-Leibler divergence without making use of any partitioning or smoothing. It has also been used by Keziou (2003) and Broniatowski and Keziou (2003) in order to estimate φ-divergences between probability measures (without smoothing), and to introduce a new class of estimates and test statistics for discrete or continuous parametric models extending the maximum likelihood approach; the use of the duality technique in the context of φ-divergences also makes it possible to study the asymptotic properties of the test statistics (including the likelihood ratio statistic) both under the null and the alternative hypotheses. Recall that a φ-divergence between two probability measures Q and P, when Q is absolutely continuous with respect to P, is defined by…”
Section: Comparison of Two Populations (mentioning)
confidence: 99%
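
The quotation is truncated at the definition; the standard formula it refers to (a well-known definition, supplied here for completeness) is

\[
D_\phi(Q, P) \;=\; \int \phi\!\left(\frac{dQ}{dP}\right) dP,
\]

where \(\phi\) is a convex function satisfying \(\phi(1) = 0\) and \(dQ/dP\) is the Radon-Nikodym derivative of \(Q\) with respect to \(P\).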