Abstract. We introduce estimation and test procedures through divergence optimization for discrete or continuous parametric models. This approach is based on a new dual representation for divergences. We treat point estimation and tests for simple and composite hypotheses, extending the maximum likelihood technique, and give another view of the maximum likelihood approach for estimation and tests. We prove existence and consistency of the proposed estimates. The limit laws of the estimates and test statistics (including the generalized likelihood ratio one) are given under both the null and the alternative hypotheses, and approximations of the power functions are deduced. A new procedure for constructing confidence regions, when the parameter may be a boundary value of the parameter space, is proposed. We also give a solution to the irregularity problem of the generalized likelihood ratio test pertaining to the number of components in a mixture, and propose a new test based on the χ²-divergence on signed finite measures and the duality technique.
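For orientation, duality arguments of this kind rest on a variational (Fenchel-Legendre) form of a φ-divergence. The sketch below uses generic notation (D_φ, the function class 𝓕, the convex conjugate φ*) and states only a standard identity, valid under suitable conditions on the convex function φ; it is not a reproduction of the paper's own dual representation, which is tailored to the parametric model.

\[
D_\varphi(Q,P) \;=\; \int \varphi\!\Big(\frac{dQ}{dP}\Big)\,dP
\;=\; \sup_{f \in \mathcal{F}} \Big\{ \int f\,dQ \;-\; \int \varphi^{*}(f)\,dP \Big\},
\qquad
\varphi^{*}(t) \;=\; \sup_{x}\{\,tx - \varphi(x)\,\},
\]

where the supremum runs over a sufficiently rich class \(\mathcal{F}\) of measurable functions. Replacing \(P\) by the empirical measure of a sample turns the right-hand side into a criterion that can be computed from the data without any smoothing, which is what makes divergence-based estimation and testing feasible for continuous models.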
Abstract. We consider the minimization problem of φ-divergences between a given probability measure P and subsets Ω of the vector space M_F of all signed finite measures which integrate a given class F of bounded or unbounded measurable functions. The vector space M_F is endowed with the weak topology induced by the class F ∪ B_b, where B_b is the class of all bounded measurable functions. We treat the problems of existence and characterization of the φ-projections of P on Ω. We also consider the dual equality and the dual attainment problems when Ω is defined by linear constraints.
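A hedged sketch of the linearly constrained case, in generic notation (the constraint functions g_j, values a_j, and the convex conjugate φ* are illustrative and not taken from the paper): with Ω = {Q ∈ M_F : ∫ g_j dQ = a_j, j = 1, …, k}, the dual equality takes, under constraint-qualification conditions left unstated here, the familiar convex-duality form

\[
\inf_{Q \in \Omega} D_\varphi(Q,P)
\;=\;
\sup_{\lambda \in \mathbb{R}^{k}} \Big\{ \sum_{j=1}^{k} \lambda_j a_j \;-\; \int \varphi^{*}\Big( \sum_{j=1}^{k} \lambda_j g_j(x) \Big)\, dP(x) \Big\}.
\]

Dual attainment means that some \(\lambda\) achieves the supremum; under regularity, the φ-projection \(Q^{*}\) of \(P\) on \(\Omega\) is then characterized by \(dQ^{*}/dP = (\varphi^{*})'\big(\sum_{j} \lambda_j g_j\big)\).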
The aim of this paper is to introduce new statistical criteria for estimation, suitable for inference in models with common continuous support. This proposal is in the direct line of a renewed interest in divergence-based inference tools embedding the most classical ones, such as maximum likelihood, chi-square, or Kullback-Leibler. General pseudodistances with a decomposable structure are considered; they allow defining minimum pseudodistance estimators without using nonparametric density estimators. A special class of pseudodistances indexed by α > 0, which recovers the Kullback-Leibler divergence as α ↓ 0, is presented in detail. Corresponding estimation criteria are developed and asymptotic properties are studied. The estimation method is then extended to regression models. Finally, some examples based on Monte Carlo simulations are discussed.
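As a hedged, runnable illustration of how a decomposable criterion of this kind is minimized in practice, the sketch below uses the density power divergence of Basu et al. (1998) as a stand-in for the α-indexed pseudodistance family of the paper, whose exact form is not reproduced here. The point it demonstrates is shared by both: the empirical criterion involves only the parametric densities and the sample, so no nonparametric density estimate is required, and (up to centering) the limit α ↓ 0 gives back the maximum likelihood criterion.

```python
# Minimal sketch (illustrative only): minimum density power divergence
# estimation of a normal model, as a stand-in for a decomposable
# minimum pseudodistance criterion indexed by alpha > 0.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_criterion(params, x, alpha):
    """Empirical criterion  ∫ f_theta^(1+alpha) dλ  -  (1 + 1/alpha) * mean(f_theta(X_i)^alpha)."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # reparametrize to keep sigma > 0
    # Closed-form model term: ∫ f^(1+alpha) dλ for the N(mu, sigma^2) density.
    model_term = (2.0 * np.pi * sigma**2) ** (-alpha / 2.0) / np.sqrt(1.0 + alpha)
    data_term = np.mean(norm.pdf(x, mu, sigma) ** alpha)
    return model_term - (1.0 + 1.0 / alpha) * data_term

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 950),   # N(0, 1) bulk
                    rng.normal(8.0, 1.0, 50)])   # 5% gross outliers

for alpha in (0.1, 0.5):
    res = minimize(dpd_criterion, x0=[np.median(x), 0.0], args=(x, alpha))
    print(f"alpha={alpha}: mu_hat={res.x[0]:.3f}, sigma_hat={np.exp(res.x[1]):.3f}")

print(f"MLE (non-robust): mu_hat={x.mean():.3f}, sigma_hat={x.std():.3f}")
```

Larger α typically trades some asymptotic efficiency for robustness to the contaminating component; the comparison above is only meant to illustrate the kind of Monte Carlo experiment such criteria lend themselves to.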