Abstract: Estimators derived from a divergence criterion, such as ϕ-divergences, are generally more robust than maximum likelihood estimators. We are particularly interested in the so-called minimum dual ϕ-divergence estimator (MDϕDE), an estimator built upon a dual representation of ϕ-divergences. In this paper we present an iterative proximal point algorithm for computing such an estimator. By construction, the algorithm contains the well-known Expectation Maximization (EM) algorithm as a special case. Our work builds on Tseng's paper on the likelihood function, and we establish convergence properties by adapting Tseng's ideas. We improve upon Tseng's results by relaxing the identifiability condition on the proximal term, a condition which fails for most mixture models and is hard to verify for non-mixture ones. Convergence of the EM algorithm for a two-component Gaussian mixture is discussed in the spirit of our approach. Several experimental results on mixture models confirm the validity of the approach.