In this paper, we introduced the infinite continuous mixture of Dirichlet distributions as a generalization of the infinite mixture of Dirichlet distributions, in order to avoid the limitation of choosing the a priori sample size for the expectation a posteriori estimator. Since this posterior mixture is analytically intractable, Monte-Carlo sampling was used to approximate it. A new parametrization of the proposed distribution was derived. We then suggested a mixture expectation a posteriori estimator of the unknown parameters. The proposed estimator solves the problem of constructing a Bayesian estimator of proportions without specifying particular parameters or a sample size for the prior knowledge. Some asymptotic properties of this estimator were derived, specifically its bias and variance. The consistency and asymptotic normality of the estimator were also established as the sample size tends to infinity, and its credible interval was determined. The performance of the proposed estimator was illustrated both theoretically and by means of a simulation study. Finally, a comparative simulation study was conducted among the learned estimates: the proposed mixture expectation a posteriori estimator, the standard Bayesian estimator, the maximum likelihood estimator, and the Jeffreys estimator. From this simulation, we concluded that the prior infinite mixture of Dirichlet distributions offers higher accuracy and flexibility for modeling and learning data.
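To make the Monte-Carlo construction concrete, the following is a minimal sketch (not the paper's exact procedure) of a mixture expectation a posteriori estimate of multinomial proportions: prior Dirichlet parameters are drawn from a continuous hyper-distribution, and each draw's posterior mean is averaged with weights proportional to the Dirichlet-multinomial marginal likelihood of the data under that prior. The componentwise exponential hyperprior and the function names `log_marginal` and `mixture_eap` are illustrative assumptions, not from the paper.

```python
import numpy as np
from scipy.special import gammaln

def log_marginal(x, alpha):
    # Dirichlet-multinomial log marginal likelihood of counts x under prior alpha.
    # The multinomial coefficient is omitted: it is constant across prior draws
    # and cancels when the weights are normalized.
    n, A = x.sum(), alpha.sum()
    return gammaln(A) - gammaln(A + n) + np.sum(gammaln(alpha + x) - gammaln(alpha))

def mixture_eap(x, n_draws=5000, seed=0):
    """Monte-Carlo mixture expectation a posteriori estimate of proportions.

    Sketch only: prior vectors alpha^(j) are sampled from an assumed
    exponential hyper-distribution; each posterior mean
    (x + alpha) / (n + sum(alpha)) is weighted by the marginal likelihood
    of the data under that prior.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    alphas = rng.exponential(scale=1.0, size=(n_draws, x.size))  # assumed hyperprior
    logw = np.array([log_marginal(x, a) for a in alphas])
    w = np.exp(logw - logw.max())          # stabilized importance weights
    w /= w.sum()
    post_means = (x + alphas) / (x.sum() + alphas.sum(axis=1, keepdims=True))
    return w @ post_means                  # weighted average of posterior means

counts = np.array([8, 3, 1])
estimate = mixture_eap(counts)
```

Because each draw's posterior mean already sums to one, the weighted average is itself a valid proportion vector, and no pilot choice of a prior sample size is required: the data reweight the sampled priors through their marginal likelihoods.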