In various environmental contexts, the distribution of unobserved random vectors X_i must be estimated from noisy indirect observations H(X_i) + U_i. When the relation between X_i and the quantity H(X_i), measured with error U_i, is implemented by a CPU-consuming computer model H, a major practical difficulty is to perform the statistical inference with a relatively small number of runs of H. A Bayesian statistical framework is adopted to make use of possible prior knowledge on the parameters of the distribution of the X_i, which is assumed Gaussian. A Markov chain Monte Carlo algorithm is then used to estimate their posterior distribution, with H replaced by a kriging metamodel built from a limited number of simulated experiments. Two heuristics, involving two different criteria to be optimized, are proposed to sequentially design these computer experiments within a given computational budget. The first criterion is a weighted integrated mean square error. The second, called expected conditional divergence and developed in the spirit of the stepwise uncertainty reduction criterion, is based on the discrepancy between two consecutive approximations of the target posterior distribution. Several numerical comparisons, conducted on a toy example and then on a motivating real hydraulic case study, show that such adaptive designs can significantly outperform the classical choice of a maximin Latin hypercube design of experiments. Addressing a major concern in hydraulic engineering, particular emphasis is placed on the prior elicitation for the case study, highlighting the overall feasibility of the methodology. Faster convergence and ease of implementation lead us to recommend the expected conditional divergence criterion for practical applications.
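The abstract only summarizes the approach; as a rough illustration of the general idea (an expensive simulator replaced by a kriging metamodel inside an MCMC targeting the Gaussian parameters of the unobserved X_i), a minimal self-contained sketch follows. It is not the paper's implementation and omits the sequential design criteria; the toy simulator H_true, the priors, the design grid, and all tuning constants are illustrative assumptions.

```python
# Sketch only: kriging (GP) metamodel of an "expensive" simulator H, plugged into a
# random-walk Metropolis sampler over (mu, log sigma, X_1..X_n) for the model
# y_i = H(X_i) + U_i,  X_i ~ N(mu, sigma^2),  U_i ~ N(0, tau^2).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def H_true(x):                      # toy stand-in for the CPU-consuming computer model
    return x * np.sin(x)

# Synthetic data: unobserved X_i ~ N(mu*, sigma*^2), observed y_i = H(X_i) + U_i
mu_star, sigma_star, tau = 1.0, 0.3, 0.1
n = 15
x_latent = rng.normal(mu_star, sigma_star, n)
y = H_true(x_latent) + rng.normal(0.0, tau, n)

# Kriging metamodel built from a small, fixed design (the "limited budget" of runs of H);
# the paper's sequential design criteria would choose these points adaptively instead.
design = np.linspace(-1.0, 3.0, 8).reshape(-1, 1)
gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(), normalize_y=True)
gp.fit(design, H_true(design).ravel())
H_hat = lambda x: gp.predict(np.atleast_1d(x).reshape(-1, 1))

def log_post(mu, log_sig, x):
    """Unnormalized log posterior with illustrative priors mu ~ N(1, 2^2), log sigma ~ N(0, 1)."""
    sig = np.exp(log_sig)
    lp = -0.5 * (mu - 1.0) ** 2 / 4.0 - 0.5 * log_sig ** 2
    lp += np.sum(-0.5 * ((x - mu) / sig) ** 2 - np.log(sig))   # X_i | mu, sigma
    lp += np.sum(-0.5 * ((y - H_hat(x)) / tau) ** 2)           # y_i | X_i, via metamodel
    return lp

# Joint random-walk Metropolis over (mu, log sigma, X_1..X_n)
mu, log_sig, x = 1.0, 0.0, np.full(n, 1.0)
cur, chain = log_post(mu, log_sig, x), []
for _ in range(5000):
    mu_p, ls_p = mu + 0.1 * rng.normal(), log_sig + 0.1 * rng.normal()
    x_p = x + 0.1 * rng.normal(size=n)
    prop = log_post(mu_p, ls_p, x_p)
    if np.log(rng.uniform()) < prop - cur:
        mu, log_sig, x, cur = mu_p, ls_p, x_p, prop
    chain.append((mu, np.exp(log_sig)))

chain = np.array(chain[1000:])
print("posterior mean of (mu, sigma):", chain.mean(axis=0))
```

In this sketch every likelihood evaluation calls the cheap GP predictor rather than the simulator, which is the point of the metamodelling step; the quality of the resulting posterior then hinges on where the few true runs of H are placed, which is what the two proposed design criteria address.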