The constrained minimization (respectively maximization) of directed distances and of related generalized entropies is a fundamental task in information theory as well as in the adjacent fields of statistics, machine learning, artificial intelligence, signal processing and pattern recognition. In our previous paper "A precise bare simulation approach to the minimization of some distances. I. Foundations", we obtained such constrained optima by a new dimension-free precise bare (pure) simulation method, provided essentially that (i) the underlying directed distance is of f-divergence type, and that (ii) it can be connected to a light-tailed probability distribution in a certain manner. In the present paper, we extend this approach so that constrained optimization problems for a very broad class of directed distances and generalized entropies, and beyond, can be tackled by a newly developed dimension-free extended bare simulation method, which yields both the optima and the optimizers. Within our discrete setup of arbitrary dimension, almost no assumptions (such as convexity) on the set of constraints are needed, and our method is precise (i.e., it converges in the limit). For instance, we cover constrained optimizations of arbitrary f-divergences, Bregman distances, scaled Bregman distances and weighted ℓr-distances. The potential for widespread applicability is indicated as well; in particular, we provide many recent references for uses of the involved distances/divergences in various research fields (which may also serve as an interdisciplinary interface).