Summary
Any knowledge extraction relies, possibly implicitly, on a hypothesis about the dependence structure of the modelled data. The extracted knowledge ultimately serves decision-making (DM). DM always faces uncertainty, which makes probabilistic modelling appropriate. The black-box modelling inspected here deals with "universal" approximators of the relevant probabilistic model. Finite mixtures with components in the exponential family are often exploited. Their attractiveness stems from their flexibility, the cluster interpretability of their components, and the existence of algorithms for processing high-dimensional data streams. They are even used in dynamic cases with mutually dependent data records, where regression and autoregression mixture components model the dependence. These dynamic models, however, mostly assume data-independent component weights, that is, memoryless transitions between dynamic mixture components. Such mixtures are not universal approximators of dynamic probabilistic models. Formally, this follows from the fact that the set of finite probabilistic mixtures is not closed under conditioning, which is the key operation in estimation and prediction. The paper overcomes this drawback by using ratios of finite mixtures as universally approximating dynamic parametric models. It motivates them, elaborates their approximate Bayesian recursive estimation, and reveals their application potential.
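As an illustrative sketch of the non-closure claim (with notation chosen here rather than taken from the paper), consider a joint finite mixture of C components f_c with data-independent weights \alpha_c over the current data record d_t and the vector \psi_t of past data. Conditioning on \psi_t gives

\[
f(d_t \mid \psi_t)
= \frac{\sum_{c=1}^{C} \alpha_c\, f_c(d_t, \psi_t)}{\sum_{c=1}^{C} \alpha_c\, f_c(\psi_t)}
= \sum_{c=1}^{C} w_c(\psi_t)\, f_c(d_t \mid \psi_t),
\qquad
w_c(\psi_t) = \frac{\alpha_c\, f_c(\psi_t)}{\sum_{j=1}^{C} \alpha_j\, f_j(\psi_t)}.
\]

The resulting weights w_c(\psi_t) depend on the conditioning data, so the conditional density is a ratio of finite mixtures rather than a finite mixture with data-independent weights. This is the non-closure under conditioning that motivates the ratio-of-mixtures model class considered in the paper.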