The q-Gaussian distribution is known to be an attractor of certain correlated systems, and is the distribution which, under appropriate constraints, maximizes the entropy S_q, the basis of nonextensive statistical mechanics. This theory is postulated as a natural extension of standard (Boltzmann-Gibbs) statistical mechanics, and may explain the ubiquitous appearance of heavy-tailed distributions in both natural and man-made systems. The q-Gaussian distribution is also used as a numerical tool, for example as a visiting distribution in Generalized Simulated Annealing. We develop and present a simple, easy-to-implement numerical method for generating random deviates from a q-Gaussian distribution, based upon a generalization of the well-known Box-Müller method. Our method is suitable for a larger range of q values, −∞ < q < 3, than has previously appeared in the literature, and can generate deviates from q-Gaussian distributions of arbitrary width and center. MATLAB code showing a straightforward implementation is also included.
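A minimal Python sketch of this generalized Box-Müller transform is given below (the paper itself provides MATLAB; the function names, the use of NumPy, and the location-scale shift for arbitrary center and width are assumptions of this sketch, not the authors' code). The idea is to replace the logarithm in the classical transform with the q-logarithm evaluated at the dual index q' = (1 + q)/(3 − q):

```python
import numpy as np

def log_q(x, q):
    """q-logarithm: ln_q(x) = (x**(1 - q) - 1) / (1 - q), reducing to ln(x) as q -> 1."""
    if np.isclose(q, 1.0):
        return np.log(x)
    return (np.power(x, 1.0 - q) - 1.0) / (1.0 - q)

def q_gaussian(q, mu=0.0, sigma=1.0, size=1, rng=None):
    """Draw q-Gaussian deviates (q < 3) via the generalized Box-Muller transform."""
    if q >= 3:
        raise ValueError("q-Gaussian deviates require q < 3")
    rng = np.random.default_rng() if rng is None else rng
    q_prime = (1.0 + q) / (3.0 - q)              # dual index used inside the transform
    u1, u2 = rng.random(size), rng.random(size)  # two independent uniforms on (0, 1)
    z = np.sqrt(-2.0 * log_q(u1, q_prime)) * np.cos(2.0 * np.pi * u2)
    return mu + sigma * z                        # shift/scale for arbitrary center and width
```

For example, q_gaussian(1.7, mu=2.0, sigma=0.5, size=10_000) draws ten thousand deviates centered at 2 with scale 0.5; setting q = 1 recovers the classical Box-Müller Gaussian.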
We study a strictly scale-invariant probabilistic N-body model with symmetric, uniform, identically distributed random variables. Correlations are induced through a transformation of a multivariate Gaussian distribution with covariance matrix decaying away from the unit diagonal as ρ/r^α for r = 1, 2, …, N−1, where r indicates displacement from the diagonal, 0 ≤ ρ ≤ 1, and α ≥ 0. We show numerically that the sum of the N dependent random variables is well modeled by a compact-support q-Gaussian distribution. In the particular case α = 0 we obtain q = (1 − 5ρ/3)/(1 − ρ), a result validated analytically in a recent paper by Hilhorst and Schehr. Our present results with these q-Gaussian approximants precisely mimic the behavior expected in the frame of non-extensive statistical mechanics. The fact that the N → ∞ limiting distributions are not exactly, but only approximately, q-Gaussians suggests that the present system is not exactly, but only approximately, q-independent in the sense of the q-generalized central limit theorem of Umarov, Steinberg and Tsallis. Short-range (α > 1) and long-range (α < 1) interactions are discussed. Fitted parameters are obtained via a method-of-moments approach. Simple mechanisms which lead to the production of q-Gaussians, such as mixing, are also discussed.
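A rough Python sketch of one plausible reading of this construction follows; the function name, the probit mapping used to obtain uniform marginals, and the sample size are assumptions of this sketch rather than the authors' procedure. It builds the banded covariance, draws correlated Gaussians, maps each marginal to a symmetric uniform, and returns the N-variable sums, whose histogram can then be compared with a q-Gaussian:

```python
import numpy as np
from scipy.special import erf

def correlated_uniform_sums(N, rho, alpha, n_samples=100_000, rng=None):
    """Sum N correlated symmetric uniform variables built from a banded-covariance Gaussian."""
    rng = np.random.default_rng() if rng is None else rng
    # Covariance with unit diagonal and off-diagonal entries decaying as rho / r**alpha.
    r = np.abs(np.subtract.outer(np.arange(N), np.arange(N)))
    cov = np.where(r == 0, 1.0, rho / np.maximum(r, 1) ** alpha)
    # The matrix must be positive semi-definite for the draw below to succeed.
    x = rng.multivariate_normal(np.zeros(N), cov, size=n_samples)
    # Probit map: each Gaussian marginal becomes uniform on (0, 1), then is centered on (-1/2, 1/2).
    u = 0.5 * (1.0 + erf(x / np.sqrt(2.0))) - 0.5
    return u.sum(axis=1)

# e.g. sums = correlated_uniform_sums(N=100, rho=0.7, alpha=0.0)
# then histogram the sums and compare against a q-Gaussian with q = (1 - 5*rho/3) / (1 - rho)
```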
By considering a nonlinear combination of the probabilities of a system, a physical interpretation of Tsallis statistics as representing the nonlinear coupling or decoupling of statistical states is proposed. The escort probability is interpreted as the coupled probability, with Gaussian distributions. This conjugate relationship has been used to extend the generalized Fourier transform to the compact-support domain and to define a scale-invariant correlation structure with heavy-tail limit distribution. In the present paper, we show that the conjugate is a mapping between the source of nonlinearity in non-stationary stochastic processes and the nonlinear coupling which defines the coupled-Gaussian limit distribution. The effects of additive and multiplicative noise are shown to be separable into the coupled-variance and the coupling parameter Q, providing further evidence of the importance of the generalized moments.
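The escort (coupled) probability referred to above is the standard Tsallis escort distribution, P_i = p_i^q / Σ_j p_j^q. A small illustrative Python snippet (the function name is an assumption of this sketch):

```python
import numpy as np

def escort(p, q):
    """Escort (coupled) distribution: P_i = p_i**q / sum_j p_j**q."""
    p = np.asarray(p, dtype=float)
    w = np.power(p, q)
    return w / w.sum()

# q > 1 concentrates weight on the more probable states; q < 1 spreads it toward the less probable ones.
print(escort([0.7, 0.2, 0.1], q=2.0))   # -> approximately [0.907, 0.074, 0.019]
```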
The increased uncertainty and complexity of nonlinear systems have motivated investigators to consider generalized approaches to defining an entropy function. New insights are achieved by defining the average uncertainty in the probability domain as a transformation of entropy functions. The Shannon entropy, when transformed to the probability domain, is the weighted geometric mean of the probabilities. For the exponential and Gaussian distributions, we show that the weighted geometric mean of the distribution is equal to the density of the distribution at the location plus the scale (i.e., at the width of the distribution). The average uncertainty is generalized via the weighted generalized mean, in which the moment is a function of the nonlinear source. Both the Rényi and Tsallis entropies transform to this definition of the generalized average uncertainty in the probability domain. For the generalized Pareto and Student's t distributions, which are the maximum entropy distributions for these generalized entropies, the appropriate weighted generalized mean also equals the density of the distribution at the location plus the scale. A coupled entropy function is proposed, closely related to the normalized Tsallis entropy, but incorporating a distinction between additive and multiplicative coupling.
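The claim that the weighted geometric mean of a distribution equals its density at the location plus the scale can be checked numerically. The sketch below (assuming SciPy, with finite integration limits chosen only so the truncation error is negligible) does so for the Gaussian and exponential cases by computing exp(E[ln f(X)]), i.e. the exponential of minus the differential entropy:

```python
import numpy as np
from scipy.stats import norm, expon
from scipy.integrate import quad

def geometric_mean_density(pdf, lo, hi):
    """exp(E[ln f(X)]): the density-weighted geometric mean of the density (= exp(-entropy))."""
    val, _ = quad(lambda x: pdf(x) * np.log(pdf(x)), lo, hi)
    return np.exp(val)

# Gaussian: the geometric mean should equal the density one scale (sigma) away from the location (mu).
mu, sigma = 0.0, 2.0
g = norm(loc=mu, scale=sigma)
print(geometric_mean_density(g.pdf, mu - 10 * sigma, mu + 10 * sigma), g.pdf(mu + sigma))

# Exponential (location 0, scale theta): the same equality holds at x = theta.
theta = 3.0
e = expon(scale=theta)
print(geometric_mean_density(e.pdf, 0.0, 50 * theta), e.pdf(theta))
```

Both printed pairs agree, consistent with the stated result that exp(-H) equals the density evaluated one scale away from the location for these two families.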