It is shown how to perform some steps of perturbation theory from a measure-theoretic point of view, i.e., renouncing control over the evolution of single trajectories and restricting attention to the evolution of the measure of some meaningful subsets of phase space. For a system of coupled rotators, estimates uniform in N at finite specific energy can be obtained in quite a direct way. This is achieved by making reference not to the sup norm but rather, following Koopman and von Neumann, to the much weaker L^2 norm.
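As a minimal illustration (notation not taken from the paper), assuming μ denotes the reference Gibbs measure on phase space, the two norms of an observable f compare as
\[
\|f\|_\infty = \sup_x |f(x)|, \qquad \|f\|_{L^2(\mu)} = \Big( \int |f|^2 \, d\mu \Big)^{1/2} \le \|f\|_\infty,
\]
so that a bound on the L^2 norm of the time variation of an observable constrains its behaviour only on sets of large μ-measure, rather than along every single trajectory.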
In this paper, we construct an adiabatic invariant for a large one-dimensional lattice of particles, namely the so-called Klein-Gordon lattice. The variation in time of such a quantity is bounded by a stretched exponential as the perturbation parameters tend to zero. At variance with the results available in the literature, our result holds uniformly in the thermodynamic limit. The proof consists of two steps: first, one uses techniques of Hamiltonian perturbation theory to construct a formal adiabatic invariant; second, one uses probabilistic methods to show that, with large probability, the adiabatic invariant is approximately constant. As a corollary, we obtain a lower bound on the relaxation time of the considered system, through estimates on the time autocorrelation of the adiabatic invariant.
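For orientation, a sketch of the objects involved is the following, with the caveat that the precise on-site potential, couplings, and exponents are those of the paper and may differ from the ones written here. A typical Klein-Gordon lattice Hamiltonian reads
\[
H(p,q) = \sum_j \Big( \frac{p_j^2}{2} + \frac{a}{2}\,(q_{j+1} - q_j)^2 + \frac{q_j^2}{2} + \frac{\lambda}{4}\, q_j^4 \Big),
\]
and a stretched-exponential bound on the variation of an adiabatic invariant J has the schematic form
\[
\big| J(t) - J(0) \big| \le C \exp\!\big( - c\, \varepsilon^{-\alpha} \big), \qquad 0 < \alpha < 1,
\]
holding with large probability with respect to the Gibbs measure, with ε the perturbation parameter and with constants uniform in the number of particles.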
Consider an FPU chain composed of N ≫ 1 particles, and endow the phase space with the Gibbs measure corresponding to a small temperature β^{-1}. Given a fixed K < N, we construct K packets of normal modes whose energies are adiabatic invariants (i.e., approximately constant for times of order β^{1-a}, a > 0) for initial data in a set of large measure. Furthermore, the time autocorrelation function of the energy of each packet does not decay significantly for times of order β. The restrictions on the shape of the packets are very mild. All estimates are uniform in the number N of particles and thus hold in the thermodynamic limit N → ∞ at fixed β > 0.
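Schematically (notation introduced here for illustration only), writing E_j for the harmonic energy of the j-th normal mode and I_1, …, I_K for the index sets defining the packets, the relevant quantities are
\[
\mathcal{E}_k = \sum_{j \in I_k} E_j, \qquad C_k(t) = \big\langle \mathcal{E}_k(t)\, \mathcal{E}_k(0) \big\rangle - \big\langle \mathcal{E}_k \big\rangle^2,
\]
where ⟨·⟩ denotes the average with respect to the Gibbs measure at inverse temperature β. In this language, the statements above say that each packet energy stays approximately constant for times of order β^{1-a} on a set of initial data of large measure, and that C_k(t) does not decay significantly for times of order β.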
For a dynamical system far from equilibrium, one has to deal with empirical probabilities defined through time averages, and the main problem is then how to formulate an appropriate statistical thermodynamics. The common answer is that the standard Boltzmann-Gibbs functional expression for the entropy should be used, with the empirical probabilities substituted for the Gibbs measure. Other functional expressions have been suggested, but apparently with no clear mechanical foundation. Here it is shown how a natural extension of the original procedure employed by Gibbs and Khinchin in defining the entropy, with the only proviso of using the empirical probabilities, leads to a functional expression for the entropy which is in general different from that of Boltzmann-Gibbs. In particular, the Gibbs entropy is recovered for empirical probabilities of Poisson type, while the Tsallis entropies are recovered for a deformation of the Poisson distribution.
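For reference, the two functional expressions mentioned here are standard (in units with k_B = 1): the Boltzmann-Gibbs entropy and the Tsallis entropy of index q,
\[
S_{BG} = - \sum_i p_i \ln p_i, \qquad S_q = \frac{1 - \sum_i p_i^{\,q}}{q - 1},
\]
with S_q → S_{BG} as q → 1; in the framework discussed above, the probabilities p_i are the empirical ones defined through time averages.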