One of the crucial properties of the Boltzmann-Gibbs entropy in the context of classical thermodynamics is extensivity, namely proportionality with the number of elements of the system. The Boltzmann-Gibbs entropy satisfies this prescription if the subsystems are statistically (quasi-)independent, or, typically, if the correlations within the system are essentially local. In such cases the energy of the system is typically extensive and the entropy is additive. In general, however, the situation is not of this type, and correlations may be far from negligible at all scales. In 1988, Tsallis introduced an entropic expression characterized by an index q which leads to a non-extensive statistics. Tsallis entropy, Sq, is the basis of so-called non-extensive statistical mechanics, which generalizes the Boltzmann-Gibbs theory. Tsallis statistics has found applications in a wide range of phenomena in diverse disciplines such as physics, chemistry, biology, medicine, economics, and geophysics. The focus of this special issue of Entropy was to solicit contributions that apply Tsallis entropy in various scientific fields.

This special issue consists of nine regular papers, covering various aspects and applications of Tsallis non-additive entropy, and an invited review paper written by Tsallis [1]. The review discusses the following aspects of Tsallis entropy: (i) additivity versus extensivity; (ii) probability distributions that constitute attractors in the sense of central limit theorems; (iii) the analysis of paradigmatic low-dimensional nonlinear dynamical systems near the edge of chaos; and (iv) the analysis of paradigmatic long-range-interacting many-body classical Hamiltonian systems.
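To make the q-index and the non-additivity concrete, the following sketch computes the Tsallis entropy Sq = (1 - Σ pᵢ^q)/(q - 1) of a discrete distribution and checks the pseudo-additivity relation Sq(A+B) = Sq(A) + Sq(B) + (1 - q)·Sq(A)·Sq(B) for two independent subsystems. The function name and the example distributions are illustrative, not from the papers in this issue.

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum p_i^q) / (q - 1).

    For q -> 1 it reduces to the Boltzmann-Gibbs-Shannon entropy
    -sum p_i ln p_i, which is handled as a special case here.
    """
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

# Two independent fair coins: subsystem entropies at q = 2.
q = 2.0
s_a = tsallis_entropy([0.5, 0.5], q)          # 0.5
s_b = tsallis_entropy([0.5, 0.5], q)          # 0.5
# Joint distribution of the product system (4 equiprobable outcomes).
s_ab = tsallis_entropy([0.25] * 4, q)          # 0.75

# Pseudo-additivity: S_q(A+B) = S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B);
# for q != 1 the entropy of independent subsystems is not additive.
assert abs(s_ab - (s_a + s_b + (1 - q) * s_a * s_b)) < 1e-12
```

For q = 2 the joint entropy 0.75 falls short of the additive value 1.0, illustrating why Sq is called non-additive; as q approaches 1, the correction term (1 - q)·Sq(A)·Sq(B) vanishes and additivity is recovered.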
Finally, recent as well as typical predictions, verifications, and applications of these concepts in natural, artificial, and social systems, as demonstrated by theoretical, experimental, observational, and computational results, are presented.

In their paper, Zhang and Wu [2] propose a global multi-level thresholding method for image segmentation by applying the Tsallis entropy, as a general information-theoretic entropy formalism, and using an artificial bee colony algorithm. They demonstrate that Tsallis-entropy thresholding is superior to traditional maximum-entropy thresholding based on Shannon entropy, and that the artificial bee colony algorithm is faster than either the genetic algorithm or particle swarm optimization. Vila, Bardera, Feixas and Sbert [3] investigate the application of three different Tsallis-based generalizations of mutual information to
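The Tsallis-entropy thresholding idea underlying Zhang and Wu's method can be illustrated with a minimal bi-level sketch: each candidate threshold splits the gray-level histogram into two classes, and the threshold maximizing the pseudo-additive combination of the two class entropies is selected. This toy version uses exhaustive search rather than the artificial bee colony optimizer of the actual paper, and the function name, q value, and histogram are illustrative assumptions.

```python
def tsallis_threshold(hist, q=0.8):
    """Pick the gray level that maximizes the pseudo-additive Tsallis
    criterion S_q(A) + S_q(B) + (1-q) S_q(A) S_q(B) over a bi-level split.

    hist: list of gray-level counts. Exhaustive search stands in for the
    bee-colony optimizer used in the multi-level method.
    """
    total = sum(hist)
    p = [h / total for h in hist]
    best_t, best_score = 0, float("-inf")
    for t in range(1, len(p)):
        w_a = sum(p[:t])          # class-A probability mass
        w_b = 1.0 - w_a           # class-B probability mass
        if w_a == 0.0 or w_b == 0.0:
            continue
        # Class-conditional distributions and their Tsallis entropies.
        s_a = (1.0 - sum((x / w_a) ** q for x in p[:t])) / (q - 1.0)
        s_b = (1.0 - sum((x / w_b) ** q for x in p[t:])) / (q - 1.0)
        score = s_a + s_b + (1.0 - q) * s_a * s_b
        if score > best_score:
            best_score, best_t = score, t
    return best_t

# Bimodal toy histogram: two populations separated by empty bins;
# the selected threshold should fall in the gap between the modes.
hist = [10, 20, 10, 0, 0, 0, 10, 20, 10]
t = tsallis_threshold(hist)
assert 3 <= t <= 6
```

Extending this to multi-level segmentation turns the search into a combinatorial optimization over several thresholds, which is why Zhang and Wu resort to an artificial bee colony rather than exhaustive enumeration.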