We propose a unifying picture in which the notion of generalized entropy is related to information theory by means of a group-theoretical approach. The group structure comes from the requirement that an entropy be well defined under the composition of independent systems, in the context of a recently proposed generalization of the Shannon-Khinchin axioms. We associate with each member of a large class of entropies a generalized information measure, satisfying the additivity property on a set of independent systems as a consequence of the underlying group law. At the same time, we show that Einstein's likelihood function naturally emerges as a byproduct of our informational interpretation of (generally nonadditive) entropies. These results confirm the adequacy of composable entropies in both physical and social science contexts.

The study of the relations among statistical mechanics, information theory, and the notion of entropy is at the heart of the science of complexity, and in the last decades it has been widely explored. After the seminal works of Shannon, the Tsallis entropy [7] has been the prototype of the nonadditive entropies studied in the last decades [8-13]. These functionals are generalizations of the Boltzmann-Gibbs (BG) entropy; they depend on one or more parameters, in such a way that the BG entropy is recovered in a particular limit. Generalized entropies have been successfully adopted for the study of both classical and quantum systems. Rényi's entropy, for example, plays a central role in information theory and in the study of multifractality [14]; along with the von Neumann entropy, it has also been used extensively in the evaluation of the entanglement entropy of quantum systems [15-18].

The study of new entropic forms has led to a new flow of ideas regarding the old problem of the probabilistic versus the dynamical foundations of the notion of entropy. It is well known [19] that Einstein's approach was very different from the probabilistic methodology of Boltzmann (which eventually emerged as the predominant one). Indeed, Einstein argued that the probabilities of occupation of the various regions of the phase space associated with a physical system cannot be postulated a priori. Instead, only a knowledge of the dynamics, obtained by solving the equations of motion, could provide this information. For this reason, in his theory of fluctuations, Einstein [20] introduced the likelihood function $W \propto \exp(S_{\mathrm{BG}})$ as a fundamental statistical quantity (for the sake of simplicity, here and in what follows we set $k_B \equiv 1$, $k_B$ being the Boltzmann constant). He observed that, upon composing two independent systems $A$ and $B$, the fundamental relation
\begin{equation}
W(A \cup B) = W(A)\, W(B) \tag{1}
\end{equation}
holds. Equation (1) is epistemologically crucial: it expresses the fact that the physical description of system $A$ does not depend on the physical description of system $B$, and vice versa. Moreover, it is related to the additivity requirement for the information content of independent systems.
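To make the factorization explicit, here is a short worked derivation of Eq. (1) from $W \propto \exp(S_{\mathrm{BG}})$ (a minimal sketch, assuming the standard independence condition that the joint probabilities factorize; it is not quoted from the paper):

% Additivity of the BG entropy on independent systems implies Eq. (1).
% Independence assumption: joint probabilities factorize as p_{ij} = p_i q_j.
\begin{align*}
  S_{\mathrm{BG}}(A \cup B)
    &= -\sum_{i,j} p_i q_j \ln(p_i q_j)
     = S_{\mathrm{BG}}(A) + S_{\mathrm{BG}}(B), \\
  W(A \cup B)
    &\propto e^{S_{\mathrm{BG}}(A \cup B)}
     = e^{S_{\mathrm{BG}}(A)}\, e^{S_{\mathrm{BG}}(B)}
     \propto W(A)\, W(B).
\end{align*}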
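For a nonadditive but composable entropy, the same multiplicative structure survives once the composition rule is read as a group law. As an illustrative sketch only (the Tsallis case with a standard linearizing map $L$; this particular choice is our assumption, not necessarily the construction adopted in the paper):

% Tsallis composition rule on independent systems, and a map L that
% linearizes it; exp(L(S_q)) then composes multiplicatively, like W.
\begin{align*}
  S_q(A \cup B) &= S_q(A) + S_q(B) + (1-q)\, S_q(A)\, S_q(B), \\
  L(x) &:= \frac{\ln\!\bigl(1 + (1-q)\, x\bigr)}{1-q}, \qquad
  L\bigl(S_q(A \cup B)\bigr) = L\bigl(S_q(A)\bigr) + L\bigl(S_q(B)\bigr).
\end{align*}

In the limit $q \to 1$, the composition rule reduces to ordinary addition and $L(x) \to x$, recovering the BG case and Eq. (1).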