Nonlinear techniques have attracted increasing interest for the dynamical analysis of many kinds of systems. Among these techniques, entropy-based measures have emerged as practical alternatives to classical methods owing to their wide applicability, especially to short and noisy processes. Rooted in information theory, entropy approaches are well suited to evaluating the degree of irregularity and complexity of physical, physiological, social, and econometric systems. Various measures based on Shannon entropy and conditional entropy (CE) have been proposed; among them, approximate entropy, sample entropy, fuzzy entropy, distribution entropy, permutation entropy, and dispersion entropy are probably the best known. After a presentation of the basic information-theoretic functionals, these measures are detailed, together with recent proposals inspired by nearest-neighbor and parametric approaches. Moreover, the role of the embedding dimension, the data length, and the parameter choices in using these measures is discussed.
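To make one of the listed measures concrete, the following is a minimal Python sketch of sample entropy, following the standard definition SampEn = -ln(A/B), where B counts pairs of length-m templates lying within a tolerance r of each other (Chebyshev distance, self-matches excluded) and A counts the corresponding length-(m+1) pairs. The function name `sample_entropy` and the default tolerance r = 0.2 times the series standard deviation are common illustrative conventions assumed here, not a reference implementation from the text.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """SampEn(m, r) = -ln(A / B) for a 1-D series x.

    m : embedding dimension (template length)
    r : tolerance; defaults to 0.2 * std(x), a common choice
    """
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * np.std(x)
    n = len(x)

    def n_matches(dim):
        # Use the same N - m starting indices for both template lengths,
        # so the counts A and B remain directly comparable.
        templates = np.array([x[i:i + dim] for i in range(n - m)])
        count = 0
        for i in range(len(templates) - 1):
            # Chebyshev distance from template i to all later templates;
            # pairing i with j > i excludes self-matches by construction.
            d = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += int(np.sum(d <= r))
        return count

    b = n_matches(m)      # pairs of length-m templates within r
    a = n_matches(m + 1)  # pairs of length-(m+1) templates within r
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

# Illustration: an irregular signal yields a high estimate,
# a regular one a low estimate.
rng = np.random.default_rng(0)
print(sample_entropy(rng.standard_normal(1000)))      # high: white noise
print(sample_entropy(np.sin(0.1 * np.arange(1000))))  # low: sinusoid
```

The two demo calls illustrate the irregularity ordering these measures are designed to capture, and the parameters m and r exemplify the dimension and tolerance choices whose influence is examined later in the text.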