Entropy is originally a concept from thermodynamics that distinguishes between useful and useless energy¹. This led to the Second Law of Thermodynamics, which states that entropy always increases. The Hamiltonian principle states that the differential equation for the Second Law is equivalent to the integral equation for Least Action. In the information age, entropy has been found to be related to complexity. There are two main approaches, one based on C. Shannon (1916-2001) and the other on A. N. Kolmogorov (1903-1987). The former is communication theory, while the latter is the basis of Algorithmic Information Theory (AIT), which studies the shortest algorithm for encoding a message, yielding the "best possible compression". This is of course also usage-based and refers to 'optimal' methods of data processing. The commonly used example is a particular sequence of binary digits, symbols or characters. Each of these sequences is called a message. Two sequences (or strings) are:

¹ "Useful energy" is somewhat anthropocentric, in the sense of "usefulness for humans".

Our scientific and technological worldviews are largely dominated by the concepts of entropy and complexity. Originating in 19th-century thermodynamics, the concept of entropy merged with information in the last century, leading to definitions of entropy and complexity by Kolmogorov, Shannon and others. In its simplest form, this worldview is an application of the normal rules of arithmetic. In this worldview, when tossing a coin, a million heads or tails in a row is theoretically possible, but impossible in practice and in real life. On this basis, the impossible (in the binary case, the outermost entries x^n and y^n of Pascal's triangle for large values of n) can be safely neglected, and one can concentrate fully on what is common and what conforms to the law of large numbers, in fields ranging from physics to sociology and everything in between.
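As a minimal numerical sketch of this point (not from the article; n = 100 is chosen for illustration rather than the text's million tosses), the outermost entries of Pascal's triangle can be compared with the central one:

```python
from math import comb

n = 100                    # number of coin tosses (illustrative)
total = 2 ** n             # number of equally likely outcome sequences

# Outermost entries of Pascal's triangle: all heads or all tails.
p_extreme = comb(n, 0) / total        # equals 2**-n
# Central entry: the outcome the law of large numbers concentrates on.
p_central = comb(n, n // 2) / total

print(p_extreme)   # ~7.9e-31, "theoretically possible, but impossible in practice"
print(p_central)   # ~0.0796, the most likely single count of heads
```

Already at n = 100 the extreme outcome is some thirty orders of magnitude rarer than the central one, which is why the classical worldview neglects it.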
However, in recent decades it has been shown that what is most improbable tends to be the rule in nature. Indeed, if one combines the outermost entries x^n and y^n with the normal rules of arithmetic, either addition or multiplication, one obtains Lamé curves and power laws respectively. In this article, some of these correspondences are highlighted, leading to a double conclusion. First, Gabriel Lamé's geometric footprint in mathematics and the sciences is enormous. Second, conic sections are at the core once more. Whereas mathematics so far has been exclusively the language of patterns in the sciences, the door is opened for mathematics to also become the language of the individual. The probabilistic worldview and Lamé's footprint can be seen as dual methods. In this context, it is to be expected that the notions of information, complexity, simplicity and redundancy benefit from this different viewpoint.
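The two combinations mentioned above can be sketched numerically (a hedged illustration, not code from the article; the parametrisation and helper names are assumptions): addition of x^n and y^n gives the Lamé curve |x/a|^n + |y/b|^n = 1, while multiplication, x^n * y^n = c, gives the power law y = c^(1/n) / x.

```python
import math

def lame_points(n, a=1.0, b=1.0, samples=5):
    """First-quadrant points on the Lame curve |x/a|^n + |y/b|^n = 1,
    via the standard parametrisation x = a*cos(t)**(2/n), y = b*sin(t)**(2/n)."""
    pts = []
    for i in range(samples):
        t = (math.pi / 2) * i / (samples - 1)
        x = a * math.cos(t) ** (2.0 / n)
        y = b * math.sin(t) ** (2.0 / n)
        pts.append((x, y))
    return pts

# Addition: every generated point satisfies x**n + y**n = 1.
for x, y in lame_points(n=4):
    assert abs(x ** 4 + y ** 4 - 1.0) < 1e-9

# Multiplication: x**n * y**n = c rearranges to a power law in x.
def power_law(x, c=1.0, n=2):
    return c ** (1.0 / n) / x
```

For n = 2 the Lamé curve is a circle (a conic section), in line with the article's second conclusion; larger n gives the flattened "superellipse" shapes associated with Lamé.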