“…First, Shannon's information theory justified $\log_2(1/p)$ as the code length for an item with probability $p$. This provides code lengths for highly repetitive data patterns that can be assigned probabilities, such as low-level perceptual properties, phonemes, words and so on [2]. Second, the critical generalization to algorithmic information theory by Kolmogorov, Solomonoff and Chaitin defined the complexity $K(x)$ of any object $x$ as the length of the shortest program for $x$ in any standard (universal) computer programming language [3].…”
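As a minimal sketch of the Shannon code-length formula (not from the quoted source), the following computes $\log_2(1/p)$ for a small, invented probability distribution over words; the items and probability values are hypothetical and chosen only for illustration.

```python
import math

# Hypothetical probabilities for a few words; values are invented for
# illustration and assumed to sum to 1.
probabilities = {"the": 0.5, "cat": 0.25, "sat": 0.125, "mat": 0.125}

for item, p in probabilities.items():
    # Shannon code length: log2(1/p) = -log2(p) bits for an item of probability p.
    code_length = math.log2(1 / p)
    print(f"{item!r}: p = {p}, code length = {code_length:.1f} bits")
```

Running this prints 1 bit for "the", 2 bits for "cat", and 3 bits for "sat" and "mat": higher-probability (more repetitive) items receive shorter codes, which is the intuition behind assigning probabilities to recurring perceptual and linguistic patterns.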