Abstract- A brief chronicle is given of the historical development of the central problems in the theory of fundamental limits of data compression and reliable communication.

Index Terms- Channel capacity, data compression, entropy, history of Information Theory, reliable communication, source coding.

Claude Shannon's "A mathematical theory of communication" [1], published in July and October of 1948, is the Magna Carta of the information age. Shannon's discovery of the fundamental laws of data compression and transmission marks the birth of Information Theory. A unifying theory with profound intersections with Probability, Statistics, Computer Science, and other fields, Information Theory continues to set the stage for the development of communications, data storage and processing, and other information technologies.

This overview paper gives a brief tour of some of the main achievements in Information Theory. It confines itself to those disciplines directly spawned from [1], now commonly referred to as Shannon theory.

Section I frames the revolutionary nature of "A mathematical theory of communication" in the context of the rudimentary understanding of the central problems of communication theory available at the time of its publication. Section II is devoted to lossless data compression: the amount of information present in a source and the algorithms developed to achieve the optimal compression efficiency predicted by the theory. Section III considers channel capacity: the rate at which reliable information can be transmitted through a noisy channel. Section IV gives an overview of lossy data compression: the fundamental tradeoff of information rate and reproduction fidelity. The paper concludes with a list of selected points of tangency of Information Theory with other fields.

Among the communication systems developed in the decades before 1948 were:

• Frequency Modulation (Armstrong, 1936);
• Pulse-Code Modulation (PCM) (Reeves, 1937-1939);
• Vocoder (Dudley, 1939);
• Spread Spectrum (1940's).

In those systems we find some of the ingredients that would be key to the inception of information theory:

a) the Morse code gave an efficient way to encode information taking into account the frequency of the symbols to be encoded (e.g., the frequent letter E is a single dot, while the rare Q requires four symbols);
b) systems such as FM, PCM, and spread spectrum illustrated that transmitted bandwidth is just another degree of freedom available to the engineer in the quest for more reliable communication;
c) PCM was the first digital communication system used to transmit analog continuous-time signals;
d) at the expense of reduced fidelity, the bandwidth used by the Vocoder [2] was less than the message bandwidth.

In 1924, H. Nyquist [3] argued that the transmission rate is proportional to the logarithm of the number of signal levels in a unit duration. Furthermore, he posed the question of how much improvement in telegraphy transmission rate could be achieved by replacing the Morse code by an "optimum" code. R. Hartley's 1928 paper [10] uses terms such as "rate of communication," "intersymbol interference," and "capacity of a system to transmit information."
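To make Nyquist's quantitative claim concrete, consider the following sketch, stated in a modern paraphrase rather than in the notation of [3] or [10]: if a system sends $n$ pulses per second and each pulse can take one of $s$ distinguishable levels, then over $t$ seconds there are $s^{nt}$ distinguishable signals. Measuring information by the logarithm of this count gives

$$
H = \log s^{nt} = nt \log s ,
$$

so the rate $H/t = n \log s$ grows linearly in the signaling speed but only logarithmically in the number of levels: to double the rate at a fixed signaling speed, one must square the number of levels (e.g., go from $s = 4$ to $s = 16$).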