This is the accepted version of the paper. This version of the publication may differ from the final published version.
Abstract

A review is presented of the relation between information and entropy, focusing on two main issues: the similarity of the formal definitions of physical entropy, according to statistical mechanics, and of information, according to information theory; and the possible subjectivity of entropy considered as missing information. The paper updates the 1983 analysis of Shaw and Davis. The difference between the interpretations of information given respectively by Shannon and by Wiener, significant for the information sciences, receives particular consideration.

Analysis of a range of material, from literary theory to thermodynamics, is used to draw out the issues. Emphasis is placed on recourse to the original sources, and on direct quotation, in an attempt to overcome some of the misunderstandings and oversimplifications which have occurred with these topics.

While information is strongly related to entropy, it is neither identical with it, nor its opposite. Information is related to order and pattern, but also to disorder and randomness. The relations between information and the 'interesting complexity' which embodies both pattern and randomness are worthy of attention.

"A few exciting words": information and entropy revisited

THOMASINA: When you stir your rice pudding, Septimus, the spoonful of jam spreads itself round making red trails like the picture of the meteor in my astronomical atlas. But if you stir backward, the jam will not come together again. Indeed, the pudding does not notice and continues to turn pink just as before. Do you think this odd?
SEPTIMUS: No.
THOMASINA: Well, I do. You cannot stir things apart.
SEPTIMUS: No more you can, time must needs run backward, and since it will not, we must stir our way onward mixing as we go, disorder out of disorder into disorder until pink is complete, unchanging and unchangeable, and we are done with it forever. This is known as free will or self-determination.
Tom Stoppard, Arcadia (Act 1, Scene 1), London: Faber and Faber, 1993

If information is pattern, then non-information should be the absence of pattern, that is, randomness. This commonsense expectation ran into unexpected complications when certain developments within information theory implied that information could be equated with randomness as well as with pattern. Identifying information with both pattern and randomness proved to be a powerful paradox. (Hayles 1999, p. 25)
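The 'similarity of the formal definitions' noted in the abstract can be sketched in standard notation (a summary only; the paper develops these definitions from the original sources). Boltzmann's statistical-mechanical entropy for a macrostate with $W$ equally probable microstates is

$$S = k_B \ln W,$$

while Shannon's measure of information for a source with symbol probabilities $p_i$ is

$$H = -\sum_i p_i \log_2 p_i.$$

The parallel is closest in the Gibbs form of the physical entropy, $S = -k_B \sum_i p_i \ln p_i$, which differs from Shannon's $H$ only in the constant $k_B$ and the base of the logarithm.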
Introduction

This paper re-examines the relations between information and entropy, a topic of debate for nearly a century. It can be seen as an updating, thirty years on, of the analysis of Shaw and Davis (1983), presenting new findings and new perspectives. In order to make proper sense of these, it is necessary to delve into the historical development of the issues. Where possible, we have used quotations from original sources, as this is an area where there has been much misunderstanding ...