The posts in this series are primarily about the second law of thermodynamics, the concept of entropy, and the use and abuse of these ideas. I would be remiss, however, not to mention information theory and the role that entropy plays. This post is not intended to be a comprehensive introduction to information theory. Readers especially interested in this topic will want to read other sources in addition to this post.
Information Theory
Information theory is concerned with modeling communication systems. Such a system could be, for example, radio-frequency communication, but the results of the theory are more general than any specific scheme for encoding information, and some of its consequences are broad in scope and significance.
Ash states:
It is possible to make a case for the statement that information theory is essentially the study of one theorem, ... which states that "it is possible to transmit information through a noisy channel at any rate less than channel capacity with an arbitrarily small probability of error."

Information and Uncertainty
Information can be understood as the reduction of uncertainty. If I say that I have a message, I have not provided much information about the content of the message.
On the other hand, if I say that the message contains twelve words in the English language, I have reduced the uncertainty about the content of the message and provided information. If I say the first word is "the," I have further reduced the uncertainty, again conveying information.
Bits
It is common to reduce information to a simple form: ones and zeros. A bit can be either one or zero, and by looking at this fundamental unit of information, it is possible to understand the mathematical laws that govern information.
As human beings, we may rebel against the notion that information can be conveyed as ones and zeros. Computers are based on binary systems, and we think of ourselves as complex beings whose messages cannot be reduced to ones and zeros.
Consider, however, that the character set that I am using to write this blog is, in fact, coded in ones and zeros. All of the great poetry of the world, such as the works of Shakespeare, Whitman, Homer, Sophocles, and Baudelaire can be encoded in ones and zeros. If the ones and zeros are not transmitted correctly, some of the information may be lost.
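As a concrete illustration, a few lines of Python (a minimal sketch, assuming a standard ASCII-style eight-bit encoding) show the bit patterns behind the characters of a short word:

```python
# Each character in a standard encoding is just a pattern of ones and zeros.
# format(..., "08b") renders the character's code point as eight binary digits.
for ch in "the":
    print(ch, format(ord(ch), "08b"))
# t 01110100
# h 01101000
# e 01100101
```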
-P log P
Suppose we want to convey one of sixteen possible messages that are equally probable. It turns out that this can be done with 4 bits of data. Each bit can have two possible values and 2 x 2 x 2 x 2 = 16.
The probability of any message is 1/16, and it requires -log(1/16) = 4 bits of information to convey, where the log is taken to base two for convenience.
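A few lines of Python (a minimal sketch to check the arithmetic, not anything essential to the argument) confirm this:

```python
import math

p = 1 / 16  # probability of any one of sixteen equally probable messages

# Bits required to convey one message: -log2(1/16) = 4
print(-math.log2(p))  # 4.0

# Four bits give 2 x 2 x 2 x 2 = 16 distinct patterns
print(2 ** 4)  # 16
```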
Here we have assumed that the messages are equally probable; if they are not, each message must be weighted by its probability.
The uncertainty of the information is the sum over all messages of the quantity -P log P. This uncertainty is also called the entropy, by analogy with thermodynamics.

S = -Σ P log P
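As a sketch of how this sum behaves (the function name and the example distributions below are my own illustration), the following Python computes the entropy of a distribution in bits, and shows that sixteen equally probable messages give the 4 bits found above while a skewed distribution gives less:

```python
import math

def entropy(probs):
    """Shannon entropy in bits: S = -sum(P * log2(P))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Sixteen equally probable messages: 4 bits, as computed above
print(entropy([1 / 16] * 16))  # 4.0

# A skewed distribution over the same sixteen messages: fewer bits
skewed = [0.5] + [0.5 / 15] * 15
print(entropy(skewed))  # roughly 2.95
```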
Thermodynamic Entropy
Recall that an expression for the thermodynamic entropy is:
S = -k Σ P ln P
If we redefine the thermodynamic entropy as S/k, recalling that k is merely a constant, we can make the equations the same and eliminate the units. The difference in the logarithm base is inconsequential, as we can multiply by a constant and redefine the entropy appropriately.
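To spell this out: since ln P = (ln 2) log₂ P, the dimensionless entropy can be written

S/k = -Σ P ln P = (ln 2)(-Σ P log₂ P)

so the entropy in base-two units differs from the entropy in natural-log units only by the constant factor ln 2 ≈ 0.693.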
One should be careful not to take the analogy too far. Remember that in thermodynamics the numbers involved are extremely large; in most cases, information theory does not involve such large numbers.
Information theory can become quite complicated, and this discussion is only intended to be a cursory mention of the topic, not a definitive comparison.
The next post discusses the Second Law and Creationism.
Sources
- Ash, Robert B., Information Theory, Dover Publications Inc., New York, 1965
- Touretzky, David S., Basics of Information Theory, http://www.cs.cmu.edu/~dst/Tutorials/Info-Theory
Contents
- Introduction
- What the Second Law Does Not Say
- What the Second Law Does Say
- Entropy is Not a Measure of Disorder
- Reversible Processes
- The Carnot Cycle
- The Definition of Entropy
- Perpetual Motion
- The Hydrogen Economy
- Heat Can Be Transferred From a Cold Body to a Hot Body: The Air Conditioner
- The Second Law and Swamp Coolers
- Entropy and Statistical Thermodynamics
- Fluctuations
- Partition Functions
- Entropy and Information Theory
- The Second Law and Creationism
- Entropy as Religious, Spiritual, or Self-Help Metaphor
- Free Energy
- Spontaneous Change and Equilibrium
- The Second Law, Radiative Transfer, and Global Warming
- The Second Law, Microscopic Reversibility, and Small Systems
- The Arrow of Time
- The Heat Death of the Universe
- Gravity and Entropy
- The Second Law and Nietzsche's Eternal Recurrence
- Conclusion