Friday, March 4, 2011

Entropy and Information Theory

This post is part of a series, Nonsense and the Second Law of Thermodynamics. The previous post is entitled Partition Functions.


The posts in this series are primarily about the second law of thermodynamics, the concept of entropy, and the use and abuse of these ideas.  I would be remiss, however, not to mention information theory and the role that entropy plays in it.  This post is not intended to be a comprehensive introduction to information theory.  Readers especially interested in the topic will want to read other sources in addition to this post.

Information Theory

Information theory is concerned with modeling communication systems.  Such a system might be, for example, a radio-frequency link, but the results of the theory are more general than any specific scheme for encoding information, and some of its consequences are broad in scope and significance.

Ash states:
It is possible to make a case for the statement that information theory is essentially the study of one theorem, ... which states that "it is possible to transmit information through a noisy channel at any rate less than channel capacity with an arbitrarily small probability of error."
Information and Uncertainty

Information can be understood as the reduction of uncertainty.  If I say that I have a message, I have not provided much information about the content of the message.

On the other hand, if I say that the message contains twelve words in the English language, I have reduced the uncertainty about the content of the message and thereby provided information.  If I say the first word is "the," I have further reduced the uncertainty, again conveying information.

Bits

It is common to reduce information to a simple form: ones and zeros.  A bit can be either one or zero, and by looking at this fundamental unit of information, it is possible to understand the mathematical laws that govern information.

As human beings, we may rebel against the notion that information can be conveyed as ones and zeros.  Computers are based on binary systems, and we think of ourselves as complex beings whose messages cannot be reduced to ones and zeros.

Consider, however, that the character set that I am using to write this blog is, in fact, coded in ones and zeros.  All of the great poetry of the world, such as the works of Shakespeare, Whitman, Homer, Sophocles, and Baudelaire can be encoded in ones and zeros.  If the ones and zeros are not transmitted correctly, some of the information may be lost.
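
To make the point concrete, here is a minimal Python sketch (the line of verse and the choice of UTF-8 encoding are merely illustrative):

    # Any text reduces to ones and zeros through its character encoding.
    text = "Shall I compare thee to a summer's day?"
    bits = "".join(f"{byte:08b}" for byte in text.encode("utf-8"))
    print(bits[:32])  # the first 32 bits of the encoded line

    # Decoding the same bits recovers the text exactly; if the bits are
    # not transmitted correctly, some of the information may be lost.
    decoded = bytes(
        int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)
    ).decode("utf-8")
    assert decoded == text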

-P log P

Suppose we want to convey one of sixteen possible messages that are equally probable.  It turns out that this can be done with 4 bits of data.  Each bit can have two possible values and 2 x 2 x 2 x 2 = 16.

The probability of any one message is 1/16, and conveying it requires -log(1/16) = 4 bits of information, where the logarithm is taken in base two for convenience.

Here we have assumed that the messages are equally probable; if they are not, each term has to be weighted by its probability.

The uncertainty of the information is the sum over all messages of the quantity - P log P. This uncertainty is also called the entropy by analogy with thermodynamics.

     S = - ΣP log P
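
This formula is short enough to check numerically.  The following Python sketch computes it for the sixteen-message example above and for an unequal distribution (the unequal probabilities are a made-up example):

    import math

    def shannon_entropy(probabilities):
        """Uncertainty in bits: the sum over all messages of -p * log2(p)."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # Sixteen equally probable messages: 4 bits, as computed above.
    print(shannon_entropy([1/16] * 16))  # 4.0

    # An unequal distribution carries less uncertainty.
    print(shannon_entropy([1/2] + [1/30] * 15))  # about 2.95 bits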

Thermodynamic Entropy

Recall that an expression for the thermodynamic entropy is:

     S = -k Σ P ln P

If we redefine the thermodynamic entropy as S/k, recalling that k is merely a constant, we can make the two equations the same and eliminate the units.  The difference in the base of the logarithm is inconsequential, since changing the base only multiplies the sum by a constant, and we can redefine the entropy appropriately.
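
A quick numerical check of the base-change argument (the probabilities below are an arbitrary example):

    import math

    probs = [0.5, 0.25, 0.25]
    s_bits = -sum(p * math.log2(p) for p in probs)   # information entropy, log base two
    s_over_k = -sum(p * math.log(p) for p in probs)  # dimensionless S/k, natural log

    # The two differ only by the constant factor ln(2).
    assert math.isclose(s_over_k, s_bits * math.log(2))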

One should be careful, however, not to take the analogy too far.  Remember that in thermodynamics the numbers involved are extremely large.  In most cases, information theory does not involve such large numbers.

Information theory can become quite complicated, and this discussion is only intended to be a cursory mention of the topic, not a definitive comparison.

The next post discusses the Second Law and Creationism.

Sources

Ash, Robert B. Information Theory. Dover Publications, 1990.

Contents
  1. Introduction
  2. What the Second Law Does Not Say
  3. What the Second Law Does Say
  4. Entropy is Not a Measure of Disorder
  5. Reversible Processes
  6. The Carnot Cycle
  7. The Definition of Entropy
  8. Perpetual Motion
  9. The Hydrogen Economy
  10. Heat Can Be Transferred From a Cold Body to a Hot Body: The Air Conditioner
  11. The Second Law and Swamp Coolers
  12. Entropy and Statistical Thermodynamics
  13. Fluctuations
  14. Partition Functions
  15. Entropy and Information Theory
  16. The Second Law and Creationism
  17. Entropy as Religious, Spiritual, or Self-Help Metaphor
  18. Free Energy
  19. Spontaneous Change and Equilibrium
  20. The Second Law, Radiative Transfer, and Global Warming
  21. The Second Law, Microscopic Reversibility, and Small Systems
  22. The Arrow of Time
  23. The Heat Death of the Universe
  24. Gravity and Entropy
  25. The Second Law and Nietzsche's Eternal Recurrence
  26. Conclusion
