
Saturday, November 6, 2010

Entropy and Statistical Thermodynamics

This post is part of a series, Nonsense and the Second Law of Thermodynamics. The previous post is entitled The Second Law and Swamp Coolers.

A previous post discusses the macroscopic thermodynamic definition of entropy, but there is another, statistical way of describing it.  Consider an isolated macroscopic system of interacting molecules.  Without knowing much about what is going on with the individual molecules, it is possible to measure macroscopic thermodynamic properties such as the pressure, the temperature, and so on.


Because the system is isolated, the total energy of the entire system of molecules is constant.  Energy is free to move from one molecule to another, and each molecule has multiple electronic, vibrational, rotational, and translational energy states that it could be in.  There are many distinguishable ways that the system could be arranged to achieve this total energy.

The number of ways of arranging a system as a function of energy is sometimes referred to as the density of states of the system and can be thought of as the number of ways that the system can be arranged per unit energy range. In general, as energy increases, the density of states increases rapidly.
To better explain these concepts, imagine a simplified system.

Let there be N molecules. In general, N is a very large number.  For example, 18 grams of water contains Avogadro's number, 6.022 × 10²³, of molecules. To keep the math simple, let N = 10²³.

Case A: Suppose the energy is so low that every molecule is forced to be in its ground state (and assume, for simplicity, that the ground state is not degenerate).  In such a case, there is only one way to arrange the collection of molecules.

Case B: Now let's increase the energy a bit and allow one molecule to be in an excited energy state.  For simplicity, let's assume that all the molecules have a state available and any molecule in the collection could be the one to have this small amount of energy.

In case A, there is only one way to arrange the system.  In case B, there are N ways to arrange the system as each of the N molecules could be the one to be in an excited state.

Case C: Now let's allow two molecules to be in an excited state: two molecules are excited and N−2 molecules are in the ground state.  There are N(N−1)/2, or about 5 × 10⁴⁵, distinguishable ways to arrange the system if N = 10²³.
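For a toy-sized system, the Case C count can be checked by brute-force enumeration.  Here is a quick sketch in Python; the choice N = 5 is of course just for illustration:

```python
from itertools import combinations

# Toy system: N = 5 distinguishable molecules, exactly 2 excited.
# Each arrangement is identified by the pair of excited molecules.
N = 5
arrangements = list(combinations(range(N), 2))

print(len(arrangements))  # N(N-1)/2 = 10
print(arrangements[:3])   # (0, 1), (0, 2), (0, 3)
```

Listing every pair of excited molecules reproduces the N(N−1)/2 count directly, since choosing which two molecules are excited is exactly a combination of N things taken two at a time.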

Case D: As the energy increases, the number of ways to arrange the system grows by leaps and bounds. Let there be three molecules in an excited state.  There are N!/(3!(N−3)!), or (1/6)N(N−1)(N−2), ways to arrange the system.

Exercises for the reader: 

1) Calculate the value of N!/(3!(N−3)!) if N = 10²³.   2) What if there were four molecules in an excited state?
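The exercises are easy to check numerically, since Python integers have arbitrary precision.  Here is a sketch using math.comb, which computes the binomial coefficient N!/(n!(N−n)!) directly; the loop bound of five excited molecules is just for illustration:

```python
import math

N = 10**23  # number of molecules, as in the text

# W(n): distinguishable arrangements with exactly n molecules excited,
# under the same simplifying assumptions as Cases A through D.
for n in range(5):
    W = math.comb(N, n)
    print(f"n = {n}: W = {float(W):.3e}")
```

For n = 0 this gives 1 (Case A), for n = 1 it gives N (Case B), and for n = 2 it reproduces the roughly 5 × 10⁴⁵ arrangements of Case C.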

To cut to the chase, the entropy can be defined as follows:

     S = -k Σ Pn ln(Pn)

where k is the Boltzmann constant, 1.38 × 10⁻²³ J/K (it is equal to the universal gas constant R, 8.314 J/(K·mol), divided by Avogadro's number).  The sigma (Σ) is a symbol meaning take the sum of the terms to its right.

The argument of the summation sign is the probability that the system is in a given allowable configuration, multiplied by the natural log of that probability.  The sum is taken over all possible configurations of the system.  Pn is less than or equal to 1, so ln(Pn) is a negative number or zero, ensuring that the entropy is zero or greater.

There are some intricacies that involve taking into account whether particles are fundamentally indistinguishable and obey Bose-Einstein or Fermi-Dirac statistics, but I am neglecting the issue here.

If all W of the accessible configurations have the same probability, Pn = 1/W, we do not have to worry about weighting the logs of the individual probabilities, and one gets the equation on Boltzmann's tombstone.

     S = k log W

in which log refers to the natural log.
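As a numerical sanity check, the two formulas agree when all configurations are equally probable.  A minimal Python sketch (W = 1000 is an arbitrary illustrative choice):

```python
import math

k = 1.38e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k * sum(Pn * ln Pn), skipping zero-probability states."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# W equally probable configurations: Pn = 1/W for every n.
W = 1000
S_sum = gibbs_entropy([1.0 / W] * W)
S_tombstone = k * math.log(W)  # S = k log W

print(S_sum, S_tombstone)  # the two values agree
```

With Pn = 1/W, each of the W terms in the sum is (1/W) ln(1/W), so the sum collapses to −ln(1/W) = ln W, recovering S = k ln W.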

To explain this concept further, the next post looks at fluctuations from the most probable value.

  • Atkins, P. W. Physical Chemistry, W. H. Freeman and Company, New York, 3rd edition, 1986
  • McQuarrie, Donald A., Statistical Thermodynamics, University Science Books, Mill Valley, CA, 1973
  • Bromberg, J. Philip, Physical Chemistry, Allyn and Bacon, Inc., Boston, 2nd Edition, 1984
  • Anderson, H.C., Stanford University, Lectures on Statistical Thermodynamics, ca. 1990.
  • Boltzmann's Tombstone
  1. Introduction
  2. What the Second Law Does Not Say
  3. What the Second Law Does Say
  4. Entropy is Not a Measure of Disorder
  5. Reversible Processes
  6. The Carnot Cycle
  7. The Definition of Entropy
  8. Perpetual Motion
  9. The Hydrogen Economy
  10. Heat Can Be Transferred From a Cold Body to a Hot Body: The Air Conditioner
  11. The Second Law and Swamp Coolers
  12. Entropy and Statistical Thermodynamics
  13. Fluctuations
  14. Partition Functions
  15. Entropy and Information Theory
  16. The Second Law and Creationism
  17. Entropy as Religious, Spiritual, or Self-Help Metaphor
  18. Free Energy
  19. Spontaneous Change and Equilibrium
  20. The Second Law, Radiative Transfer, and Global Warming
  21. The Second Law, Microscopic Reversibility, and Small Systems
  22. The Arrow of Time
  23. The Heat Death of the Universe
  24. Gravity and Entropy
  25. The Second Law and Nietzsche's Eternal Recurrence
  26. Conclusion
