A previous post discusses the macroscopic thermodynamic definition of entropy, but there is another, statistical way of describing entropy. Consider an isolated macroscopic system of interacting molecules. Without knowing much about what is going on with the individual molecules, it is possible to measure macroscopic thermodynamic properties such as the pressure, the temperature, and so on.
Because the system is isolated, the total energy of the entire collection of molecules is constant. Energy is free to move from one molecule to another, and each molecule has multiple electronic, vibrational, rotational, and translational energy states that it could be in. There are many distinguishable ways that the system could be arranged to achieve this total energy.
The number of ways of arranging a system as a function of energy is sometimes referred to as the density of states of the system, and it can be thought of as the number of ways that the system can be arranged per unit energy range. In general, as the energy is increased, the density of states increases rapidly.
To better explain these concepts, imagine a simplified system.
Let there be N molecules. In general, N is a very large number. For example, 18 grams of water contains Avogadro's number, 6.022 x 10^23, of molecules. To keep the math simple, let N = 10^23.
Case A: Consider the case in which the energy is so low that all molecules are forced to be in their ground state (and assume for simplicity that there is no degeneracy of the ground state). In such a case, there is only one way to arrange the collection of molecules.
Case B: Now let's increase the energy a bit and allow one molecule to be in an excited energy state. For simplicity, let's assume that every molecule has such a state available and that any molecule in the collection could be the one to hold this small amount of energy.
In case A, there is only one way to arrange the system. In case B, there are N ways to arrange the system as each of the N molecules could be the one to be in an excited state.
Case C: Now let's allow two molecules to be in an excited state: two molecules are excited and N-2 molecules are in the ground state. There are N!/(2!(N-2)!) = (N/2)(N-1), or about 5 x 10^45, distinguishable ways to arrange the system if N = 10^23.
Case D: As the energy increases, the number of ways to arrange the system grows by leaps and bounds. Let there be three molecules in an excited state. There are N!/(3!(N-3)!) = (1/6)(N)(N-1)(N-2) ways to arrange the system.
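These counts are just binomial coefficients, and a short calculation (a sketch of my own, not part of the original discussion) can reproduce them using Python's arbitrary-precision integers:

import math

N = 10**23  # number of molecules, as in the examples above

# The number of distinguishable arrangements with exactly k molecules excited
# is the binomial coefficient C(N, k) = N! / (k! (N - k)!).
for k in range(3):  # cases A (k = 0), B (k = 1), and C (k = 2)
    ways = math.comb(N, k)
    print(f"{k} excited molecule(s): about {float(ways):.3e} arrangements")

For k = 2 this prints roughly 5.000e+45, in agreement with case C above; extending the loop to k = 3 and k = 4 handles the exercises below.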
Exercises for the reader:
1) Calculate the value of N!/(3!(N-3)!) if N = 10^23. 2) What if there were four molecules in an excited state?
To cut to the chase, the entropy can be defined as follows:
S = -k Σ Pn ln(Pn)
where k is the Boltzmann constant, 1.38 x 10^-23 J/K. (It is equal to the universal gas constant R, 8.314 J/(K mol), divided by Avogadro's number.) The sigma symbol indicates a sum over the terms to its right.
The argument of the summation sign is the probability Pn that the system is in a given allowable configuration, multiplied by the natural log of that probability. The sum is taken over all possible configurations of the system. Each Pn is less than or equal to 1, so ln(Pn) is negative or zero, which ensures that the entropy is zero or greater.
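As a rough illustration (again a sketch of my own, using made-up probabilities for a toy system), the sum can be evaluated directly for any small set of configuration probabilities:

import math

k_B = 1.38e-23  # Boltzmann constant in J/K, equal to R / N_A = 8.314 / 6.022e23

def gibbs_entropy(probabilities):
    # S = -k * sum of P_n * ln(P_n), taken over configurations with P_n > 0
    return -k_B * sum(p * math.log(p) for p in probabilities if p > 0)

# Made-up probabilities for a toy system with four allowable configurations;
# each is between 0 and 1, and they sum to 1.
P = [0.4, 0.3, 0.2, 0.1]
print(gibbs_entropy(P))  # a small positive number, in J/K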
There are some intricacies involved in taking into account whether particles are fundamentally indistinguishable and obey Bose-Einstein or Fermi-Dirac statistics, but I am neglecting those issues here.
If all the states have the same probability, we do not have to worry about weighting the logs of the individual probabilities: with W equally probable configurations, each Pn equals 1/W, the sum reduces to -k(W)(1/W)ln(1/W) = k ln(W), and one gets the equation on Boltzmann's tombstone.
S = k log W
in which log refers to the natural log.
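A quick numerical check (again just a sketch, with an arbitrarily chosen W) confirms that the general sum reduces to the tombstone formula when all W configurations are equally probable:

import math

k_B = 1.38e-23             # Boltzmann constant in J/K
W = 1000                   # arbitrary number of equally probable configurations
P = [1.0 / W] * W          # each configuration has probability 1/W
S_sum = -k_B * sum(p * math.log(p) for p in P)
S_tombstone = k_B * math.log(W)
print(S_sum, S_tombstone)  # the two values agree to floating-point precision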
To explain this concept further, the next post looks at fluctuations from the most probable value.
Sources
- Atkins, P. W., Physical Chemistry, W. H. Freeman and Company, New York, 3rd edition, 1986
- McQuarrie, Donald A., Statistical Thermodynamics, University Science Books, Mill Valley, CA, 1973
- Bromberg, J. Philip, Physical Chemistry, Allyn and Bacon, Inc., Boston, 2nd edition, 1984
- Anderson, H.C., Stanford University, Lectures on Statistical Thermodynamics, ca. 1990.
- Boltzmann's Tombstone
- Introduction
- What the Second Law Does Not Say
- What the Second Law Does Say
- Entropy is Not a Measure of Disorder
- Reversible Processes
- The Carnot Cycle
- The Definition of Entropy
- Perpetual Motion
- The Hydrogen Economy
- Heat Can Be Transferred From a Cold Body to a Hot Body: The Air Conditioner
- The Second Law and Swamp Coolers
- Entropy and Statistical Thermodynamics
- Fluctuations
- Partition Functions
- Entropy and Information Theory
- The Second Law and Creationism
- Entropy as Religious, Spiritual, or Self-Help Metaphor
- Free Energy
- Spontaneous Change and Equilibrium
- The Second Law, Radiative Transfer, and Global Warming
- The Second Law, Microscopic Reversibility, and Small Systems
- The Arrow of Time
- The Heat Death of the Universe
- Gravity and Entropy
- The Second Law and Nietzsche's Eternal Recurrence
- Conclusion