In previous posts it was shown that entropy is related to the number of ways that a system can arrange itself subject to constraints such as constant energy.
The previous post on fluctuations showed that, for very large numbers, fluctuations from the most probable distribution are insignificant: most of the distribution is contained within the square root of N of the most probable result.
Most Probable Distribution
If N = 10, the square root of 10 is about 3.2, or about 30% of 10. Most of the distribution falls between a 20% and an 80% chance of getting heads. For N = 100, the square root of 100 is 10, or 10% of 100. Most of the distribution falls between 40% and 60%.
As N increases, the distribution gets tighter and tighter. For N = 10^23, roughly the number of molecules in a macroscopic sample, the distribution is for all intents and purposes a delta function: the square root of N is an insignificant fraction of N.
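This narrowing is easy to check numerically for coin flips. The sketch below (the function name is mine) computes how much of the binomial distribution lies within ±√N of N/2, and how wide that window is as a fraction of N:

```python
from math import comb, sqrt

def mass_near_center(N):
    """Fraction of the fair-coin distribution within +/- sqrt(N) of N/2."""
    lo, hi = N / 2 - sqrt(N), N / 2 + sqrt(N)
    inside = sum(comb(N, k) for k in range(N + 1) if lo <= k <= hi)
    return inside / 2**N

for N in (10, 100, 1000):
    # The window always captures the bulk of the distribution,
    # but its width relative to N shrinks like 1/sqrt(N).
    print(N, round(mass_near_center(N), 4), f"width/N = {2*sqrt(N)/N:.3f}")
```

The captured probability stays well above 90% in each case, while the window shrinks from about 60% of N at N = 10 to about 6% at N = 1000, which is the sense in which the distribution tightens.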
This reasoning leads to a key result: in order to understand the distribution of micro-states for a system in which N is very large, it is only necessary to be able to compute the most probable distribution!
Caveat on Spin Statistics
In this treatment, I am going to gloss over some issues of spin statistics, and treat all particles as so-called Boltzons, a theoretical particle for which we do not have to worry about spin statistics. It is important to note that the reasoning is still valid for Bosons or Fermions; it is just that a few minor adjustments have to be made. If this issue becomes relevant to the discussion, I can always revisit it.
The Molecular Partition Function
The molecular partition function can be thought of as a way of encoding the most probable distribution into an equation. The partition function, Q, can then be defined as:
Q = Σj gj exp(-Ej/kT)
The symbol, Σ, indicates a sum that is to be taken over all states j. The factor gj is the degeneracy of the jth state. That is to say that if there are several distinct states at the same energy, each needs to be counted. Exp is the natural exponential function. Ej is the energy of the jth state above the ground state (the energy of the ground state is never zero because of zero-point energy, but we can define Ej = ej - e0, in which e0 is the energy of the ground state). The Boltzmann constant is k, and T is the thermodynamic, or absolute, temperature in Kelvin.
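As a concrete illustration, the sum can be evaluated directly. The energies and degeneracies below are made up purely for illustration; this is a sketch, not a model of any real molecule:

```python
from math import exp

K_B = 1.380649e-23  # Boltzmann constant, J/K

def partition_function(levels, T):
    """Q = sum over states j of g_j * exp(-E_j / kT).
    levels: list of (g_j, E_j) pairs, E_j in joules above the ground state."""
    return sum(g * exp(-E / (K_B * T)) for g, E in levels)

# Hypothetical two-level system: a nondegenerate ground state plus a
# doubly degenerate level 1e-21 J higher (illustrative numbers only).
levels = [(1, 0.0), (2, 1.0e-21)]
print(partition_function(levels, 300.0))
```

At low temperature only the ground-state term survives and Q approaches 1; at very high temperature every state contributes fully and Q approaches the total number of states (here 3). Q is, loosely, a count of thermally accessible states.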
Entropy in Terms of Molecular Partition Function
The entropy can be written in terms of the molecular partition function. It is possible to mathematically derive this result, but I am just going to present it:
S = kN ln Q + U/T
S is the entropy. Again, k is Boltzmann's constant, N is the number of molecules, and Q is the molecular partition function. U is a macroscopic thermodynamic quantity (sometimes called E) known as the internal energy. From the first law of thermodynamics:
ΔU = heat + work
U itself can be rewritten in terms of the partition function, but that is a bit beyond my current scope.
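Even without the general expression for U, the entropy formula can be exercised on the same sort of made-up two-level system used above, computing U directly as N times the Boltzmann-weighted average molecular energy. This is a sketch for distinguishable "Boltzon" particles, with illustrative numbers throughout:

```python
from math import exp, log

K_B = 1.380649e-23   # Boltzmann constant, J/K
N_MOLECULES = 6.022e23
T = 300.0            # K

# Hypothetical two-level system (illustrative energies only).
two_levels = [(1, 0.0), (2, 1.0e-21)]  # (degeneracy g_j, energy E_j in J)

def q_and_u(levels, N, T):
    """Partition function Q, and internal energy U taken as N times the
    Boltzmann-weighted mean energy per molecule."""
    kT = K_B * T
    Q = sum(g * exp(-E / kT) for g, E in levels)
    U = N * sum(g * E * exp(-E / kT) for g, E in levels) / Q
    return Q, U

Q, U = q_and_u(two_levels, N_MOLECULES, T)
S = K_B * N_MOLECULES * log(Q) + U / T  # S = kN ln Q + U/T
print(S)  # entropy in J/K
```

For these numbers S comes out to a few joules per kelvin per mole-sized sample, with the kN ln Q term dominating the U/T term; the point is only to show that both terms are concrete, computable quantities once Q is known.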
Breaking Down the Partition Function
The partition function can be written as a product of partition functions where we separate different types of molecular energy.
Q = Qe * Qv * Qr * Qt
Here Qe, Qv, Qr, and Qt are the electronic, vibrational, rotational, and translational partition functions, respectively. For gas-phase molecules Qe can usually be approximated as equal to 1. Exceptions arise when the electronic state is degenerate, or when there is a small separation between electronic energy levels.
Analytic expressions are possible for each of the other partition functions. These expressions involve some necessary approximations, but it is possible to correct the approximations if necessary. I am not going to provide those expressions here, but the expression for the vibrational partition function will be relevant in a subsequent post addressing some nonsense from those who doubt the reality of climate change.
It is worth noting that the partition functions can be further broken down for each mode, for example the vibrational partition function can be divided into a partition function for each vibrational mode:
Qv = Qv1 * Qv2 * Qv3 * ... (one factor for each vibrational mode)
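The reason such products work is that independent energy contributions add in the exponent, so the exponentials factor: exp(-(Ea+Eb)/kT) = exp(-Ea/kT) * exp(-Eb/kT). A small numerical check, with made-up mode energies, illustrates this:

```python
from math import exp

kT = 4.0e-21  # J, roughly kT near room temperature

# Two hypothetical independent modes with their own level energies (J).
mode_a = [0.0, 1.0e-21, 2.0e-21]
mode_b = [0.0, 3.0e-21]

def q(levels):
    """Single-mode partition function (nondegenerate levels)."""
    return sum(exp(-E / kT) for E in levels)

# Summing over all combined states, with energies E_a + E_b ...
q_joint = sum(exp(-(Ea + Eb) / kT) for Ea in mode_a for Eb in mode_b)

# ... gives the same number as the product of the per-mode sums.
print(q_joint, q(mode_a) * q(mode_b))
```

The two printed numbers agree, which is exactly why Q can be split into electronic, vibrational, rotational, and translational factors whenever those energies are (at least approximately) independent.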
The next post briefly describes the concept of entropy in information theory.
Sources
- Atkins, P. W. Physical Chemistry, W. H. Freeman and Company, New York, 3rd edition, 1986
- McQuarrie, Donald A., Statistical Thermodynamics, University Science Books, Mill Valley, CA, 1973
- Bromberg, J. Philip, Physical Chemistry, Allyn and Bacon, Inc., Boston, 2nd edition, 1984
- Wikipedia: Partition Function
Contents
- Introduction
- What the Second Law Does Not Say
- What the Second Law Does Say
- Entropy is Not a Measure of Disorder
- Reversible Processes
- The Carnot Cycle
- The Definition of Entropy
- Perpetual Motion
- The Hydrogen Economy
- Heat Can Be Transferred From a Cold Body to a Hot Body: The Air Conditioner
- The Second Law and Swamp Coolers
- Entropy and Statistical Thermodynamics
- Fluctuations
- Partition Functions
- Entropy and Information Theory
- The Second Law and Creationism
- Entropy as Religious, Spiritual, or Self-Help Metaphor
- Free Energy
- Spontaneous Change and Equilibrium
- The Second Law, Radiative Transfer, and Global Warming
- The Second Law, Microscopic Reversibility, and Small Systems
- The Arrow of Time
- The Heat Death of the Universe
- Gravity and Entropy
- The Second Law and Nietzsche's Eternal Recurrence
- Conclusion