Given a state with many particles, how should we guess the way the particles will be distributed? This is another question of how to be optimally ignorant. From a statistical perspective, the second law of thermodynamics says the most common macrostate is the one realized by the highest number of equal-energy microstates, i.e. the state with the highest entropy. But is there any way to “derive” the second law of thermodynamics? Let’s see…
Let the probability of being in state $i$ at time $t$ be $P_i(t)$, such that $\sum_i P_i(t) = 1$ for all $t$. A transition from state $i$ to state $j$ in a short time $dt$ then occurs with probability $\nu_{ij}\,dt$. Interestingly, the reverse transition must have the same rate, $\nu_{ji} = \nu_{ij}$, because in quantum mechanics the transition probability can be computed from $|\langle j | \hat{H} | i \rangle|^2$, and the Hamiltonian of the dynamics is Hermitian, so $|\langle j | \hat{H} | i \rangle| = |\langle i | \hat{H} | j \rangle|$. The change in the probability $P_i$ over time is then the addition of transitions to $i$ and the subtraction of transitions from $i$, i.e. the relationship
$$P_i(t + dt) = P_i(t) + dt \sum_j \nu_{ji} P_j(t) - dt \sum_j \nu_{ij} P_i(t)$$
or, subtracting $P_i(t)$ through and taking the limit $dt \to 0$,
$$\frac{dP_i}{dt} = \sum_j \nu_{ij} \left( P_j - P_i \right)$$
using the symmetry $\nu_{ij} = \nu_{ji}$ developed above. This equation is referred to as the master equation for the probability of being in state $i$. Now we define a quantity, $H = \sum_i P_i \ln P_i$, the mean of the natural logarithm of the probability, or the information. Computing the time derivative of the information, we have
$$\frac{dH}{dt} = \sum_i \frac{dP_i}{dt} \left( \ln P_i + 1 \right)$$
now substituting in the master equation, we calculate the change in the information over time due to transitions into $i$ from $j$ to be
$$\frac{dH}{dt} = \sum_{i,j} \nu_{ij} \left( P_j - P_i \right) \left( \ln P_i + 1 \right)$$
Since $i$ and $j$ are dummy summation indices, we may write the same quantity with the labels exchanged, describing the transitions instead from $i$ into $j$, which is
$$\frac{dH}{dt} = \sum_{i,j} \nu_{ji} \left( P_i - P_j \right) \left( \ln P_j + 1 \right)$$
because the value of $dH/dt$ does not depend on how we label the transitions. Thus, averaging the two expressions and using $\nu_{ji} = \nu_{ij}$, we can factor
$$\frac{dH}{dt} = \frac{1}{2} \sum_{i,j} \nu_{ij} \left\{ \left( P_j - P_i \right) \left( \ln P_i + 1 \right) + \left( P_i - P_j \right) \left( \ln P_j + 1 \right) \right\}$$
where the constant terms cancel, since $\left( P_j - P_i \right) + \left( P_i - P_j \right) = 0$, and we are left with
$$\frac{dH}{dt} = \frac{1}{2} \sum_{i,j} \nu_{ij} \left\{ \left( P_j - P_i \right) \ln P_i + \left( P_i - P_j \right) \ln P_j \right\}$$
so examining the term in braces we have
$$\left( P_j - P_i \right) \ln P_i - \left( P_j - P_i \right) \ln P_j = \left( P_j - P_i \right) \left( \ln P_i - \ln P_j \right)$$
or finally the whole expression for the change in the information is
$$\frac{dH}{dt} = \frac{1}{2} \sum_{i,j} \nu_{ij} \left( P_j - P_i \right) \left( \ln P_i - \ln P_j \right)$$
Because the logarithm is monotonic, if $P_j > P_i$, then $\ln P_j > \ln P_i$, so the factors $\left( P_j - P_i \right)$ and $\left( \ln P_i - \ln P_j \right)$ always have opposite signs. Every term in the sum is therefore non-positive, and $dH/dt \le 0$: the information always decreases, which means the entropy $S = -k_B H$ always increases, and the maximum of the entropy occurs only when $P_i = P_j$ for every pair of connected states. This result is referred to as the H theorem, where H is Boltzmann's symbol for the quantity above, the negative of the entropy (up to the factor $k_B$).
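As a numerical sanity check of the argument, the sketch below integrates the master equation for a small system. The rates are built as $|\langle j|\hat H|i\rangle|^2$ from a random Hermitian matrix, so they are automatically symmetric; the dimension, step size, and run length are illustrative assumptions, not anything fixed by the derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Symmetric transition rates from a random Hermitian "Hamiltonian":
# nu_ij ∝ |<i|H|j>|^2, so nu[i, j] == nu[j, i] by construction.
n = 5
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
Ham = (A + A.conj().T) / 2           # Hermitian matrix
nu = np.abs(Ham) ** 2                # symmetric rate matrix (arbitrary units)
np.fill_diagonal(nu, 0.0)            # no self-transitions

def H_info(P):
    """Boltzmann's H = sum_i P_i ln P_i (the mean log-probability)."""
    P = P[P > 0]
    return np.sum(P * np.log(P))

# Integrate dP_i/dt = sum_j nu_ij (P_j - P_i) with a small Euler step,
# starting from a random (normalized) distribution.
P = rng.random(n)
P /= P.sum()
dt = 1e-3
Hs = [H_info(P)]
for _ in range(20000):
    P = P + dt * (nu @ P - nu.sum(axis=1) * P)
    Hs.append(H_info(P))
```

Running this, `Hs` is non-increasing at every step, total probability is conserved, and `P` relaxes to the uniform distribution $P_i = 1/n$, where $H$ reaches its minimum $\ln(1/n)$ — exactly the behavior the H theorem predicts.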
So have we actually derived the idea that the maximum-entropy state is the one with equal probability across its substates? Well, no. The symmetric, uncorrelated-transition picture is an approximation: it assumes the probabilities remain uncorrelated at all times, but as particles interact they become hopelessly correlated. That is, a microstate has an underlying structure that is beyond the white-noise approximation. The assumption that this hopelessly correlated state nevertheless looks like a randomized, uncorrelated one is the idea of molecular chaos; because that assumption is inexact, the H theorem is only an approximate result, albeit one that appears to hold in practice essentially all of the time.
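The distinction can be seen in a toy example. A deterministic, reversible evolution — here a simple permutation of microstates, standing in for exact Hamiltonian dynamics — conserves $H$ exactly; only the stochastic, uncorrelated-transition model drives $H$ down. The uniform hopping rate $\nu$ below is a hypothetical choice for illustration.

```python
import numpy as np

def H_info(P):
    """Boltzmann's H = sum_i P_i ln P_i."""
    P = P[P > 0]
    return np.sum(P * np.log(P))

P = np.array([0.5, 0.3, 0.15, 0.05])

# Reversible, deterministic dynamics: a permutation of microstates.
# It merely relabels the states, so H is conserved exactly.
perm = np.array([2, 0, 3, 1])
P_perm = P[perm]
print(H_info(P), H_info(P_perm))    # equal up to floating point

# Stochastic, uncorrelated dynamics: symmetric hopping between all pairs.
# One small step of dP_i/dt = nu * sum_j (P_j - P_i) strictly lowers H
# whenever P is not already uniform.
nu, dt = 1.0, 0.01
P_mix = P + dt * nu * (P.sum() - len(P) * P)
print(H_info(P_mix) < H_info(P))    # True
```

This is the content of the caveat above: the decrease of $H$ is a property of the stochastic model, not of the underlying reversible dynamics, and it holds only to the extent that the molecular-chaos assumption does.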