Entropy, Shannon's Measure of Information, and Boltzmann's H-Theorem
We start with a clear distinction between Shannon's Measure of Information and the thermodynamic entropy. The first is defined on any probability distribution and is therefore a very general concept; entropy, on the other hand, is defined only on a very special set of distributions. Next, we show that the Shannon Measure of Information (SMI) provides a solid and quantitative basis for the interpretation of the thermodynamic entropy. For an ideal gas, the entropy measures the uncertainty in the locations and momenta of all the particles, together with two corrections due to the uncertainty principle and the indistinguishability of the particles. Finally, we show that the H-function as defined by Boltzmann is an SMI but not an entropy.
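To make the distinction concrete, the SMI is defined for any discrete probability distribution as $H(p) = -\sum_i p_i \log_2 p_i$. The following is a minimal sketch (the function name `smi` is an illustrative choice, not from the text) showing that the SMI is maximal for a uniform distribution and zero for a certain outcome:

```python
import math

def smi(p):
    """Shannon Measure of Information (in bits) of a discrete distribution p."""
    assert abs(sum(p) - 1.0) < 1e-9, "probabilities must sum to 1"
    # Terms with p_i = 0 contribute nothing (lim p log p = 0 as p -> 0).
    return sum(-pi * math.log2(pi) for pi in p if pi > 0)

# Uniform distribution over 4 outcomes: maximal uncertainty, SMI = 2 bits.
print(smi([0.25, 0.25, 0.25, 0.25]))  # 2.0
# A certain outcome carries no uncertainty: SMI = 0 bits.
print(smi([1.0, 0.0, 0.0, 0.0]))  # 0.0
```

The thermodynamic entropy, by contrast, attaches to the specific equilibrium distribution of locations and momenta of the particles, not to an arbitrary distribution.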