Entropy is a fundamental concept in thermodynamics and statistical mechanics. Boltzmann's microscopic formula for entropy, \(S = k_B \ln w\), relates the entropy of a system to the number of microstates \(w\) it can occupy. When two independent systems, A and B, are each in their own thermodynamic state, the question arises of whether their entropies add. Because the systems are independent, every microstate of A can be paired with every microstate of B, so the number of microstates of the combined system is the product of the individual counts. Two observations make this precise:
- The counting rule \(w_{A+B} = w_A \cdot w_B\) mirrors the way probabilities of independent events multiply.
- Boltzmann's logarithmic definition turns this product into a sum, via the identity \(\ln(ab) = \ln a + \ln b\).
Thus, for the combined system, the entropy becomes \(S_{A+B} = k_B \ln(w_A w_B) = k_B (\ln w_A + \ln w_B) = S_A + S_B\). This sum confirms that, for independent systems, entropy is indeed an additive quantity, as thermodynamics requires.
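As a quick numerical check, here is a minimal Python sketch of the additivity argument. The microstate counts `w_A` and `w_B` are arbitrary illustrative values, not drawn from any physical system:

```python
import math

# Boltzmann constant in J/K (exact SI-defined value)
k_B = 1.380649e-23

def boltzmann_entropy(w: float) -> float:
    """Entropy S = k_B * ln(w) for a system with w accessible microstates."""
    return k_B * math.log(w)

# Hypothetical microstate counts for two independent systems A and B
w_A, w_B = 1e20, 1e30

S_A = boltzmann_entropy(w_A)
S_B = boltzmann_entropy(w_B)

# For independent systems the microstate counts multiply: w_{A+B} = w_A * w_B
S_AB = boltzmann_entropy(w_A * w_B)

print(f"S_A + S_B = {S_A + S_B:.6e} J/K")
print(f"S_(A+B)   = {S_AB:.6e} J/K")
assert math.isclose(S_AB, S_A + S_B)  # additivity holds to floating-point precision
```

The assertion passes because `math.log(w_A * w_B)` equals `math.log(w_A) + math.log(w_B)`, which is exactly the logarithm identity driving the derivation above.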