What is entropy? - Jeff Phillips
1. Disorder is a subjective description that has no rigorous definition.
2. Perhaps more importantly, disorder is usually taken to be a comment on the arrangement of a single microstate, yet entropy is defined in terms of all of the microstates for an energy configuration. Frank Lambert has written many articles on interpreting the meaning of entropy.
So, if entropy is not disorder, what is it? The formal definition offered by Ludwig Boltzmann (and later written on his tombstone) is S = k_B ln W, where S is the entropy of the system in a particular energy configuration, k_B = 1.380 × 10⁻²³ J/K is Boltzmann's constant, and W is the number of microstates for that energy configuration, or macrostate (ln denotes the natural logarithm). While this is the formal definition, it doesn't provide us with much physical intuition. It helps instead to think about entropy as a measure of the spread of energy. To see this in more detail, consider reading this paper and performing some of the interactive demonstrations therein.
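Boltzmann's formula is easy to sketch in code. Here is a minimal example (the function name and the sample values of W are mine, chosen only for illustration):

```python
from math import log

K_B = 1.380649e-23  # Boltzmann's constant, in J/K

def boltzmann_entropy(W: int) -> float:
    """Boltzmann entropy S = k_B * ln(W) for a macrostate
    with W microstates."""
    return K_B * log(W)

# A macrostate with only one microstate has zero entropy (ln 1 = 0),
# and entropy grows with the number of microstates.
print(boltzmann_entropy(1))
print(boltzmann_entropy(100) > boltzmann_entropy(10))
```

Because S depends on W only through a logarithm, doubling the number of microstates adds a fixed amount k_B ln 2 to the entropy rather than doubling it.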
Essential to the Boltzmann definition of entropy is that energy is quantized. The fact that energy comes in integer multiples of a quantum makes the number of microstates countable. If each bond of our Einstein solid could have any value of energy (such as 1.25 or 345.8461 quanta), then there would be an infinite number of microstates for any energy configuration. Our analysis works because there is a finite number of microstates for each energy configuration. This is yet another piece of evidence that our universe is governed by quantum, rather than classical, physics.
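This countability can be made concrete. For an Einstein solid with N oscillators (bonds) sharing q quanta, the number of microstates is given by the standard "stars and bars" counting formula W = C(q + N − 1, q). A short sketch (function name is mine):

```python
from math import comb

def multiplicity(N: int, q: int) -> int:
    """Number of microstates of an Einstein solid with N oscillators
    sharing q energy quanta: W = C(q + N - 1, q), by stars and bars."""
    return comb(q + N - 1, q)

# 3 oscillators sharing 2 quanta: 6 microstates
# (200, 020, 002, 110, 101, 011)
print(multiplicity(3, 2))  # → 6
```

Note that this count only exists because q must be a whole number; if the quanta could take any real value, there would be no finite list of microstates to enumerate.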
Thomas Moore has developed a web application that models our two Einstein solid system. It lets the user explore how, as the size of the system increases, the most likely energy configuration becomes increasingly probable relative to the less likely configurations.
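The same trend can be sketched directly, without the application. The code below (my own illustration, not Moore's app) considers two Einstein solids of N oscillators each sharing q_total quanta, and computes the fraction of all microstates whose energy split falls within 10% of an equal share. As N grows, that fraction approaches 1, so near-equal energy sharing becomes overwhelmingly likely:

```python
from math import comb

def multiplicity(N: int, q: int) -> int:
    """Microstates of an Einstein solid: W = C(q + N - 1, q)."""
    return comb(q + N - 1, q)

def fraction_near_equal(N: int, q_total: int, tol: float = 0.1) -> float:
    """Fraction of all microstates of two coupled Einstein solids
    (N oscillators each, q_total shared quanta) in which solid A's
    quanta lie within tol * q_total of an equal split."""
    counts = [multiplicity(N, qA) * multiplicity(N, q_total - qA)
              for qA in range(q_total + 1)]
    center = q_total / 2
    near = sum(c for qA, c in enumerate(counts)
               if abs(qA - center) <= tol * q_total)
    return near / sum(counts)

# The near-equal splits dominate more and more as the system grows.
for N in (5, 50, 500):
    print(N, fraction_near_equal(N, 2 * N))
```

This is the statistical heart of the second law: for large systems, configurations that spread energy nearly evenly account for essentially all microstates, so energy flow toward such configurations is not forbidden by the dynamics, merely overwhelmingly probable.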