What is Entropy?
Most recent answer: 07/24/2016
- Jack (age 20)
Boston, MA, USA
Great question. Those systems (a tube of gas, a block of copper, ...) have lots of little pieces. The sort of description used in thermodynamics specifies only a few properties of the system, such as its energy, volume, etc. That leaves an enormous number of properties of the individual atoms unspecified. It turns out that, thanks to quantum mechanics, there's actually a discrete list of possible states of all those pieces, not just a continuum. How many states are consistent with your macroscopic thermodynamic description? Entropy is just the log of that number of states! We use logs because we want the entropy of two systems to be the sum of their separate entropies. The number of states of the combined system is just the product of the two separate numbers (if they're independent; think of dice), and the log of a product is the sum of the logs. So the total entropy is the sum of the entropies of the independent parts.
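To make the counting concrete, here's a toy sketch in Python (the dice and the specific numbers are just an illustration, in units where Boltzmann's constant is 1):

    import math

    def entropy(num_states):
        """Entropy of a system with num_states equally likely microstates,
        in units where Boltzmann's constant k = 1: S = ln(num_states)."""
        return math.log(num_states)

    # One die has 6 possible states; two independent dice have 6 * 6 = 36.
    S_one_die = entropy(6)
    S_two_dice = entropy(6 * 6)

    # Additivity: the entropy of the combined system equals the sum of the
    # parts, because log(a * b) = log(a) + log(b).
    print(S_two_dice)              # ~3.5835
    print(S_one_die + S_one_die)   # same number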
(Due to an unfortunate historical accident, the usual thermodynamic entropy is given as Boltzmann's constant k times this log. In physics we often use nice units in which that constant is 1. The ideas are the same in any units. We also usually need a slightly fancier version of the definition to allow for different probabilities of being in different quantum states. But that's probably too much information at this point.)
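(For the curious, that fancier version is the standard Gibbs form, written here in textbook notation rather than anything from the original answer:

    S = -k \sum_i p_i \ln p_i

where p_i is the probability of finding the system in quantum state i. If all \Omega allowed states are equally likely, so each p_i = 1/\Omega, this reduces to the simple counting version above, S = k \ln \Omega.)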
So this definition of entropy allows us to talk about the absolute entropy of some system, not just how much it changes as it goes from one condition to another. The entropy goes to zero only if the system is definitely in a single quantum state, since log(1)=0. That would happen as the temperature approached absolute zero, but the temperature can't ever quite get down to zero.
How does this connect to dS=dQ/T? That turns out to be basically just a definition of T, with the understanding that dQ is the heat flow into the system while it stays in thermal equilibrium. At low T, S grows a lot as heat flows in; at high T, less so. Since the basic principle of statistical physics is that entropy grows over time, heat flows from high-T regions, where losing it doesn't cost much S, to low-T regions, where gaining it makes more S.
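To spell out that last step with the formula above (T_hot and T_cold are just labels for the two temperatures): when a little heat dQ leaves a hot region and enters a cold one, the total entropy change is

    dS_total = dQ/T_cold - dQ/T_hot > 0    whenever T_hot > T_cold

so flowing from hot to cold is exactly the direction in which the total entropy grows.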
Mike W.
(published on 07/24/2016)