What is Entropy?

Most recent answer: 07/24/2016

Q:
Hi everyone, First of all, thank you very much for this amazing website. I have a question regarding entropy from a macroscopic point of view. Entropy is a property of a system defined as dS = dQ/T. How do you calculate the entropy of something? I only found examples to calculate a change in entropy. For example, you can calculate the kinetic energy of something directly with 1/2mv². I thought I could replace the initial state by one where the entropy is equal to zero, which should give me the entropy of the system. However, based on dS = dQ/T, how can you find dQ? Let us assume that we want to know the entropy of an isolated system. Since there is no energy transfer by heat, does that mean the entropy is equal to zero? In addition, entropy is said to be a measure of disorder or a measure of the quality of the stored energy. How can you get that from a macroscopic point of view (and particularly from dS = dQ/T)? Sorry for the long question, but I have no idea where I can find those answers except here. Thank you once again. Jack
- Jack (age 20)
Boston, MA, USA
A:

Great question. Those systems (a tube of gas, a block of copper, ...) have lots of little pieces. The sort of description used in thermodynamics specifies only a few properties of the system, such as its energy, volume, etc. That leaves an enormous number of properties of the individual atoms etc. unspecified. It turns out that, thanks to quantum mechanics, there's actually a discrete list of possible states of all those things, not just a continuum. How many states are consistent with your macroscopic thermodynamic description? Entropy is just the log of that number of states! We use logs because we want the entropy of two systems to be the sum of their separate entropies. The number of states is the product of the two separate numbers (if they're independent; think of dice), and the log of a product is the sum of the logs. So the total entropy is the sum of the entropies of the independent parts.
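Here's a toy illustration of that additivity in Python, with dice standing in for microstates and Boltzmann's constant set to 1 (the numbers belong to the dice example only, not to any real physical system):

```python
import math

# One six-sided die has 6 possible "microstates"; two independent dice
# have 6 * 6, since counts of independent systems multiply.
W_one = 6
W_two = 6 * 6

# Entropy as the log of the number of states (k = 1 units): S = ln(W)
S_one = math.log(W_one)
S_two = math.log(W_two)

# log(W1 * W2) = log(W1) + log(W2), so entropies of independent parts add:
print(S_two)          # 3.5835...
print(S_one + S_one)  # same value
```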

(Due to an unfortunate historical accident, the usual thermodynamic entropy is given as Boltzmann's constant k times this log. In physics we often use nice units in which that constant is 1. The ideas are the same in any units. We also usually need a slightly fancier version of the definition to allow for different probabilities of being in different quantum states. But that's probably too much information at this point.)
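For readers who do want that fancier version: it's the Gibbs form, S = -k Σᵢ pᵢ ln pᵢ, which reduces to the plain log of the number of states when all states are equally likely. A minimal Python sketch, again in k = 1 units:

```python
import math

def gibbs_entropy(probs):
    """S = -sum(p * ln p) with k = 1; states with p = 0 contribute nothing."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Equal probability over W states recovers S = ln(W):
W = 6
print(gibbs_entropy([1.0 / W] * W))  # 1.7917... = ln(6)
print(math.log(W))                   # same value

# A system definitely in one single state has S = log(1) = 0:
print(gibbs_entropy([1.0]))          # 0.0
```

The last line anticipates the point below: entropy hits zero only when the system is certainly in a single state.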

So this definition of entropy allows us to talk about the absolute entropy of some system, not just how much it changes as it goes from one condition to another. The entropy goes to zero only if the system is definitely in a single quantum state, since log(1) = 0. That would happen as the temperature approached absolute zero, but the temperature can't ever quite get all the way down to zero.

How does this connect to dS = dQ/T? That turns out to be basically just a definition of T (equivalently, 1/T = dS/dQ), with the understanding that dQ is the heat flow into the system as it stays in thermal equilibrium. At low T, S grows a lot as heat flows in; at high T, less so. Since the basic principle of statistical physics is that entropy grows over time, heat flows from high-T regions, where losing it doesn't cost much S, to low-T regions, where gaining it makes more S.
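To see that last step explicitly: if a small amount of heat dQ leaves a region at temperature T_H and enters one at temperature T_C, the total entropy change is

```latex
dS_{\text{total}} = -\frac{dQ}{T_H} + \frac{dQ}{T_C}
                  = dQ\left(\frac{1}{T_C} - \frac{1}{T_H}\right),
```

which is positive whenever T_H > T_C. So heat flowing from hot to cold is exactly the direction in which the total entropy grows.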

Mike W.


(published on 07/24/2016)