In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem with the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.

entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue in his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder, just as shoe size is correlated with reading ability (small shoe size suggests we’re dealing with a toddler, therefore a non-reader) without shoe size being reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So, in slight defense of Granville Sewell and the numerous other creationists, from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus, who equate entropy with disorder: when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and, unfortunately, one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger’s and Moran’s books aren’t on that list. Their books did, however, make a list of biochemistry texts judged by some Australian researchers to be decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy, as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates

There is also the Clausius definition:

delta-S = Integral (dq_rev / T)

where
delta-S = change in entropy
dq_rev = inexact differential of q (heat transferred reversibly)
T = absolute temperature
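
To make the two definitions concrete, here is a minimal Python sketch (an editorial illustration; the numerical inputs are assumed, not taken from the post):

```python
# Minimal sketch: evaluating the two textbook forms of entropy.
# All numbers below are illustrative assumptions.
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

# Boltzmann/Planck form: S = k log W (natural log)
W = 1e25                       # hypothetical number of microstates
S = k_B * math.log(W)
print(f"S = k ln W        -> {S:.3e} J/K")

# Clausius form for a reversible isothermal heat transfer: delta-S = q_rev / T
q_rev = 1000.0                 # heat absorbed reversibly, J (assumed)
T = 300.0                      # absolute temperature, K (assumed)
print(f"delta-S = q_rev/T -> {q_rev / T:.2f} J/K")
```

Notice that neither formula contains a term for order or disorder, which is the point of Elzinga’s challenge quoted above.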

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. walto,

    OTOH, keiths’ is absolutely unshifting only because he refuses to consider scenarios in which it makes no sense…

    What scenarios? If you have any scenarios in which the missing information interpretation fails, then by all means present them. That’s exactly what we’ve been waiting for!

    …or the possibility that it is only specific, entirely non-observer dependent information that matters if we settle on an ignorance theory.

    I’m happy to consider the possibility — but you have to make a case for it. What is the “specific, entirely non-observer-dependent information” you have in mind?

  2. : 2LoT Alert!

    The distinction between the energy of ordered motion and the energy of unordered motion is precisely the distinction which we have already attempted to make between energy classified as work and energy classified as heat. Our present view of the relation between entropy and probability we owe largely to the work of Boltzmann, who, however, himself ascribed the fundamental idea to Gibbs, quoting, “the impossibility of an uncompensated decrease of entropy seems to be reduced to an improbability.”

    – Lewis and Randall (1923)

  3. Mung,

    I’m still a bit puzzled about how something that is “observer-dependent” is not subjective.

    It’s because the observer’s state of knowledge regarding the system — the amount of knowledge that s/he possesses — is an objective quantity.

    If Horace knows how to solve quadratic equations, and Inigo doesn’t, then it’s objectively true that Horace can solve quadratics and that Inigo can’t. Objectively true, yet student-dependent.

    It’s the same for macrostates. Yolanda knows that there’s nothing but isotope X0 on one side of the partition and nothing but X1 on the other side. Xavier doesn’t know that; it’s all X to him.

    Each of them is objectively correct. Each possesses a different amount of information about the exact microstate, and that itself is an objective fact. Each computes a different entropy value.

  4. keiths: What scenarios? If you have any scenarios in which the missing information interpretation fails, then by all means present them. That’s exactly what we’ve been waiting for!

    Damon’s. The view that to one who knows everything, entropy is always zero is a reductio of any useful concept of entropy.

    Anyhow, first convince Jock, then Joe, then come back to me. I’m not likely to be convinced by hearing you repeat the same non-responsive stuff another fifty times.

  5. keiths: Mung:

    I try not to use the term Shannon entropy, because people confuse that with thermodynamic entropy. They are not the same.

    walto:

    Your ally here has insisted all thread that “entropy” has but one meaning, viz., missing information.

    I’ve been much more precise than that. For example:

    Entropy is a measure of missing information — the gap between the information associated with the macrostate and the information associated with the microstate.

    And:

    The missing information is the gap between what you know about the system — the macrostate — and its actual microstate at the moment. The size of that gap is the entropy, and so what the Second Law says is that the gap will (almost) always either remain the same or increase in size, with enormous probability.

    And:

    The missing knowledge is the difference between knowing the macrostate and knowing the microstate. When the temperature is absolute zero, there is no difference, and thus no missing knowledge, because there is only one possible microstate for that macrostate.

    …and so on, throughout the thread.

    Exactly. Just as I said. Maybe you should repeat it though! That will convince everybody!

  6. keiths: You’re asking for the impossible: a missing-information-based definition of entropy in which every potential observer, actual or hypothetical, possesses the same incomplete information about the system’s detailed state — no more, and no less.

    FWIW, that is nothing at all like what I suggested. But it did give you an opportunity to repeat your view!

  7. Mung:
    : 2LoT Alert!

    The distinction between the energy of ordered motion and the energy of unordered motion is precisely the distinction which we have already attempted to make between energy classified as work and energy classified as heat. Our present view of the relation between entropy and probability we owe largely to the work of Boltzmann, who, however, himself ascribed the fundamental idea to Gibbs, quoting, “the impossibility of an uncompensated decrease of entropy seems to be reduced to an improbability.”

    – Lewis and Randall (1923)

    I like that. Good quote.

  8. keiths: Suppose everyone knows the temperature and volume of a system containing an ideal gas, except for one observer who declines to measure the volume. S/he will have less information than the other observers.

    If the person does not know the volume, they cannot calculate the entropy change. In the same way, if they do not know the temperature(s), they do not know the thermodynamic system is in equilibrium, and they cannot calculate the entropy change.

    …in thermodynamics, we are discussing processes from one well-defined equilibrium state to another equilibrium state. – Ben-Naim (2016)

  9. Here’s a question:

    Why is it that the mixing of two different gases, with a positive change in entropy, is considered an irreversible process, whereas a “mixing” of the same gas, in which the entropy of the system does not change, is considered a reversible process?

  10. Mung,

    If the person does not know the volume, they cannot calculate the entropy change.

    Not so, and in any case you’re missing the point.

    Suppose I know that the temperature is 293 K. I don’t actually measure the volume, but I know by eyeballing it that it’s somewhere between one and two cubic meters.

    Those constraints constitute a macrostate. It isn’t a macrostate that a physicist or chemist would typically choose, true — but it’s a macrostate nonetheless. It limits the number of possible microstates, and it’s perfectly possible to calculate the entropy based on that.

    And if you’re uncomfortable with the idea of an unmeasured volume, then look at the Xavier/Yolanda/Damon case. They all know the temperature and they all know the volume, but each of them calculates a different entropy value.

    Xavier knows the temperature and volume and that there’s an equal amount of pure X gas on each side.

    Yolanda knows all of the above plus the fact that one chamber contains nothing but isotope X0 while the other contains nothing but isotope X1.

    Damon knows all of that and he knows the exact microstate.

    For Damon there is only one possible microstate — the one he already knows — and so W = 1. He plugs that into the Boltzmann equation and gets an entropy of zero.

    For Yolanda, W is huge, and for Xavier it’s huger still. They get different, nonzero values of entropy.

    You are arguing that entropy is not observer-dependent. If so, then who — if any of them — is right? And on what basis can you decide?
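
    For concreteness, a minimal sketch of the standard ideal-gas mixing arithmetic behind this disagreement (the one-mole amounts are assumed; this is an illustration, not a calculation from the thread):

```python
# Entropy of mixing for the Xavier/Yolanda setup.
# Assumptions: ideal gases, equal amounts (1 mol) of X0 and X1 on each side.
import math

R = 8.314         # gas constant, J/(mol K)
n_each = 1.0      # moles on each side of the partition (assumed)

# Yolanda's macrostate distinguishes isotopes X0 and X1, so removing the
# partition mixes two different gases:
#   delta-S = -n_total * R * sum(x_i * ln(x_i)), with x_0 = x_1 = 1/2
n_total = 2 * n_each
x0 = x1 = 0.5
dS_yolanda = -n_total * R * (x0 * math.log(x0) + x1 * math.log(x1))

# Xavier's macrostate treats both sides as the same gas X, so for him the
# macrostate is unchanged when the partition is removed.
dS_xavier = 0.0

print(f"Yolanda: delta-S = {dS_yolanda:.2f} J/K (= 2nR ln 2)")
print(f"Xavier:  delta-S = {dS_xavier:.2f} J/K")
```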

  11. keiths:

    If you have any scenarios in which the missing information interpretation fails, then by all means present them. That’s exactly what we’ve been waiting for!

    walto:

    Damon’s. The view that to one who knows everything, entropy is always zero is a reductio of any useful concept of entropy.

    Not at all, and I’ve explained this to you already. We are not Laplacean demons. We are humans, and we have no hope of knowing the exact microstate of even relatively simple thermodynamic systems. For us the entropy is nonzero, and it retains all of its usefulness.

    Also, the missing information interpretation works just fine in the case of Damon. It fits with the rather obvious fact that when we say that the microstates of an ideal gas are equiprobable, we are talking about epistemic probability. There’s only ever one microstate at a given moment. Its metaphysical probability is one, and the other microstates have a metaphysical probability of zero. Only when you look at epistemic probabilities do the Boltzmann and Gibbs equations make sense.

    Damon knows more than we do. For him, the epistemic probabilities match the metaphysical probabilities. For us, they don’t.

    He sees an entropy of zero. For us it’s nonzero.

  12. keiths:
    Mung,

    Not so, and in any case you’re missing the point.

    Suppose I know that the temperature is 293 K. I don’t actually measure the volume, but I know by eyeballing it that it’s somewhere between one and two cubic meters.

    Those constraints constitute a macrostate. It isn’t a macrostate that a physicist or chemist would typically choose, true — but it’s a macrostate nonetheless. It limits the number of possible microstates, and it’s perfectly possible to calculate the entropy based on that.

    And if you’re uncomfortable with the idea of an unmeasured volume, then look at the Xavier/Yolanda/Damon case. They all know the temperature and they all know the volume, but each of them calculates a different entropy value.

    Xavier knows the temperature and volume and that there’s an equal amount of pure X gas on each side.

    Yolanda knows all of the above plus the fact that one chamber contains nothing but isotope X0 while the other contains nothing but isotope X1.

    Damon knows all of that and he knows the exact microstate.

    For Damon there is only one possible microstate — the one he already knows — and so W = 1. He plugs that into the Boltzmann equation and gets an entropy of zero.

    For Yolanda, W is huge, and for Xavier it’s huger still. They get different, nonzero values of entropy.

    You are arguing that entropy is not observer-dependent. If so, then who — if any of them — is right? And on what basis can you decide?

    Say it again! Say it again! Louder! Louder!

  13. walto,

    Can you come up with a scenario in which the “entropy as missing information” interpretation fails?

  14. Mung,

    Why is it that the mixing of two different gases, with a positive change in entropy, is considered an irreversible process, whereas a “mixing” of the same gas, in which the entropy of the system does not change, is considered a reversible process?

    Reversibility and irreversibility are defined with respect to macrostates, so the isentropic mixing of indistinguishable gases really isn’t a process at all. The beginning and final macrostates are the same. There’s nothing to reverse.

    There is a change in macrostate when the gases are distinguishable.

    That should give you pause. Distinguishability is observer-dependent, as the Xavier/Yolanda/Damon experiment shows. If distinguishability is observer-dependent, then entropy is observer-dependent.

  15. keiths:
    walto,

    Can you come up with a scenario in which the “entropy as missing information” interpretation fails?

    If one doesn’t beg the question as you are wont to do, it fails for Damon when he considers my cat at room temp in my house.

    Anyhow, as you are the self-appointed expert on this matter (as on so many others), why not take a break from repeating your position and have a crack at answering alan’s questions?

  16. walto,

    I’m also interested in your response to my other question:

    walto:

    OTOH, keiths’ is absolutely unshifting only because he refuses to consider scenarios in which it makes no sense or the possibility that it is only specific, entirely non-observer dependent information that matters if we settle on an ignorance theory.

    keiths:

    I’m happy to consider the possibility — but you have to make a case for it. What is the “specific, entirely non-observer-dependent information” you have in mind?

  17. walto,

    If one doesn’t beg the question as you are wont to do, it fails for Damon when he considers my cat at room temp in my house.

    In what sense does it fail? Details, please.

  18. walto,

    Anyhow, as you are the self-appointed expert on this matter (as on so many others),

    I haven’t claimed to be an expert. That I understand this stuff really seems to bother you, though.

  19. walto,

    Just won’t answer those questions, will you?

    I know you’re eager to change the subject, but we’re discussing the relative merits of the dispersalist and missing information views of entropy.

    I’ve shown you that the dispersalist view is incorrect, and you’ve been unable to refute my arguments. Meanwhile, you haven’t come up with a single scenario in which the missing information interpretation fails.

    Why are you still a dispersalist?

  20. Think of it as the ‘dispersalist’ answer if you like. I mean, you’re gonna.

    Anyhow it’s more complete than your answer to Alan.

  21. walto:

    Think of it as the ‘dispersalist’ answer if you like. I mean, you’re gonna.

    Not if you actually supply a better one. Let’s start with this question:

    keiths:

    Can you come up with a scenario in which the “entropy as missing information” interpretation fails?

    walto:

    If one doesn’t beg the question as you are wont to do, it fails for Damon when he considers my cat at room temp in my house.

    keiths:

    In what sense does it fail? Details, please.

  22. walto,

    Anyhow it’s more complete than your answer to Alan.

    If you think you can provide an argument for dispersalism or against the missing information interpretation that somehow depends on my answers to Alan’s questions, then let’s see it. Otherwise they’re irrelevant.

  23. They’re not irrelevant to what entropy is and whether the term is univocal. That’s actually the topic of this thread, your attempts to derail it notwithstanding.

  24. I say that entropy is univocal when applied to thermodynamic systems when entropy is taken to be a special case of the Shannon measure of information.

    I say that whether entropy is univocal in all the ways it’s been defined in classical thermodynamics is not at all clear.

    Sort of like all the different formulations of the second law of thermodynamics.

  25. walto,

    You, Sal and I have been debating the relative merits of the dispersalist and missing information views of entropy for days. You have been full-throated in your defense of dispersalism and in your disdain for the MI view. I can see why you might want to change the subject at this point — your position is clearly untenable.

    If you can come up with an argument for dispersalism, or against the missing information interpretation, that somehow depends on knowing my answers to Alan’s questions, then I will happily answer them. Otherwise, they’re irrelevant.

    Meanwhile, please explain to us how this amounts to a failure of the MI interpretation:

    If one doesn’t beg the question as you are wont to do, it fails for Damon when he considers my cat at room temp in my house.

    It’s your assertion. Back it up.

  26. If keiths is attempting to derail the thread I think the mods should be warned so they can be on the lookout for an actual derailment. Or does potential derailment v. actual derailment really matter? The rule is really rather vague on that.

  27. keiths: You, Sal and I have been debating the relative merits of the dispersalist and missing information views of entropy

    Maybe that’s what YOU’VE been debating. For my own part, I have simply expressed concerns and questions about your information view, largely because you’re so wedded to it. You have preferred not to answer those questions and concerns, but to attack the dispersalist position instead. You have tried to make me the chief defender of the dispersalist view, for some weird reason, (maybe because you love your six points more than life itself and there’s nobody else around who wants to talk to you? Dunno). I have pointed out–what should be obvious even to babes in the woods–that two of those beloved six points simply beg the question in favor of your view by repeating it, and that one of your favorite arguments for the information view is either inconsistent with the proposition that entropy is always increasing or is a concept that is utterly useless for the physical sciences, and you have responded by simply repeating your view a couple of dozen more times.

    That’s been this thread in a nutshell. Congrats.

    Would make a person suspicious. I mean, if one leaned in that direction.

    BTW, have you convinced Jock or Joe yet? I think you’ve got pedant and maybe mung, but petrushka’s last remark didn’t make a ton of sense, so you may have lost him when you lost Jock. Keep posting, baby!

  28. Mung: I say that entropy is univocal when applied to thermodynamic systems when entropy is taken to be a special case of the Shannon measure of information.

    Hah. It’s univocal just in case it’s used in one particular way. Good.

  29. keiths: That should give you pause. Distinguishability is observer-dependent, as the Xavier/Yolanda/Damon experiment shows.

    And I say the particles are distinguishable or they are not distinguishable, regardless of the observer. To believe that the particles are not distinguishable, if they are in fact distinguishable, is to hold an erroneous belief.

    Now perhaps you could explain how distinguishability enters into the Boltzmann equation. How does whether or not one believes the particles to be indistinguishable impact the actual probability distribution?

    You do not claim that the actual physical distribution of the particles is observer-dependent, do you? That’s where I believe you go wrong, and it’s a point I’ve raised earlier that I don’t think you ever responded to.

    Given the differences between the two cases there is an objective probability distribution that obtains regardless of observer. Do you disagree?

  30. walto: Hah. It’s univocal just in case it’s used in one particular way. Good.

    Yes. 🙂

    Entropy is a well-known concept, and what it applies to is a well-known concept. I think it’s unfortunate that Shannon used the exact same term and just muddied the waters. I like how Ben-Naim is consistent in referring to Shannon’s concept as SMI and keeping it distinct from entropy.

    Also, I’ve been in the “entropy as information” camp for a number of years and had no idea keiths held similar views. As such I was probably already predisposed against the “dispersalist” position. I’m also predisposed to doubt anything Salvador says about entropy and the second law, lol!

    But I think I’ve evaluated keiths’ arguments about dispersion independently of what you or Sal thought. I don’t care if you’re a “dispersionist” as long as you don’t try to impose your view on others. 😉

  31. Mung: I don’t care if you’re a “dispersionist” as long as you don’t try to impose your view on others.

    Hah. I’m neither dispersionist, dispersalist, nor disperado. I’m just not convinced you can completely strip entropy of that concept and still have it useful for thermodynamics.

    walto: Hah. I’m neither dispersionist, dispersalist, nor disperado. I’m just not convinced you can completely strip entropy of that concept and still have it useful for thermodynamics.

    The concept of dispersion of energy? IMHO, thermodynamics (and entropy) got along fine without that concept for quite some time. I quoted Guggenheim as the one who perhaps first introduced that interpretation of entropy.

    Wikipedia claims it was Kelvin.

    https://en.wikipedia.org/wiki/Entropy_(energy_dispersal)

    But:

    This article’s factual accuracy is disputed. Please help to ensure that disputed statements are reliably sourced.

  33. Walto tries to distance himself from dispersalism:

    I’m neither dispersionist, dispersalist, nor disperado.

    That’s odd. He was still a dispersalist when he wrote this:

    His [Damon’s] mistake is your mistake, that of thinking that by “entropy” we were looking for a measurement of ignorance rather than of dispersal.

    I wonder why he changed his mind, and why he didn’t mention it to us? (Those are rhetorical questions.)

  34. walto,

    You have tried to make me the chief defender of the dispersalist view, for some weird reason…

    No, I’ve addressed my arguments to both you and Sal. When Sal bailed, you were the only one left defending dispersalism, so of course I addressed my arguments to you at that point.

    I have pointed out–what should be obvious even to babes in the woods–that two of those beloved six points simply beg the question in favor of your view by repeating it…

    And each time you make that claim I explain why it is incorrect. Physicists and chemists, including Lambert, agree with me on the necessity of taking distinguishability into account when computing the entropy change caused by the mixing of two gases. They also agree on the validity of Boltzmann’s and Gibbs’s equations.

    Keep in mind that a couple days ago you actually believed that it was Boltzmann against the chemists:

    One side has Boltzmann’s equation, the other has working chemists. And so the fight continues.

    It’s not. Everyone (except maybe you) accepts the Boltzmann and Gibbs equations.

    …and that one of your favorite arguments for the information view is either inconsistent with the proposition that entropy is always increasing or is a concept that is utterly useless for the physical sciences, and you have responded by simply repeating your view a couple of dozen more times.

    No, I’ve explained why it’s the other way around:

    walto:

    Again, this is the reductio of your position. Your Laplacian genius can’t tell the entropy of any state from any other. It makes the entire concept useless

    keiths:

    It’s exactly the opposite. It shows that your position is absurd and that it renders entropy comparisons useless.

    Here’s why. According to you, Xavier and Yolanda both calculate the wrong entropy value, but Yolanda’s value is closer to being correct because she has more information than Xavier — she knows about the isotope distribution, while he doesn’t. By that reasoning, Damon’s answer is better than both Xavier’s and Yolanda’s, because he has more information than either of them. In fact, his information is perfect — he knows the exact microstate. No observer could do better. Therefore, by your logic, Damon’s entropy value — zero — is the correct value.

    Your reasoning leads to the (faulty) conclusion that entropy comparisons are useless, because the correct value is always zero. In my scheme, entropy comparisons are useful because unlike Damon, we never have perfect information about the microstate. Entropy values are nonzero for us.

    That was a pretty good foot-shot, walto, but not as good as Sal’s.

  35. Mung,

    And I say the particles are distinguishable or they are not distinguishable, regardless of the observer.

    That gets you into trouble. To a super-observer like Damon, all the molecules are distinguishable. He knows the position and momentum of each one and can track its motion. He never confuses one molecule with another.

    If you count the number of microstates assuming perfect distinguishability, à la Damon, you won’t get the same answer as the physicists and chemists, and your entropy value will differ accordingly.

    Now perhaps you could explain how distinguishability enters into the Boltzmann equation. How does whether or not one believes the particles to be indistinguishable impact the actual probability distribution?

    There’s a huge impact. The number of microstates goes up by a factor of N! when the molecules are individually distinguishable, as they are for Damon (where N is the number of molecules). When N is an Avogadro-sized number, N! is staggering.

    The probability distribution remains flat, however.
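
    To get a feel for the size of that N! factor, here is a minimal sketch (an editorial illustration; the only inputs are Boltzmann's constant and Avogadro's number):

```python
# Treating N identical molecules as individually distinguishable multiplies W
# by N!, which adds k*ln(N!) to S = k ln W. How big is that for one mole?
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K
N = 6.022e23         # one mole of molecules

ln_N_factorial = math.lgamma(N + 1)            # ln(N!) without overflow
print(f"ln(N!)   ~ {ln_N_factorial:.3e}")
print(f"k ln(N!) ~ {k_B * ln_N_factorial:.0f} J/K per mole")

# Stirling's approximation, ln(N!) ~ N ln N - N, gives the same leading value:
print(f"Stirling ~ {k_B * N * (math.log(N) - 1):.0f} J/K per mole")
```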

    You do not claim that the actual physical distribution of the particles is observer-dependent, do you? That’s where I believe you go wrong, and it’s a point I’ve raised earlier that I don’t think you ever responded to.

    Are you really asking about the physical distribution of the particles, or did you mean to ask about the probability distribution of the microstates? Or something else?

  36. Mung, I think that if we forget about quantum indeterminacy, and you believe all of the following:

    1. that entropy is largely a count of available microstates;
    2. that the “availability of microstates” is a probability distribution;
    3. that probabilities are “objective” in the sense of not being relative to anybody’s (or everybody’s) knowledge of facts; and
    4. that determinism is true;

    then the entropy of every macrostate will be zero. You will, that is, have described the world according to keiths’ Damon.

    keiths thinks that the problem of every macrostate having an objective entropy quantity of zero is solved by making the probabilities (or available microstates) relative to what somebody knows; in other words, by making the quantity of “order” relative to a concept of order and knowledge (or, I guess knowledge*) of what’s going on in the macrostate under investigation. If your sense is that this violates our intuition that increasing entropy is an objective property of our physical world, I agree with you.

    As I indicated above, to satisfy intuitions that the amount of entropy of a macrostate is both objective and not always zero (and to also accommodate some of the problems with the dispersal view that have been pointed out on this thread), I think one has to take entropy as a mixed bag.

    I tried to capture that above (in a bit of a hand-wavy fashion) when I wrote this:

    walto: Suppose (this is just a hypothetical!) that the only relevant missing information were to involve energy dispersal with respect to a non-observer-relative perspective. What I mean is that the amount that can be missing from any perspective is defined in such a way that it must, e.g., vary directly with temperature. And the definition entails that there can be no perspective according to which the missing information can be zero unless certain physical properties obtain. (That’s to eliminate the Damon reductio.)

    If that were the case, wouldn’t this whole brouhaha just be a quibble? Is it strictly a measure of missing info? Yes. Is the measurement also necessarily related to energy dispersal? Yes.

    Now, I happily concede that I am not competent to provide the details here, but while I could be wrong, my impression is that it wouldn’t be that hard for those who know the subject well–and has likely already been done by the conversion of Boltzmann’s equation into the equations that Sal and the scientists are using to do their measurements. (As I’ve mentioned, one conversion requires the use of degrees of freedom.)

    The point is that one can concede that Jock and keiths are correct that there are some problems with the dispersion view, and that the three of you are right that there are merits to looking at entropy as a measure of available information, without throwing the baby out with the bathwater.

    walto: As I indicated above, to satisfy intuitions that the amount of entropy of a macrostate is both objective and not always zero (and to also accommodate some of the problems with the dispersal view that have been pointed out on this thread), I think one has to take entropy as a mixed bag.

    Or (I think) you could assume the incarnation. An objective perspective that is also limited in knowledge.

    peace

  38. Fmm, I know my worries about you are increasing, whatever may be happening in the entropy department.

  39. walto: then the entropy of every macrostate will be zero.

    Determinism implies that the universe can be viewed as static. Oddly enough, some theistic views agree.

  40. Walto:

    Mung, my sense is that Sal’s definition is shifting because the concept is both ambiguous and, in some sense, determined by its history and Boltzmann “ownership”, but he’s been trying to produce a single definition that handles all the cases.

    Hi Walto:

    The most accurate and comprehensive definition of entropy is Boltzmann’s, which came years after Clausius:

    S = k ln W

    where W is the number of microstates, but what a microstate is, is non-trivial.

    Prior to Boltzmann, entropy was indirectly defined by Clausius:

    dS = dq (reversible)/T

    The Clausius definition dominates scientific and engineering practice because it is relatively easy to calculate, as shown by the exam question for a melting ice cube. Let someone try to do this with the Boltzmann definition, and they are asking for a major headache.

    “Energy dispersal” is a metaphor. It is not rigorous, it is a pedagogical qualitative description to help make sense of the Clausius definition.

    The ignorance approach will proceed from the Boltzmann definition. Let Keiths try to calculate entropy change for a melting ice cube as was done on a basic chemistry exam as I provided from a real university. Using Boltzmann is non-trivial.

    To be conversant with Boltzmann microstates, one needs to understand the six-dimensional phase space where microstates are defined, and to understand six-dimensional phase space, one needs to understand things like the Liouville theorem.

    https://en.wikipedia.org/wiki/Liouville%27s_theorem_(Hamiltonian)

    A chem student doesn’t need those headaches! I provided links to how he can calculate what he needs for his assignments.

    There is an old saying, “if you want to understand entropy, calculate it.”

    “Energy dispersal” is the dumbed-down version I recommend for understanding. It is non-rigorous, but I think it is a heck of a lot easier to describe entropy of melting ice-cube or chemical reactions in terms of energy than ignorance.

    Keiths, Mung, and DNA_jock need to show how they apply their approaches and conceptions of entropy to a melting ice cube.

    Using the Clausius (energy) approach is much easier than using the Boltzmann (counting microstates) approach.

    Example:

    Energy in melting the ice cube: 6660 Joules
    Temperature: 273 K

    Entropy change = 6660 J / 273 K = 24.4 J/K

    Simple!
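
    As a quick sanity check of that arithmetic, a short Python sketch (the ~334 J/g latent heat used to back out the implied mass is a standard value, not a figure from this thread):

```python
# Clausius entropy change for ice melting at its melting point: delta-S = q_rev / T
q_melt = 6660.0   # heat absorbed by the ice, J (figure from the example above)
T_melt = 273.0    # melting point of ice, K

print(f"delta-S = {q_melt / T_melt:.1f} J/K")   # ~24.4 J/K, as stated

# Aside: assuming a latent heat of fusion of ~334 J/g, 6660 J corresponds
# to roughly a 20 g ice cube.
print(f"implied ice mass ~ {q_melt / 334.0:.0f} g")
```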

    I invite my detractors to calculate entropy change using Boltzmann’s definition and ignorance and information theory approaches.

    As far as practical college-level chemistry is concerned, the discussion of information theory is not of immediate importance for students computing entropy change.

  41. Big thanks to Joe and Walto

    I find it most amusing that Sal’s original OP thesis is contradicted by the very sources he cites:

    http://web.mit.edu/16.unified/www/FALL/thermodynamics/notes/node54.html

    Joe’s and Walto’s comments have caused me to engage in some google-whacking.

    ITMT, I am particularly grateful for Joe’s citation of http://entropysite.oxy.edu/shuffled_cards.html

    If I am understanding correctly, this long exchange revolves around the resolution of the so-called “Gibbs Paradox”.

    I offer a relevant link that is somewhat readable for laymen.

    https://en.wikiversity.org/wiki/Statistical_thermodynamics#Calculating_the_statistical_entropy_for_a_classical_ideal_gas

    which refers to
    http://bayes.wustl.edu/etj/articles/gibbs.paradox.pdf

    a relevant quote from the wikiversity link:

    Are there actually two Gibbs paradoxes? Should they be called “paradoxes”? I would say yes and yes. Gibbs addressed both the mixing-of-very-similar-gasses paradox in thermodynamics, and the mixing-of-identical-gases extensivity paradox in statistical mechanics. These are two different but closely related things, and there very well may be two Gibbs paradoxes, as discussed in the article. Gibbs did not see either as an actual problem, and he never called them paradoxes, yet he did remark on them as possible places where one could make a logical error. In some sense then they really are paradoxes, along the lines of the two-envelope paradox and the Monty Hall paradox: things that can seem on the surface mysterious or counterintuitive yet which have deep down a well-defined, consistent answer.

    back to lurk mode

  42. Hey, Sal, welcome back!
    Glad to see you write:

    The most accurate and comprehensive definition of entropy is Boltzmann’s which came years after Clausius…
    “Energy dispersal” is a metaphor. It is not rigorous, it is a pedagogical qualitative description to help make sense of the Clausius definition…
    The ignorance approach will proceed from the Boltzmann definition.

    Yup.

    Using Boltzmann is non-trivial.

    As I noted.

    “Energy dispersal” is the dumbed-down version [Sal] recommend[s] for understanding. It is non-rigorous…

    So, Sal, we really agree that if you actually want to understand entropy, then you need Boltzmann. If you merely want to plug an entropy term into an engineering calculation, then you can use whichever equations are easiest to manipulate (so long as your understanding is sufficient to spot when those equations might lead you astray…).
    Kinda like using Newtonian mechanics: wrong, but close enough for Apollo, if not for GPS.

    You left without answering my questions about your calculation of the entropy of 1555 grams of copper = 812.42 J/K.
    One of your inputs was the specific heat of copper = 0.39 J/gram/K.
    Do tell:
    1) Do you normally quote results to 5 sig figs, when one of your inputs has 2 sig fig precision?
    2) How was that 0.39 J/g/K value derived? Experimentally?
    3) Is it constant?
    I would encourage you to think about the implications of your answers to 2 and 3.
    To claim that your Classical Thermodynamics High School homework is anything more than mere phenomenology, you are going to have to derive the specific heat of copper.
    🙂
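
    To see why questions 2 and 3 have teeth, here is a minimal sketch of where a temperature-dependent specific heat comes from and how it feeds the entropy integral S(T) = ∫ C(T)/T dT. It is an editorial illustration, not DNA_jock’s calculation: the Debye temperature for copper (about 343 K) is an assumed textbook value, C_p ≈ C_v is a simplification, and the electronic contribution is ignored.

```python
# Why "is the specific heat constant?" matters: absolute entropy is the
# integral of C(T)/T from ~0 K up to T, and C itself varies with temperature.
# C_v(T) is approximated here with the Debye model.
import math

R = 8.314          # gas constant, J/(mol K)
THETA_D = 343.0    # Debye temperature of copper, K (assumed textbook value)
M_CU = 63.546      # molar mass of copper, g/mol

def debye_cv_molar(T, n_steps=2000):
    """Debye-model molar heat capacity, J/(mol K); ignores electrons, C_p ~ C_v."""
    if T <= 0.0:
        return 0.0
    x_max = THETA_D / T
    dx = x_max / n_steps
    total = 0.0
    for i in range(n_steps):
        x = (i + 0.5) * dx              # midpoint rule
        if x > 50.0:                    # integrand ~ x^4 e^-x here, negligible
            continue
        total += x**4 * math.exp(x) / (math.exp(x) - 1.0)**2 * dx
    return 9.0 * R * (T / THETA_D)**3 * total

def entropy_molar(T_final, dT=1.0):
    """S(T_final) = integral of C(T)/T dT from ~0 K, J/(mol K)."""
    S, T = 0.0, dT / 2.0
    while T < T_final:
        S += debye_cv_molar(T) / T * dT
        T += dT
    return S

print(f"c(298 K) ~ {debye_cv_molar(298.0) / M_CU:.2f} J/(g K)")  # near the quoted ~0.39
print(f"c( 50 K) ~ {debye_cv_molar(50.0) / M_CU:.2f} J/(g K)")   # far smaller: not constant
print(f"S(298 K) ~ {entropy_molar(298.0):.0f} J/(mol K) per mole of Cu")
```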
