In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue as he criticizes ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (i.e., a small shoe size suggests we’re dealing with a toddler, and therefore a non-reader), but shoe size is not reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger’s and Moran’s books aren’t on that list. Their books did, however, make the list of biochem books judged by some Australians as decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates
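To make the formula concrete, here is a minimal Python sketch (the microstate counts are made-up values chosen only for illustration):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """S = k ln W, where W is the number of microstates."""
    return k_B * math.log(W)

# Illustrative microstate counts, chosen only to show the scaling:
for W in (1, 2, 10**6):
    print(W, boltzmann_entropy(W))
# W = 1 gives S = 0; doubling W adds k_B * ln 2 to S.
```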

Also there is Clausius:

delta-S = Integral (dq_rev / T)

where
delta-S = change in entropy
dq_rev = inexact differential of heat transferred reversibly
T = absolute temperature
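And a small sketch of the Clausius form, assuming a simple reversible process (heating 1 mol of a monatomic ideal gas at constant volume, so dq_rev = C_v dT):

```python
import math

R = 8.314        # gas constant, J/(mol K)
C_v = 1.5 * R    # molar heat capacity at constant volume, monatomic ideal gas

T1, T2 = 300.0, 600.0   # heat the gas reversibly from 300 K to 600 K
steps = 100_000
dT = (T2 - T1) / steps

# delta-S = Integral(dq_rev / T), approximated as a sum over small steps
delta_S = sum(C_v * dT / (T1 + (i + 0.5) * dT) for i in range(steps))

print(delta_S)                   # numerical value of the integral
print(C_v * math.log(T2 / T1))   # closed form C_v ln(T2/T1), ~8.64 J/K
```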

As Dr. Mike asked rhetorically: “Where are the order/disorder and ‘information’ in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. And just to spell it out, differences in knowledge lead different observers to plug different values of W into Boltzmann’s formula, leading to different values of entropy.
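A toy sketch of that point, assuming two observers whose knowledge of the same system is compatible with different numbers of microstates:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def S(W):
    return k_B * math.log(W)   # Boltzmann: S = k ln W

# Hypothetical microstate counts for one and the same system:
W_less_informed = 10**20        # compatible with this observer's knowledge
W_more_informed = 10**20 // 2   # extra knowledge rules out half the microstates

print(S(W_less_informed) - S(W_more_informed))   # difference is k_B * ln 2
```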

  2. Keiths:

    No, that’s your problem.

    More on this after I finish my response(s) to walto.

    Keiths,

    Nothing personal here Keiths, but you’re wrong and belligerent. So is DNA_jock.

    I backed you in your KF tackles transfinite thread, but I can’t let you get away with saying I didn’t understand something when the problem is that you are misreading phrases and trying to push your misreading onto me as some sort of refutation. I’ll have to take pretty sharp exception to what you and DNA_jock are doing in this thread.

    When you’re right, I’ll agree, and I have in the past, but not this time because you’re wrong.

    First off, is there any disagreement that in the diagram below the energy in the pink molecules is dispersed after the valve is opened? Better not be!

    The diagram below shows an isothermal expansion of pink molecules. So, there should not be any objection to saying the energy dispersal description of entropy applies to the case of isothermal expansion of the pink species of ideal gas.

    For the reader’s benefit, I will calculate the entropy change in the isothermal expansion of 1 mol of pink molecules at 300 K as in the diagram below. We can call this entropy change “expansion entropy”.

    Assuming the change in volume accessible to the pink gas is from 1 to 2 cubic meters.

    Specifically for this case, the change in entropy (the “expansion entropy”) is:

    delta-S_expansion_pink_molecules = N kB ln 2

    based on the formula you can find here regarding “free expansion”
    http://web.mit.edu/16.unified/www/SPRING/propulsion/notes/node40.html

    where
    N = number of molecules
    kB = Boltzmann’s constant

    delta-S_expansion_pink_molecules = (1 mol) (Avogadro’s number) (1.38 x 10^-23 J/K) (ln 2)

    = (1 mol) (6.022 x 10^23 / mol) (1.38 x 10^-23 J/K) (ln 2)

    ≈ 5.76 J/K

    You got any problem with that? That’s entropy that can be described by Energy Dispersal.

    delta-S_expansion_pink_molecules ≈ 5.76 J/K

    We can do the same if we initially had blue molecules in the right chamber and a vacuum in the left chamber.

    delta-S_expansion_blue_molecules ≈ 5.76 J/K

    That entropy change can be qualitatively described as energy dispersal as well. You got any problem with that?

    So what is the mixing entropy if we had pink molecules on the left and blue molecules on the right initially and then open the valve?

    Uh,

    entropy_of_mixing_pink_and_blue =

    delta-S_expansion_blue_molecules + delta-S_expansion_blue_molecules

    = 5.76 J/K + 5.76 J/K ≈ 11.53 J/K (rounded)

    This result agrees with equation 14 here:
    http://www.cpp.edu/~hsleff/entlangint.pdf
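For anyone who wants to check the arithmetic, here is a minimal sketch of the same bookkeeping, assuming 1 mol of each ideal gas and a doubling of volume from 1 to 2 cubic meters:

```python
import math

N_A = 6.022e23       # Avogadro's number, 1/mol
k_B = 1.380649e-23   # Boltzmann's constant, J/K

def free_expansion_dS(n_mol, V1, V2):
    """Entropy change for free expansion of an ideal gas: N k_B ln(V2/V1)."""
    return n_mol * N_A * k_B * math.log(V2 / V1)

dS_pink = free_expansion_dS(1.0, 1.0, 2.0)   # pink gas expands 1 m^3 -> 2 m^3
dS_blue = free_expansion_dS(1.0, 1.0, 2.0)   # blue gas, same expansion

print(dS_pink)             # ~5.76 J/K for each gas
print(dS_pink + dS_blue)   # ~11.53 J/K, the entropy of mixing of the two gases
```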

    If you doubt this, look at this derivation from a link about 139 comments earlier (139 if we don’t count the comments by Mung that I ignore). My comment was here:

    In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

    and the link for your convenience is re-provided here:

    http://rkt.chem.ox.ac.uk/lectures/entropy.html

    Using accepted mainstream references, I demonstrated the mixing entropy can be described as the sum of the entropies of expansion.

    Since the entropies of expansion can be described by the energy dispersal description, and since entropies of mixing can be described by the sum of the entropies of expansion, the entropies of mixing can be described by the energy dispersal description as provided by Lambert, Leff, Gupta, Evans, Kotz, Treichel, Townsend, etc.

    What I said also justifies this Georgia Tech chemistry lecture slide which you are so eager to berate.

    You and DNA_Jock can wallow in your willful misreadings and equivocations of what is meant by “energy dispersal” as used by the physicists and chemists I cited.

  3. I like some of keiths examples. I have never really thought of entropy being observer-dependent, but I can certainly see where keiths is coming from, because I tend to think of entropy in terms of information.

    I’m not great at coming up with examples but here’s an attempt:

    Say you have four boxes that look and feel exactly alike, one of which has a marking inside. You ask two people to guess which box has the marking; the one who finds it in the fewest attempts wins. One of the two people is blind.

    Both people would pick a box and ask – is this the one? It could take up to four tries to find the right box assuming they are intelligent enough to not pick the same box more than once.

    The boxes are indistinguishable.

    Now say you have four boxes, two blue and two red. One blue box has a raised checkered pattern and one red box has a raised checkered pattern. One has the marking inside. Now both people can devise an optimal strategy.

    Is it in a red box? Is it in a checkered box? The boxes are distinguishable, even for the blind person.

    Anyways, this example probably sucks. Heck, I think it sucks.

  4. Alan Fox:

    It’s a shame that Mike Elzinga and OlegT no longer comment here. There is quite a bit buried in old threads. Sal helpfully linked to a comment by Mike Elzinga here.

    But we have Dr. Mike’s postings at panda’s thumb.

    http://pandasthumb.org/archives/2013/09/lifes-ratchet-b.html

    The only general rules are conservation of energy (first law), the spreading around of energy (second law), energy flows from higher temperatures to lower temperatures (basically the “zeroeth” law), and entropy goes to zero when all energy is drained out of the system (third law).

    Spreading? As in dispersal!

    That sounds a lot like Lambert:

    Entropy change measures the dispersal of energy: how much energy is spread out in a particular process, or how widely spread out it becomes (at a specific temperature).

    http://entropysimple.oxy.edu/content.htm

    Hahaha. Take that Keiths and DNA_jock!

  5. Sal,

    I’ll have more to say on the subject of dispersal later, but in the meantime, you still haven’t addressed my challenge. Why are you avoiding it?

    (That’s a rhetorical question. You and I both know why.)

    If you were confident in your position, you wouldn’t hesitate to tackle the challenge.

  6. OK, Joe F. There you have the arguments (such as they are). You ought to be able to make your decision now.

  7. For the reader’s benefit, this present disagreement between Keiths and me centers on his statement:

    Keiths:

    I appreciate Lambert’s efforts to expose the “entropy is disorder” misconception, but the idea he’s offering in its place — “entropy is energy dispersal” — also appears to be a misconception.

    So with regards to Dr. Mike’s statement:

    The only general rules are conservation of energy (first law), the spreading around of energy (second law), energy flows from higher temperatures to lower temperatures (basically the “zeroeth” law), and entropy goes to zero when all energy is drained out of the system (third law).

    Does Keiths object to Dr. Mike’s framing the second law in terms of the spreading around of energy? Do you therefore think Dr. Mike’s conception of the second law of thermodynamics is a misconception?

    A simple “yes” or “no” will suffice, Keiths. 🙂

    How about you, DNA_Jock? A simple “yes” or “no” will suffice. You should attempt an answer, since you present yourself as someone so knowledgeable in these matters that you can confidently declare Keiths correct.

  8. keiths,

    Is it fair to say that you believe that the entire idea of the 2nd law is nothing but a statement about our finite limited knowledge of systems?

    Does entropy have any meaning at all for an omniscient being in your opinion?

    peace

  9. It is ironic that very recently some scientists who accept the informational interpretation of entropy resurrected the subjectivity into entropy.

    – Arieh Ben-Naim

    Is it possible then to accept the informational interpretation of entropy and also maintain that entropy is objective?

  10. Information, Entropy, Life and the Universe
    What We Know and What We do Not Know
    Arieh Ben-Naim
    World Scientific, 2015.

    Chapter 2: Entropy
    2.3.4 Is entropy a subjective quantity?

  11. What I disliked most [about Hawking’s A Brief History of Time] was the misconstrued association of time with the second law of thermodynamics, which I felt should not have been written by a scientist of Hawking’s stature.

    – Arieh Ben-Naim

    ouch

    No wonder us mere mortals get confused about the second law.

  12. walto: And??

    LoL. I’ll post more from that section. What, you want me to just hand out the answer and bypass all the suspense?

    The increase in entropy comes when a known distribution goes over into an unknown distribution … Gain in entropy always means loss of information, and nothing more.

    – Lewis, G.N. (1930)

    Does this mean entropy is information? Because if so, it could explain why some people might think entropy is subjective; after all, isn’t information subjective?

  14. Entropy is not a measure of disorder, disorganization, or chaos. Entropy is not a measure of spreading of energy. Entropy is not a measure of freedom and entropy is not information and entropy is not time.

    – The Briefest History of Time: The History of Histories of Time and the Misconstrued Association between Entropy and Time. p. xiii

    Sorry Sal.

  15. Mung,

    Entropy is not a measure of disorder, disorganization, or chaos. Entropy is not a measure of spreading of energy. Entropy is not a measure of freedom and entropy is not information and entropy is not time.

    When does entropy equal zero?

  16. Mung, walto,

    “Observer-dependent” does not imply “subjective” here.

    I’ve chosen my words carefully, and you’ll notice I haven’t referred to entropy as “subjective”.

  17. colewd, to Mung:

    When does entropy equal zero?

    When the observer has complete knowledge of the microstate (see the example of Damon the Demon, earlier in the thread). In that case, there is only one possible microstate, and so W = 1.

    Plugging into Boltzmann’s formula, you get

    S = k ln W = k ln (1) = 0.

  18. Sal,

    Does Keiths object to Dr. Mike’s framing the second law in terms of the spreading around of energy?

    Yes.

    Do you therefore think Dr. Mike’s conception of the second law of thermodynamics is a misconception?

    Yes.

    Good grief, Sal. Haven’t you been reading my comments?

    I’ve been quite clear:

    Entropy is a measure of missing knowledge, not of energy dispersal.

  19. Sal,

    Drop the arguments from authority.

    “But Lambert says…”
    “But Dr. Mike says…”
    “But there’s a slide on the Internet that says…”
    “But accepted mainstream references say…”

    This is not an election. What matters is whether an idea is right, not how popular it is.

    Now stop stalling and address my challenge.

    You ought to be grateful. I’m presenting you with a beautiful opportunity to demonstrate the superiority of your position over mine. If you were actually confident of your position, you’d be jumping at the chance, not stalling.

    By stalling, you might as well be hanging a sign around your neck saying “I know I’m wrong, but I can’t bear to admit it”.

  20. keiths:
    Mung, walto,

    “Observer-dependent” does not imply “subjective” here.

    I’ve chosen my words carefully, and you’ll notice I haven’t referred to entropy as “subjective”.

    Still have given no reason for either one, however.

  21. keiths:
    Sal,

    Drop the arguments from authority.

    “But Lambert says…”
    “But Dr. Mike says…”
    “But there’s a slide on the Internet that says…”
    “But accepted mainstream references say…”

    This is not an election. What matters is whether an idea is right, not how popular it is.

    Now stop stalling and address my challenge.

    You ought to be grateful. I’m presenting you with a beautiful opportunity to demonstrate the superiority of your position over mine. If you were actually confident of your position, you’d be jumping at the chance, not stalling.

    By stalling, you might as well be hanging a sign around your neck saying “I know I’m wrong, but I can’t bear to admit it”.

    Rather than giving opportunities, you might consider giving a single reason for the entire scientific establishment to shift its paradigm.

  22. Mung:
    Entropy is not a measure of disorder, disorganization, or chaos. Entropy is not a measure of spreading of energy. Entropy is not a measure of freedom and entropy is not information and entropy is not time.

    – The Briefest History of Time: The History of Histories of Time and the Misconstrued Association between Entropy and Time. p. xiii

    You’re so mysterious, mung! STOP BEING SO MYSTERIOUS!!

  23. walto,

    Still have given no reason for either one, however.

    I was hoping you’d be able to figure it out. 🙂 Okay, I’ll explain.

    What’s confusing you, I think, is that statements about entropy values are indexical. The truth of indexical statements is relative to the speaker, but that doesn’t mean that they aren’t objective.

    Example: “I’m in California” is true when spoken by me, but false when spoken by you (unless you’re doing some traveling that I’m unaware of.) It’s “speaker-dependent”, but that doesn’t make it subjective. It’s objectively true that I’m in California and that you are not.

    Same for entropy. “The entropy is N J/K” will be true if spoken by Xavier regarding system X but false when spoken by Yolanda of the same system. That doesn’t make it subjective. It’s objectively true when spoken by Xavier, and objectively false when spoken by Yolanda. It depends on their relative knowledge of the system’s detailed state.

  24. walto,

    Rather than giving opportunities, you might consider giving a single reason for the entire scientific establishment to shift its paradigm.

    Must you always resort to your “grumpy old man” shtick?

    I’ve given reasons throughout the thread. That you don’t understand them doesn’t mean they aren’t there.

  25. keiths:
    walto,

    I was hoping you’d be able to figure it out. 🙂 Okay, I’ll explain.

    What’s confusing you, I think, is that statements about entropy values are indexical. The truth of indexical statements is relative to the speaker, but that doesn’t mean that they aren’t objective.

    Example: “I’m in California” is true when spoken by me, but false when spoken by you (unless you’re doing some traveling that I’m unaware of.) It’s “speaker-dependent”, but that doesn’t make it subjective. It’s objectively true that I’m in California and that you are not.

    Same for entropy. “The entropy is N J/K” will be true if spoken by Xavier regarding system X but false when spoken by Yolanda of the same system. That doesn’t make it subjective. It’s objectively true when spoken by Xavier, and objectively false when spoken by Yolanda. It depends on their relative knowledge of the system’s detailed state.

    Thanks much for your assistance, but I’m not actually confused about the distinction you’re trying to make above. I don’t care about it; I’m looking for a single reason for your claim that all the scientists are wrong.

    I’m going to ask this one final time (although I already kind of know it’s pointless). WHY SHOULD ANYBODY BELIEVE THAT ENTROPY IS OBSERVER-DEPENDENT?

    ETA: I see this wasn’t actually the final time, since I’ve asked it a couple more times below. So you have several chances here to answer.

  26. keiths: I’ve given reasons throughout the thread.

    I haven’t seen one. You either repeat the claim or change the subject (as you’ve done above with the distraction about subjectivity). Either you have a reason for distinguishing the measurement of entropy from most other types of scientific measurement (or indeed non-scientific measurements) or you don’t. If you have a reason, I wish you would give it. Why is the situation with Yolanda and Xavier essentially different from you with a thermometer or microscope and me without them?

    Can you answer that or not?

  27. walto,

    I’m going to ask this one final time (although I already kind of know it’s pointless). WHY SHOULD ANYBODY BELIEVE THAT ENTROPY IS OBSERVER-DEPENDENT?

    Lol. Will it help if I respond in bold and all caps?

    BECAUSE DIFFERENT OBSERVERS CAN GET DIFFERENT ANSWERS EVEN WHEN THEY ALL CALCULATE THE ENTROPY CORRECTLY.

    In my thought experiment, both Xavier and Yolanda calculate the entropy (and the change in entropy) correctly.

    If you disagree, show us where either one (or both) of them makes an error.

  28. Entropy is an anthropomorphic concept, not only in the well-known statistical sense that it measures the extent of human ignorance as to the microstate. Even at the purely phenomenological level, entropy is an anthropomorphic concept. For it is a property not of a physical system, but of the particular experiment you or I choose to perform.

    – Jaynes, E.T. (1965)

    I don’t know what to make of that statement. We can certainly measure the entropy of a physical system.

  29. colewd: When does entropy equal zero?

    Personally, I don’t think it does. If keiths and Sal disagreed and I had to choose one I’d be in a real quandary. 😉

  30. stcordova: delta-S_expansion_blue_molecules + delta-S_expansion_blue_molecules

    = 5.76 J/K + 5.76 J/K ≈ 11.53 J/K (rounded)

    Cool calculation, Sal, assuming you meant blue + pink there.
    Let’s repeat this experiment a million times, but with balls that are dark red and light red. But in each of the experiments, we vary the hue of the balls ever so slightly.
    The entropy of mixing does not change at all; it is always 11.53 J/K, except when the balls are exactly the same color. At that one point, the entropy of mixing suddenly drops to ZERO.
    The question I have been asking you repeatedly, and you have (as usual) not even tried to answer, is WHY this is the case under your “spreading” definition. WHY is there this sudden plummet in the mixing entropy when we go from red balls mixed with ever-so-slightly darker red balls to indistinguishable balls? You’ll notice that it follows automatically from keiths’s definition. Your definition seems sadly lacking in this regard.
    As I have alluded to previously, Leff and Lambert (and Elzinga) have a justification (a get-out-of-jail-free card) for this exception. I understand this justification. What I am trying to see is whether Sal can explain it in his own words.
    Without some tiny demonstration of understanding on your part, Sal, all you are doing is making blind appeals to authority.
    As usual.
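For the reader, a toy sketch of the step DNA_Jock describes (the classic Gibbs-paradox discontinuity), assuming equal amounts of two ideal gases at the same temperature and pressure in equal volumes:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def mixing_entropy(n1, n2, distinguishable):
    """Entropy of mixing two equal-volume ideal-gas samples at the same T and P.

    If the species are distinguishable, each sample doubles its accessible
    volume when the valve opens; if they are identical, nothing macroscopic
    changes and the mixing entropy is zero.
    """
    if not distinguishable:
        return 0.0
    return (n1 + n2) * R * math.log(2)

print(mixing_entropy(1, 1, True))    # ~11.53 J/K, however slight the color difference
print(mixing_entropy(1, 1, False))   # 0.0, the sudden drop for identical balls
```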

  31. It is not just the number of ways a system can be rearranged, but more specifically the number of rearrangements consistent with the known properties of the system. Known by whom? Measured by what observer?

    By this reckoning entropy is not an absolute property of a system, but relational. It has a subjective component – it depends on the information you happen to have available.

    Boltzmann understood this connection and made it more specific. He pointed out that since the value of entropy rises from zero when we know all about a system, to its maximum value when we know least, it measures our ignorance about the details of the motions of the molecules of a system. Entropy is not about speeds or positions of particles, the way temperature and pressure and volume are, but about our lack of information.

    – von Baeyer (2003)

    keiths, do you agree with von Baeyer? Disagree? Please explain.

  32. Mung,

    keiths, do you agree with von Baeyer? Disagree? Please explain.

    I agree, though I would substitute “observer-dependent” for “subjective” in his description, for reasons given above.

  33. Sal,

    Does Keiths object to Dr. Mike’s framing the second law in terms of the spreading around of energy?

    Yes.

    Do you therefore think Dr. Mike’s conception of the second law of thermodynamics is a misconception?

    Yes.

    Good grief, Sal. Haven’t you been reading my comments?
    I’ve been quite clear:

    Entropy is a measure of missing knowledge, not of energy dispersal.

    Keiths,

    Well, first of all, thank you very very very much for responding to my question directly and promptly. It’s much more satisfying dealing with you than UD’s Kairos Turbo Encabulator Unfocused.

    Alan Fox,

    I wasn’t trying to brush you off or ignore you; I was trying to get a few confessions out of Keiths first, such as the one above.

    But at least now you know where Keiths’ views stand in relation to Mike Elzinga’s.

    Dr. Mike hates creationists. Ironically, it should be evident that my views about entropy are closer to Dr. Mike’s than to Keiths’s.

    If Dr. Mike were here, he might take exception to what Keiths just said.

  34. colewd, to Mung:

    When does entropy equal zero?

    Keiths’ answer:

    When the observer has complete knowledge of the microstate (see the example of Damon the Demon, earlier in the thread). In that case, there is only one possible microstate, and so W = 1.

    Plugging into Boltzmann’s formula, you get

    S = k ln W = k ln (1) = 0.

    Mung’s answer:

    Personally, I don’t think it does.

    You don’t think that your God can have perfect knowledge of a system’s microstate?

  35. It is ironic that the same confusion of SMI with information – which caused the rejection of the informational interpretation of entropy – brought back the subjectivity as an attribute of entropy.

    Besides, this quotation by von Baeyer alludes to Boltzmann’s understanding of the “subjectivity” of entropy. As far as I know, Boltzmann never discussed the subjectivity of the entropy. It is not true either that “the value of entropy rises from zero when we know all about a system… .”

    Having found numerical agreement between the calculated and the experimental values of the entropy provides a solid interpretation of the entropy. For the question as to what entropy is, we can answer that entropy is a measure of the uncertainty or of the amount of information associated with the equilibrium distribution of locations and momenta of all the particles. It has no traces of subjectivity in it.

    – Ben-Naim, Arieh. Information, Entropy, Life and the Universe. p. 197.

  36. keiths: Bingo. Ben-Naim and I are on the same page regarding subjectivity.

    I’ll take your word for it. 🙂

    Don’t take any further posts quoting Ben-Naim on the subjectivity of entropy as disagreement with you on that. I think he does a decent job of explaining why entropy is not subjective.

  37. It may be helpful to take a brief pause to see why entropy is so contentious.

    In human experience, our senses and the way we’re wired enable us to directly have a qualitative idea of important physical quantities such as:

    velocity
    length
    distance
    acceleration
    temperature
    pressure
    charge (sort of)
    time
    mass (weight)

    IIRC, almost everything in physics (including the above concepts) can be derived from:

    time
    mass
    charge
    length

    Entropy can be derived from these 4 as well, but there isn’t much in human experience we can point to which describes entropy. Some have tried to say it’s disorder, but as several, including Larry Moran, have pointed out, it is not a perfect qualitative description — and some will argue it is downright wrong. The Boltzmann-Planck formalism:

    S = k ln W

    doesn’t resonate with anything in personal experience. In contrast, heat, temperature, acceleration, length, mass, and time are things we can all identify with, without much argument about what each of these concepts is.

    There isn’t much in the way of human experience to describe entropy. It’s an abstract accounting entity that helps us build refrigerators, air conditioners, steam engines and analyze chemical reactions.

    In a process where temperature is kept constant (an isothermal process) and there is no heat exchange with the surroundings (an adiabatic process) we can make a visual illustration such as the isothermal adiabatic expansion of pink molecules.

    If the volume available to the pink molecules is doubled, the entropy increases, but not in a straightforward linear fashion: rather than multiplying the quantity N kB by 2, we multiply it by ln(2).

    delta-S = N kB ln (2)

    if the volume were increased by a factor of 4

    delta-S = N kB ln (4)

    if the volume were increased by a factor of 8

    delta-S = N kB ln (8)

    etc.

    Thus, the simple diagram below can be used to give a visual illustration of an increase in entropy, provided we are dealing with an isothermal adiabatic expansion. As Dr. Mike and Lambert said, entropy is qualitatively about spreading of energy. The particles (and therefore their energy) in the diagram below are spread out.

    How the spreading of energy is converted to numbers is a bit tedious and not always straightforward; a short numerical sketch follows below.
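Here is the promised numerical sketch, assuming 1 mol of ideal gas and tabulating delta-S = n R ln(V2/V1) for a few volume ratios:

```python
import math

R = 8.314  # gas constant, J/(mol K), equal to N_A * k_B

# Isothermal (free) expansion of 1 mol of ideal gas: delta-S = n R ln(V2/V1).
# Each doubling of the volume adds the same R ln 2, so the growth is
# logarithmic in the volume ratio, not linear.
for ratio in (2, 4, 8, 16):
    print(ratio, round(R * math.log(ratio), 2), "J/K")
```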

  38. E.T. Jaynes:

    Entropy is an anthropomorphic concept, not only in the well-known statistical sense that it measures the extent of human ignorance as to the microstate. Even at the purely phenomenological level, entropy is an anthropomorphic concept. For it is a property not of a physical system, but of the particular experiment you or I choose to perform.

    Mung:

    I don’t know what to make of that statement. We can certainly measure the entropy of a physical system.

    I don’t think he’s saying that the entropy of a system can’t be measured, but rather that the value you get depends on the experiment you perform on the system.

    It’s hard to be sure without the context, but I’d guess he’s saying that different experiments leave you with differing amounts of ignorance with regard to the microstate, and thus different conclusions about the entropy value.

  39. Walto,

    I’ve steered clear of the observer-dependent entropy issue so far. But I’ll now add my thoughts.

    It seems to me that, from a practical standpoint, we want heating and air conditioning engineers and biochemists (like Larry Moran) to have numbers for entropy that are observer-independent. A lot of analysis of biochemical reactions involves predictable values of entropy change. It’s not desirable for laboratories to come up with their own entropy change numbers based on their own perspectives.

    As far as practical matters are concerned, I showed a few examples here how numbers for entropy change can be calculated.

    In physics and mathematics there is the notion of uncertainty. In physics this is the minimum amount of uncertainty that must exist in principle (as in the Heisenberg uncertainty principle), the minimum uncertainty that would remain for any measuring apparatus. It is uncertainty that is there in principle, not uncertainty that is dependent on the observer.

    Knowledge is considered the opposite of uncertainty: reducing uncertainty is an increase in knowledge, and increasing uncertainty is a loss of knowledge.

    But we shouldn’t, imho, conflate human (or observer) uncertainty with the uncertainty that exists in principle. The two are not the same!

    An example is Shannon uncertainty. We are much more familiar with Shannon uncertainty than we realize. When we say a Compact Disc has 750 megabytes of memory, it has 750 megabytes of Shannon uncertainty. It is the amount of uncertainty in principle, not the observer’s arbitrary statement of how much memory is on the Compact Disc. If two observers use the same measuring instruments (like, say, a standard disc drive), they should be able to arrive at the same number of bytes (Shannon uncertainty) that the Compact Disc can hold.

    In the case of thermodynamic entropy, we also measure the uncertainty (in principle) of a system. But this isn’t the same uncertainty as whatever an arbitrary observer happens to be uncertain about.

    When a gas of pink molecules occupies a volume of only 1 cubic meter, the uncertainty in any given molecule’s position extends over 1 cubic meter. If we allow the pink molecules to expand to fill 2 cubic meters, we’ve increased the uncertainty in each molecule’s position to 2 cubic meters.

    But expressing entropy in terms of the degree of uncertainty (as in the prior paragraph) isn’t as intuitive as simply describing how much the molecules are spread out!
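One rough way to connect the “uncertainty in principle” picture to the numbers above, assuming 1 mol of gas whose accessible volume doubles:

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K
N_A = 6.022e23       # Avogadro's number, 1/mol

# Doubling the accessible volume gives each molecule ln 2 nats (1 bit) more
# positional uncertainty. Summed over a mole and scaled by k_B, that
# "uncertainty in principle" equals the expansion entropy calculated earlier.
extra_nats_per_molecule = math.log(2)   # 1 bit expressed in nats
delta_S = N_A * extra_nats_per_molecule * k_B

print(delta_S)   # ~5.76 J/K, matching N k_B ln 2 for 1 mol
```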

  40. stcordova: As Dr. Mike and Lambert said, entropy is qualitatively about spreading of energy.

    As Ben-Naim said, entropy is not a measure of spreading of energy.

  41. keiths: In my thought experiment, both Xavier and Yolanda calculate the entropy (and the change in entropy) correctly.

    If you disagree, show us where either one (or both) of them makes an error.

    Simple. As you grant that one of them has more relevant information than the other, and they have different answers, the sensible thing is to infer that Yolanda is closer to being correct than Xavier. Though asked 50 times, you have provided no reason for saying they’re both right.

    The sensible conclusion is that neither is quite right. Yolanda is closer because she has superior measuring equipment and is measuring a non-relative quantity.

    The claim that Xavier is correct seems to me no different from the claim that because he believes the moon is made of cheese the moon is made of cheese relative to his level of ignorance. It’s “true for him” means nothing more than that he believes it.

    Anyhow, if I’ve now seen all the ‘reasons’ you have for disagreeing with most of the scientists on this matter of relativity, I’ll leave you to your debate with Sal about the pink balls.

  42. Mung: As Ben-Naim said, entropy is not a measure of spreading of energy.

    My guess is that, as Lambert says, the term is used differently by different people. Sometimes even by the same people at different times!

  43. Sal,
    You’re stalling.
    Meanwhile, the challenge awaits.

    I’m not stalling; I’m simply not trifling with your obfuscations.

    You’ve been pretty badly humiliated in this exchange.

    You gave up the store with this:

    Obviously, the “pink” molecules are more dispersed afterwards than before, and so is the energy.

    This is true in the expansion scenario, true in the mixing scenario. Therefore you have no leg to stand on.

    In my monoatomic example of 1 mol at 300 K, the 3741.3 Joules of energy in the pink molecules gets dispersed from 1 cubic meter to 2 cubic meters after the barrier is removed. This holds for both the isothermal adiabatic expansion and mixing scenarios. However, we can see that before mixing the concentration of energy in the pink molecules is 3741.3 Joules/cubic meter, whereas afterwards it is only 3741.3 / 2 = 1870.65 Joules/cubic meter. The energy becomes more dispersed after mixing (or expansion)!

    Now if we have pink molecules in both the right and left chambers, the energy is spread out over the pink molecules just as much before the valve is opened as after. If we have 1 mol of pink molecules in the right chamber and 1 mol of pink molecules in the left chamber, both at 300 K, the total energy of the pink molecules is:

    2 x 3741.3 Joules = 7482.6 Joules

    So the concentration of energy in the pink molecules before the valve is opened is 7482.6 Joules / 2 cubic meters = 3741.3 Joules/cubic meter. And after the valve is opened, the concentration of energy is the same, 3741.3 Joules/cubic meter.

    So there is no increase in dispersal of energy, therefore no increase in entropy, therefore Keiths is wrong yet again!
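A minimal sketch of that energy-concentration bookkeeping, assuming a monatomic ideal gas at 300 K:

```python
R = 8.314                  # gas constant, J/(mol K)
T = 300.0                  # temperature, K
U_per_mol = 1.5 * R * T    # internal energy of 1 mol of monatomic ideal gas, ~3741.3 J

# Pink gas only: 1 mol expands from 1 m^3 into 2 m^3.
print(U_per_mol / 1.0, "J/m^3  ->", U_per_mol / 2.0, "J/m^3")   # 3741.3 -> 1870.65

# Pink gas on both sides: 1 mol in each chamber, valve opened.
total_U = 2 * U_per_mol
print(total_U / 2.0, "J/m^3  ->", total_U / 2.0, "J/m^3")       # 3741.3 -> 3741.3
```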

    As far as your silly challenge goes, you need to distinguish between “uncertainty in principle” and lack of capability or personal ignorance.

    The pink molecules are not distinguishable from each other in principle. That’s not the same thing as Xavier not having the capability to distinguish blue from pink because he’s color-blind. Sheesh!
