In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue with his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder, just as shoe size is correlated with reading ability (a small shoe size suggests a toddler, and therefore a non-reader) without being reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or a founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger’s and Moran’s books aren’t on that list. Their books, however, did make the list of biochem books judged by some Australians as decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy, as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates
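
To make the formula concrete, here is a minimal Python sketch (the function name and the sample values of W are illustrative assumptions, not from any source quoted here). Note that Planck’s “log” denotes the natural logarithm:

from math import log

k_B = 1.380649e-23  # Boltzmann's constant in J/K

def boltzmann_entropy(W):
    # S = k ln W, valid when all W microstates are equally probable
    return k_B * log(W)

print(boltzmann_entropy(1))       # 0.0 J/K: a single microstate means zero entropy
print(boltzmann_entropy(2**100))  # ~9.6e-22 J/K for 2^100 microstates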

Also there is Clausius:

delta-S = Integral(dq_rev/T)

where
delta-S = change in entropy
dq_rev = inexact differential of q (heat), transferred reversibly
T = absolute temperature
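
For a process at constant temperature, the integral collapses to q/T. Here is a minimal Python sketch (the melting-ice figures are standard textbook values, used purely as an illustration):

# Entropy change for melting 10 g of ice at 0 C.
# T is constant during the phase change, so Integral(dq_rev/T) = q/T.
latent_heat_of_fusion = 334.0  # J/g, standard textbook value for water
mass = 10.0                    # grams
T = 273.15                     # kelvin

q = mass * latent_heat_of_fusion  # total reversible heat absorbed, in joules
delta_S = q / T
print(delta_S)  # ~12.23 J/K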

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. Sal,

    I’m not stalling, I’m not trifling with your obfuscations.

    Isn’t it interesting that you’re afraid of my challenges, but I’m not afraid of yours?

    Why, it might even lead a cynical person to conclude that you’ve been bluffing and have no actual confidence in your position.

  2. keiths:

    In my thought experiment, both Xavier and Yolanda calculate the entropy (and the change in entropy) correctly.

    If you disagree, show us where either one (or both) of them makes an error.

    walto:

    Simple. As you grant that one of them has more relevant information than the other,

    That isn’t the case. Each of them has all the information s/he needs to calculate the entropy. Yolanda knows something that Xavier doesn’t, and that affects the way she determines her macrostates, but Xavier still has all the information he needs. You don’t need perfect, complete information in order to do an entropy calculation. In fact, if you do have complete information about the microstate, you’ll always get the same answer: an entropy of zero.

    …the sensible thing is to infer that Yolanda is closer to being correct than Xavier.

    No, because that implies that they are both wrong. They aren’t. Each of them has all the information required in order to calculate the entropy correctly.

    Let me repeat: Each of them has all the information required in order to calculate the entropy correctly.

    Neither one of them is cutting corners. They are both calculating the actual entropy.

    Though asked 50 times, you have provided no reason for saying they’re both right.

    I keep telling you: they’re both right because both have done their entropy calculations correctly, based on correctly ascertained macrostates.

    If either of them is wrong, where is the mistake? Each of them has correctly ascertained a (different) macrostate and used it to correctly calculate the entropy, both before and after mixing.

    Xavier knows, before mixing occurs, that each compartment contains an equal amount of X gas at the same temperature T. After mixing, he knows that this hasn’t changed: each compartment still contains the same amount of X at the same temperature T.

    He is correct. Those macrostates are true macrostates, and the corresponding entropy values he calculates are correct.

    Yolanda knows, before mixing occurs, that one compartment contains a certain amount of X0 at temperature T and that the other compartment contains the same amount of X1 at temperature T. After mixing, she knows that each compartment contains equal amounts of both X0 and X1 at temperature T.

    She is also correct. Those macrostates are true macrostates, and the corresponding entropy values she calculates are correct.

    Damon the Demon knows more about the system than either Xavier or Yolanda. In fact, he knows everything about its microstate. For him, the entropy is always zero. Like Xavier and Yolanda, he is correct.

    The claim that Xavier is correct seems to me no different from the claim that because he believes the moon is made of cheese, the moon is made of cheese relative to his level of ignorance. That it’s “true for him” means nothing more than that he believes it.

    No. Xavier knows the macrostates before and after mixing. He doesn’t merely believe that they hold, he knows it. His entropy calculations are correct, and they are based on true macrostates.

    Anyhow, if I’ve now seen all the ‘reasons’ you have for disagreeing with most of the scientists on this matter of relativity, I’ll leave you to your debate with Sal about the pink balls.

    You’re clearly confused and frustrated by this topic. Don’t take it out on your teacher. Work harder to understand.

  3. And of course, the underlying physical energy dispersal is the same for Xavier, Yolanda, and Damon. They’re all looking at the same system, after all.

    That’s why Sal is frightened of my challenge. He knows that it exposes a serious problem with the ‘energy dispersal’ interpretation of entropy.

    Entropy is a measure of missing knowledge, not of energy dispersal.

  4. And just to forestall another potential confusion, it isn’t that Yolanda can’t duplicate Xavier’s calculation. Her knowledge regarding the distribution of X0 and X1 doesn’t obligate her to use it in ascertaining her macrostates.

    She is free to ‘forget’ that information for the purposes of her calculation, just as a judge in a bench trial might ‘forget’ evidence that has been presented and then ruled inadmissible. Damon could forget all of his ‘excess’ information as well. In that case all three would get the same answer.

    The key point is that entropy is a function of the macrostate. If two or more observers end up with the same macrostate, for whatever reason, they’ll calculate the same entropy.

    The observer dependence comes about because macrostates themselves are observer-dependent.

  5. keiths: You don’t need perfect, complete information in order to do an entropy calculation. In fact, if you do have complete information about the microstate, you’ll always get the same answer: an entropy of zero.

    This is probably the most profound statement I have seen on this site, and from keiths no less 😉

    What it says is that
    1) Omniscience and timelessness are equivalent.
    2) the increase in entropy is simply an artifact of our limited informational perspective
    3) Granville Sewell might very well be correct if he can show that he is talking about an actual and not a theoretical microstate

    Keiths’ understanding also unites all the various understandings of the 2nd law under one single overarching paradigm.

    Like I said, way way cool

    peace

  6. keiths: Why, it might even lead a cynical person to conclude that you’ve been bluffing and have no actual confidence in your position.

    There are no cynical people here!

  7. keiths: The key point is that entropy is a function of the macrostate.

    If this is the case, is Damon really calculating the entropy? According to you, Damon’s knowledge is of the microstate.

    In fact, he [Damon] knows everything about its microstate.

    Put another way, if entropy is a measure of missing knowledge, and Damon has no missing knowledge, what exactly is Damon measuring?

    fifthmonarchyman: 1) Omniscience and timelessness are equivalent.

    How did you arrive at that? Entropy is not a function of time. Sal is wrong about that.

    Keiths’ understanding also unites all the various understandings of the 2nd law under one single overarching paradigm.

    That is one of its benefits.

  9. keiths: That isn’t the case. Each of them has all the information s/he needs to calculate the entropy. Yolanda knows something that Xavier doesn’t, and that affects the way she determines her macrostates, but Xavier still has all the information he needs.

    That entire post is obviously question-begging, as pretty much all of your posts have been on this thread.

    I like that you brought up Damon the Demon, because he is a perfect reductio of your position. He doesn’t know anything more than anybody else does; each person is right! For the genius, entropy isn’t lost when things die, etc. Thus, all that everybody (but you, of course) has ever learned about entropy is false. The equations long used for its measurement must have at least one relative term in them (since “ignorance” does not occur in any of them), but as it’s probably your heterodox view that EVERY scientific quantity is relative, you’re probably ok with that.

    You want the paradigm to shift, but have not a single reason why the physics Dept. of Wisc. and, likely, many other schools, should take down their site.

    Your view that “error” is necessary for an estimate not to be perfectly accurate is unsupported and implausible. If I have an old shitty (but correctly calibrated) scale that gives a weight of 65 pounds (it’s only in 5-pound increments), and you have a fancy new device that reads 28.89383 kilograms, neither of us has made an error, but one estimate is better than the other, because it is closer to being accurate. That’s how science works: it does not measure relative ignorance. Nobody cares about that. When you end up with a position according to which someone with complete knowledge of microstates has no useful knowledge of entropy as a result, since from his point of view every state is indistinguishable from every other, whether cold, hot, functioning, not functioning, dead or alive, you’ve got a very bad theory.

    Lots of luck with convincing others that what entropy measures isn’t energy but ignorance.

    Mung: If this is the case, is Damon really calculating the entropy? According to you, Damon’s knowledge is of the microstate.

    In fact, he [Damon] knows everything about its microstate.

    Put another way, if entropy is a measure of missing knowledge, and Damon has no missing knowledge, what exactly is Damon measuring?

    For Damon the entropy measure of every state = 0. As I said, no more perfect reductio of keiths’ position could be found if one looked forever.

    fifthmonarchyman: This is probably the most profound statement I have seen on this site, and from keiths no less 😉

    What it says is that
    1) Omniscience and timelessness are equivalent.
    2) the increase in entropy is simply an artifact of our limited informational perspective
    3) Granville Sewell might very well be correct if he can show that he is talking about an actual and not a theoretical microstate

    Keiths’ understanding also unites all the various understandings of the 2nd law under one single overarching paradigm.

    Like I said, way way cool

    peace

    Well, he’s gone a ways in convincing the Calvinists. I’m guessing scientists will have a somewhat different take on a view that takes entropy as a measure of ignorance only. It’s an idealist position, actually. I think Hegel might have joined the Calvinists in cheering for it too.

  12. keiths: The key point is that entropy is a function of the macrostate.

    It’s a function of the ignorance level of observers only.

  13. keiths: And of course, the underlying physical energy dispersal is the same for Xavier, Yolanda, and Damon. They’re all looking at the same system, after all.

    And the moon is the same for all of them too. It’s just that for one of them it’s (truly) made of cheese.

  14. keiths,

    When does entropy equal zero?

    When the observer has complete knowledge of the microstate (see the example of Damon the Demon, earlier in the thread). In that case, there is only one possible microstate, and so W = 1.

    Plugging into Boltzmann’s formula, you get

    S = k ln W = k ln (1) = 0.

    I would answer at a temperature of absolute zero. Interesting disconnect, and it may support Einstein’s comments about S = k ln W.

  15. Arieh Ben-Naim is a professor of physical chemistry who retired in 2003 at the age of 70 from the Hebrew University of Jerusalem. While reading popular science books, he disliked the authors who considered entropy as something mysterious and unclear. And so he wrote the first of his own popular science books, Entropy Demystified: The Second Law Reduced to Plain Common Sense (2008), in which he started spreading the gospel that entropy, classically considered as a measure of disorder, should be replaced by a concept of missing information; it’s just a special case of Shannon’s measure of information (SMI). The same message is given in a second book, A Farewell to Entropy (2008), and in Discover Entropy and the Second Law of Thermodynamics: A Playful Way of Discovering a Law of Nature (2010). He also claims that the second law of thermodynamics is misunderstood and that the entropy of an isolated system does not increase with time.

    His colleague Frank L. Lambert advocates replacing the classical notion of entropy with a concept of dispersion of energy. Lambert wrote a critical review of Ben-Naim’s Entropy Demystified on the Amazon site, which launched an online discussion between the two in which sentences were used like “…your proposal about Shannon information and S – is nothing but another word that begins with S and refers to human excrement”. Ben-Naim responded with yet another book, Entropy and the Second Law: Interpretation and Misss-Interpretationsss (2012), in which he mainly defends his view against Lambert’s. In 2015 he wrote yet another book, Information, Entropy, Life and the Universe: What We Know and What We Do Not Know, in which he applies his ideas to explain life and the evolution of the universe.

    Review: The Briefest History of Time

    heh. Take that Salvador!

  16. colewd:
    keiths,

    I would answer at a temperature of absolute zero. Interesting disconnect, and it may support Einstein’s comments about S = k ln W.

    That answer is inconsistent with the measurement-of-ignorance view keiths has been pushing.

  17. Mung: Lambert wrote a critical review of Ben-Naim’s Entropy Demystified on the Amazon site which launched an online discussion between both where sentences were used like “…your proposal about Shannon information and S – is nothing but another word that begins with S and refers to human excrement”.

    Haha.

  18. fifth:

    This is probably the most profound statement I have seen on this site, and from keiths no less 😉

    What it says is that
    1) Omniscience and timelessness are equivalent.

    No, they’re distinct. A timeless God needn’t be omniscient, and an omniscient God needn’t be timeless.

    2) the increase in entropy is simply an artifact of our limited informational perspective

    The increase in entropy is due to the fact that the gap in your knowledge of the system’s detailed state — the gap between what you know via the macrostate and what you would know if you knew the exact microstate — is widening.

    That tells you something about the system itself, not just about your own ignorance of it.

    3) Granville Sewell might very well be correct if he can show that he is talking about an actual and not a theoretical microstate

    When Granville talks about skyscrapers arising on once-barren planets and houses being assembled by tornadoes, he’s talking about macrostates, not microstates. He’s also falling hook, line, and sinker for the ‘entropy as disorder’ misconception.

  19. Most notably, his work has influenced the way in which entropy (a concept defined mathematically in the technical literature) is presented in introductory textbooks and in popular science writing.[17] Margulis and Eduard Punset have suggested that “The work of Frank Lambert, integrated into virtually all recent chemistry textbooks, makes clear that the second law is really a matter of energy dispersal.”[18] In 1999 most general chemistry texts described entropy as disorder. Since then many have shifted their emphasis to that of energy dispersal. Lambert has extensively documented the way 29 textbooks have changed in this respect up to 2012.[19]

    From Wikipedia article on Lambert– https://en.wikipedia.org/wiki/Frank_L._Lambert

  20. keiths: The increase in entropy is due to the fact that the gap in your knowledge of the system’s detailed state — the gap between what you know via the macrostate and what you would know if you knew the exact microstate — is widening.

    Maybe if you assert this a couple hundred more times, it will start to sound like you have a reason for it. “The gap between what you know via the macrostate and what you would know if you knew the exact microstate” is actually (as anybody without some weird axe to grind here can see) simply the gap between partial and complete accuracy. As science gets better, we learn more. It’s not that we were always right.

  21. keiths:

    The key point is that entropy is a function of the macrostate.

    Mung:

    If this is the case, is Damon really calculating the entropy? According to you, Damon’s knowledge is of the microstate.

    Every macrostate corresponds to an ensemble of microstates. Xavier’s macrostate corresponds to lots of microstates. Yolanda’s corresponds to fewer. Damon’s corresponds to exactly one.

    keiths:

    In fact, he [Damon] knows everything about its microstate.

    Mung:

    Put another way, if entropy is a measure of missing knowledge, and Damon has no missing knowledge, what exactly is Damon measuring?

    The missing knowledge. He knows the exact microstate, so there is no missing knowledge. The entropy is therefore zero.

  22. walto: So, you’re right, mung. The science books HAVE been changing. They now generally comport with Lambert’s view.

    Does that mean I am 50% correct, and what is the entropy of that!?

    So there seems to be general agreement that Larry Moran’s textbook needs to be updated. The interesting questions are: will it be, and will they use Lambert’s interpretation, and if so, why?

    I wonder if Lambert thinks the entropy of universe is increasing due to energy becoming more dispersed throughout Space and Time.

  23. keiths: When Granville talks about skyscrapers arising on once-barren planets and houses being assembled by tornadoes, he’s talking about macrostates, not microstates.

    Is there some logical reason why your concept is limited to the micro versus the macro?

    peace

  24. keiths: The missing knowledge. He knows the exact microstate, so there is no missing knowledge. The entropy is therefore zero.

    I’m not sure you understood my question. Why, when speaking of Yolanda and Xavier, do you use the term macrostate, but when speaking of Damon you use the term microstate? It’s like you’re using two different terms to refer to the same thing, and it clouds your argument (imo).

    Perhaps for each example you could specify the number of macrostates and the number of microstates. That might help me (assuming you’re interested in doing that, lol).

    Anyways, we are in general agreement about the informational interpretation (I think). I’m not with you yet on zero entropy cases and observer-dependence. But I am enjoying the discussion!

    I plan to do additional reading on the energy dispersal argument. I have almost all of Ben-Naim’s books.

  25. Keiths:

    Isn’t it interesting that you’re afraid of my challenges, but I’m not afraid of yours?

    Why, it might even lead a cynical person to conclude that you’ve been bluffing and have no actual confidence in your position.

    First off Keiths, a small compliment. On balance I think you’ve been a bit more civil (and less obnoxious) toward me than I toward you.

    Overall, the conversation has had some good technical discussions.

    That said, back to polemics… 🙂

    The formal definition of entropy is:

    S = k ln W

    But as I said, to average human experience this equation and definition are mostly meaningless. In contrast, look at Newton’s 2nd law:

    F = ma

    F = force
    m = mass
    a = acceleration

    Humans can “feel” all of the variables in Newton’s 2nd law. Humans understand the concepts of “force”, “mass” (weight), and “acceleration”.

    The variables in Boltzmann’s equation, by contrast, are purely abstract: S (entropy) and W (number of microstates).

    At issue is which metaphor is most effective in approximating the mathematical formalism of entropy and the 2nd law of thermodynamics for pedagogical purposes.

    Even Larry Moran said the “disorder” metaphor is imperfect. Some of us, perhaps including you, and certainly myself, think “disorder” is a downright misleading metaphor.

    You like the ignorance metaphor for entropy a lot. You call it “lack of knowledge”, but we could just as well call it “ignorance”.

    The “dispersal of energy” metaphor is a better one, imho, and you’ve not overturned any of the calculations I’ve provided illustrating dispersal of energy.

    Amazingly, you came out and criticized our very own Dr. Mike, a professional thermodynamicist who worked on low-temperature cryogenic refrigerators. You said his notion of describing the 2nd law as “the spreading around of energy” is a misconception. I’m fairly sure he’ll take exception to your claim and not have high regard for your expertise in thermodynamics relative to his.

    I’ve cited other chemists and physicists and textbooks and lectures, but noooo….Keiths knows for sure these guys are dead wrong for using the metaphor of “energy dispersal”.

    If I’d had Lambert teaching me entropy, I would have learned it much better when I struggled with the formalisms in undergrad and grad school.

    I don’t think your “entropy is ignorance” viewpoint would have been a very effective teaching metaphor. Just look at the simple case of adiabatic isothermal expansion of gases, and try to explain to students that the “increase of entropy is an increase in ignorance”.

    Here is how I can imagine someone trying to teach entropy the Keiths way:

    Welcome Dear students to another installment of Keiths the science guy.

    Today we’ll learn about the entropy of expansion. Look at this diagram below of pink molecules in the left chamber expanding into the right chamber.

    As the pink molecules expand into the right chamber, our ignorance about the pink molecules increases, therefore the entropy increases. Get it? Isn’t this a great way to understand entropy?

    Entropy as ignorance is so much superior to the way Sal advocates teaching entropy. Sal relies on ideas like this by Dr. Evans who teaches chemistry. Dr. Evans teaches entropy with slides like this one. And Dr. Evans, Dr. Mike, Dr. Lambert, Dr. Leff, Dr. MC Gupta, Dr. Townsend, Dr. Treichel, Dr. Kotz and all the PhD chemists and physicists Sal cited are all wrong, and Keiths is right.

    What you need to understand dear students is Entropy is Ignorance. As the pink molecules go into the right chamber, your ignorance increases, therefore entropy increases. Get that, when gas expands, your ignorance increases! Hahaha!

    The change in ignorance for 1 mole of monatomic ideal gas at 300 K in an adiabatic isothermal process is, as Sal calculated:

    5.43 Joules/Kelvin

    But ignore for the moment that Sal got the right number. He’s dead wrong about the meaning of the number.

    Now energy is measured in joules. 1 joule per second, btw, is a watt. When you run that 100-watt light bulb for 100 seconds, you’ve used up 100 × 100 = 10,000 joules of energy.

    When you divide the number of joules by the degrees Kelvin, you get Joules/Kelvin. Kelvin is the temperature.

    But Joules/Kelvin is not a measure of the spreading of energy over microstates as Sal says; it is a measure of ignorance.

    We measure ignorance in Joules/Kelvin. Joules/Kelvin is not a measure of spreading of energy even though Joules is a measure of energy. Got it? It’s a measure of ignorance.

    When the pink gas expands to the second chamber, you’ve increased your personal ignorance by 5.43 Joule/Kelvin or 5.43 ignorance units. Got it?

  26. The above-derived identity of statistical and thermodynamic entropies is thus readily generalizable even to assemblies of indistinguishable units insusceptible to the Boltzmann analysis heretofore employed.

    – Nash, Leonard K. Elements of Statistical Thermodynamics. p.36

    Don’t ask me to explain what that means though!

    stcordova: First off Keiths, a small compliment. On balance I think you’ve been a bit more civil (and less obnoxious) toward me than I toward you. […]

    Great post, Sal. Exactly right.

  28. stcordova: Your “entropy is ignorance” viewpoint I don’t think would have been a very effective teaching metaphor. Just look at the simple case of adiabatic isothermal expansion of gases. Try to explain to them the “increase of entropy is increase in ignorance”.

    Perhaps someone who Salvador does not have on Ignore could ask him if he has ever read a book in which “the simple case of adiabatic isothermal expansion of gases” is approached from the point of view advocated by keiths.

    Surely it’s been done.

  29. Mung:

    Why, when speaking of Yolanda and Xavier, do you use the term macrostate, but when speaking of Damon you use the term microstate? It’s like you’re using two different terms to refer to the same thing, and it clouds your argument (imo).

    I think you missed this:

    Every macrostate corresponds to an ensemble of microstates. Xavier’s macrostate corresponds to lots of microstates. Yolanda’s corresponds to fewer. Damon’s corresponds to exactly one.

    For Damon the macrostate and the actual microstate are identical. For Xavier and Yolanda they are not.

    Think of it this way: the macrostate is essentially a summary of what you know about the system’s microstate.

    Let’s set thermodynamic entropy aside for the moment and talk about what I’ve been calling the “logical” entropy of a deck of cards.

    For the deck of cards, the microstate is the exact sequence of the cards. Given a deck of cards with the numbers 1 through 5 printed on them, the microstate of the deck might be the sequence 4 1 3 5 2. If the cards happen to be ordered, the microstate is 1 2 3 4 5. If they’re reverse-ordered, it’s 5 4 3 2 1.

    There are 5! possible microstates for the deck — that is, 5 x 4 x 3 x 2 x 1 possible sequences, for a total of 120.

    If the only thing I know about the deck is that it’s been randomly shuffled, then the macrostate is basically that single fact. The macrostate is “randomly shuffled”. The microstate might be any of the 120 possible sequences. We don’t know which one actually obtains.

    In other words, the macrostate “randomly shuffled” corresponds to an ensemble of 120 microstates.

    Now suppose that instead of being randomly shuffled, the deck has been prepared this way:

    1) the odd cards have been separated from the evens;
    2) the two “subdecks” have been separately shuffled; and
    3) the “odd” subdeck has been placed atop the “even” subdeck.

    Let’s call this the “odds before evens” macrostate.

    For this macrostate, the number of possible microstates is no longer 120. Some of the sequences have been ruled out, such as this one: 1 4 5 3 2. It doesn’t satisfy the “odds before evens” criterion. There are only 12 sequences that do.

    In other words, the macrostate “odds before evens” corresponds to an ensemble of 12 microstates.

    Finally, suppose you know that the deck has been prepared by placing the cards in increasing order. The macrostate is “increasing order”, and there is only one possible microstate: 1 2 3 4 5.

    In other words, the macrostate “increasing order” corresponds to an ensemble of just one microstate.

    Extra credit: Think about what happens in all of these scenarios if the cards are indistinguishable; for instance, if the cards all have the number 2 printed on them, with no other distinguishing features.
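
    These ensemble counts are easy to verify by brute force. Here is a minimal Python sketch (the helper names are illustrative only):

    from itertools import permutations
    from math import log

    cards = [1, 2, 3, 4, 5]
    microstates = list(permutations(cards))  # all 5! = 120 possible sequences

    def odds_before_evens(seq):
        # True if every odd card precedes every even card
        last_odd = max(i for i, c in enumerate(seq) if c % 2 == 1)
        first_even = min(i for i, c in enumerate(seq) if c % 2 == 0)
        return last_odd < first_even

    ensembles = {
        "randomly shuffled": microstates,
        "odds before evens": [s for s in microstates if odds_before_evens(s)],
        "increasing order": [tuple(cards)],
    }

    for name, ensemble in ensembles.items():
        W = len(ensemble)
        print(name, "W =", W, "ln W =", round(log(W), 3))
    # randomly shuffled W = 120 ln W = 4.787
    # odds before evens W = 12 ln W = 2.485
    # increasing order W = 1 ln W = 0.0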

  30. Indeed, if one conceives of the mixing itself as the cause of the “entropy of mixing” then it is quite puzzling to find that the entropy of mixing is independent of the kind of the mixing molecule.

    A Farewell to Entropy. p. 267

  31. keiths: For Damon the macrostate and the actual microstate are identical. For Xavier and Yolanda they are not.

    Sorry, but two states are identical or they are not identical. Kind of a non-contradiction thing.

  32. walto,

    You’re still hung up on the idea that there is one true macrostate that Xavier and Yolanda are just approximating. That’s incorrect.

    Their macrostates are just as true, and just as legitimate, as Damon’s. But their macrostates correspond to large ensembles of microstates, while Damon’s only corresponds to one.

    Relating this to the deck of cards example, it’s as if Xavier’s macrostate is “no idea whatsoever about the order”, Yolanda’s is “odds before evens”, and Damon’s is “5 1 3 4 2”.

    Xavier’s macrostate is correct. He truly has no knowledge of the microstate.

    Yolanda’s macrostate is correct. The odds all come before the evens.

    Damon’s macrostate is correct. The microstate is 5 1 3 4 2, which exactly matches his macrostate.

  33. walto:

    I like that you brought up Damon the Demon, because he is a perfect reductio of your position. He doesn’t know anything more than anybody else does; each person is right!

    Read it again, walto:

    Damon the Demon knows more about the system than either Xavier or Yolanda. In fact, he knows everything about its microstate.

    Your confusion seems to be in thinking that if Damon knows the actual microstate, that means that his macrostate is the correct one, and that Xavier’s and Yolanda’s macrostates are somehow incorrect or approximate.

    That isn’t the case. See my comment above.

  34. I know your view. There’s no need to keep repeating it. What’s necessary is the production of a single reason to believe that there is not a non-relative quantity in the vicinity, just as all the chemistry texts say.

    As Sal’s last post demonstrates nicely, your position makes a complete mess of the science by calling entropy a measure of what people know or fail to know. And, apparently, you have no reason for making this mess, except that relativism about everything and the epistemological idealism that follows from it appeals to you. That’s not enough, I’m afraid.

  35. keiths: There are 5! possible microstates for the deck — that is, 5 x 4 x 3 x 2 x 1 possible sequences, for a total of 120.

    I prefer coins, where it’s simpler to determine the number of possible macrostates. I’m a Sal kinda guy.

    😉

    So my first observation is that the number of possible microstates is independent of any observer. Would you agree? Now what about all possible macrostates? Also independent of any observer?

    [1,2,3,4,5] has the same probability as [1,5,2,4,3], does it not? Of the 120 possible sequences, isn’t it the case that none is more probable than any other?

    Can you describe any macrostate that is more probable than any other? If not, then I am not sure we are talking about entropy at all. We need a probability distribution, don’t we?

    Anyways, thanks for the thought-provoking posts!

  36. Poor walto is in full “angry old man” mode.

    Let down your defenses, walto. This is a learning opportunity for you.

  37. keiths,

    That post is a great example of the reasons you have adduced for your wildly heterodox view on this thread. Pathetic.

  38. walto: Sorry, but two states are identical or they are not identical. Kind of a non-contradiction thing.

    This is why science prefers operational definitions, and why Elizabeth abhors the verb to be.

  39. Walto:

    Great post, Sal. Exactly right.

    Thanks for the kind words.

    If a hot brick and a cold brick are in contact, the heat from the hot brick spreads into the cold brick. This spreading of heat (energy) describes the increase in entropy. Simple!

    Keiths is more than welcome to describe how the transfer of heat from the hot brick to the cold brick reduces observer knowledge and thus increases entropy. Would it go something like, “when heat from the hot brick goes into the cold brick, it increases our ignorance, therefore the entropy of the two bricks goes up.” 🙄

    The metaphor of energy spreading works for me. It then makes the formalism a tad more digestible. Even though the formalisms are tedious, the qualitative changes are easily understood under the energy dispersal metaphor.

    From an MIT website, the formal change in entropy of the two bricks is shown below:

    delta-S = C ln(T_M/T_H) + C ln(T_M/T_L)

    http://web.mit.edu/16.unified/www/FALL/thermodynamics/notes/node41.html

    where:

    delta-S = change in entropy of the bricks

    C = heat capacity of the bricks in Joules/Kelvin
    T_H = temperature of the hot brick in Kelvin
    T_L = temperature of the cold brick in Kelvin
    T_M = temperature when the bricks reach equilibrium after the hot brick cools off and the cold brick warms up in Kelvin
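
    For concreteness, here is a minimal Python sketch of that calculation (the values of C, T_H, and T_L are arbitrary assumptions, chosen only to show that the total change is positive):

    from math import log

    C = 1000.0   # heat capacity of each brick, J/K (assumed)
    T_H = 400.0  # initial temperature of the hot brick, K (assumed)
    T_L = 200.0  # initial temperature of the cold brick, K (assumed)
    T_M = (T_H + T_L) / 2  # equilibrium temperature for two identical bricks

    dS_hot = C * log(T_M / T_H)   # negative: the hot brick loses entropy
    dS_cold = C * log(T_M / T_L)  # positive, and larger in magnitude
    print(dS_hot + dS_cold)       # ~ +117.8 J/K, a net entropy increase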

  40. petrushka: This is why science prefers operational definitions, and why Elizabeth abhors the verb to be.

    As shown above, Elizabeth had no more patience for keiths’ ‘ignorance view’ than do Elzinga, Lambert, or the authors of all the chemistry textbooks.

  41. Mung:

    So my first observation is that the number of possible microstates is independent of any observer. Would you agree?

    Not quite. When we talk about “possible microstates”, we’re talking about epistemic possibility, not metaphysical possibility. After the deck has been randomly shuffled, I know nothing about the order of the cards, so all possible sequences are epistemically possible from my perspective. Metaphysically, only one sequence is possible: the one that actually obtains.

    The number of epistemically possible microstates — 120, in this case — is a function of both

    a) the fact that the deck of 5 cards can be placed in 5! = 120 sequences; and
    b) the fact that we know nothing about the order of the cards after shuffling.

    On the other hand, suppose the deck is originally in full suit and rank order. Damon the Demon watches the deck as it is being shuffled and carefully tracks the location of each card. After the shuffling is complete, he knows the exact sequence. There is only one epistemic possibility from his perspective.

    Now what about all possible macrostates? Also independent of any observer?

    No, for the same reasons.

    [1,2,3,4,5] has the same probability as [1,5,2,4,3], does it not? Of the 120 possible sequences, isn’t it the case that none is more probable than any other?

    Yes, assuming the deck is randomly shuffled.

    Can you describe any macrostate that is more probable than any other? If not, then I am not sure we are talking about entropy at all. We need a probability distribution, don’t we?

    Did you mean ‘microstate’? Assuming you did, the answer is that yes, in the general case, we need a probability distribution. That’s why Boltzmann’s famous equation — the one that appears on his tombstone — applies only to the entropy of an ideal gas, in which the microstates are equally probable.

    When the probabilities are unequal, the equation becomes more complicated.
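
    The general form is the Gibbs expression, S = -k sum(p_i ln p_i), which reduces to S = k ln W when all W microstates have probability 1/W. A minimal Python sketch (with k set to 1 for simplicity):

    from math import log

    def gibbs_entropy(probs, k=1.0):
        # S = -k * sum(p * ln p); equals k ln W when p = 1/W for all W states
        return -k * sum(p * log(p) for p in probs if p > 0)

    W = 120
    print(gibbs_entropy([1.0 / W] * W))      # ln(120) ~ 4.787, matching k ln W
    print(gibbs_entropy([0.5, 0.25, 0.25]))  # unequal probabilities: ~1.040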

    Anyways, thanks for the thought-provoking posts!

    You’re welcome. I’m enjoying the discussion. Too bad Sal and walto aren’t. 🙂

    stcordova: Thanks for the kind words. If a hot brick and a cold brick are in contact, the heat from the hot brick spreads into the cold brick. This spreading of heat (energy) describes the increase in entropy. Simple! […]

    I guess that’s just MIT joining the University of Wisconsin in agreeing that entropy is a quantifiable non-observer-relative quantity. But what the hell do they know? After all, keiths has got at least one Calvinist on his side!

  43. walto:

    As shown above, Elizabeth had no more patience for keiths’ ‘ignorance view’ than do Elzinga, Lambert, or the authors of all the chemistry textbooks.

    And:

    I guess that’s just MIT joining the University of Wisconsin in agreeing that entropy is a quantifiable non-observer-relative quantity. But what the hell do they know? After all, keiths has got at least one Calvinist on his side!

    It’s amusing to watch walto and Sal digging their hole deeper while relying on bogus arguments from authority.
