In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue with his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (a small shoe size suggests we’re dealing with a toddler, and therefore a non-reader), but shoe size is not reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or one of the fathers of statistical mechanics, Ludwig Boltzmann himself, who suggested that entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So, in slight defense of Granville Sewell and numerous other creationists, from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus, who equate entropy with disorder: when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and, unfortunately, one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger’s and Moran’s books aren’t on that list. Their books did, however, make the list of biochemistry textbooks judged by some Australian researchers to be decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy, as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates

Also there is Clausius:

delta-S = Integral (dq/T)

where
delta-S = change in entropy
dq = inexact differential of q (heat)
T = absolute temperature
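
For readers who want to see numbers, here is a minimal sketch in Python (my own illustration; the microstate count W is an assumed value, and the 20-gram ice cube is borrowed from an example discussed in the comments below) evaluating both forms. Note that “log” in the Boltzmann formula means the natural logarithm.

```python
import math

# Boltzmann (as written by Planck): S = k ln W
k_B = 1.380649e-23           # Boltzmann's constant, J/K
W = 1e25                     # assumed number of microstates, purely illustrative
S_boltzmann = k_B * math.log(W)
print(f"S = k ln W = {S_boltzmann:.3e} J/K")

# Clausius, for a reversible isothermal process: delta-S = q / T
# Example: melting 20 g of ice at 273.15 K, latent heat of fusion ~334 J/g
q = 334.0 * 20.0             # heat absorbed, J
T = 273.15                   # K
delta_S = q / T
print(f"delta-S = q/T = {delta_S:.2f} J/K")   # about 24.5 J/K
```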

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. stcordova: Keiths has problems comprehending that thermodynamic entropy can be 3 things at the same time, and they are not mutually exclusive!

    Just when things were starting to become clear. =P

  2. Poor Sal. This discussion has been going on for almost a month, and he still doesn’t get that the dispute is over the interpretation of entropy, not its calculation.

    colewd,

    Since Sal has put himself in the awkward position of ignoring (or pretending to ignore) my comments, how about quoting this one to him?

    If entropy is a measure of energy dispersal, he should be able to

    1) respond to each of my six points against dispersalism, including the rather obvious problem that entropy has the wrong units for energy dispersal;

    2) explain why Xavier and Yolanda see different entropy values despite looking at the same physical system with the same physical energy distribution;

    3) explain why entropy increases in Denker’s grinding wheel example, though energy dispersal does not; and

    4) explain why entropy “cares” about the distinguishability of particles, when energy dispersal does not.

  3. keiths,

    You may have heard of something called the “Second Law of Thermodynamics”? It has some major ramifications. One of the earliest was in establishing an upper bound on the efficiency of heat engines.

    Ok for the global view. Can you give a more specific example?

  4. colewd,

    Why are you asking to be spoon-fed? Google is your friend.

    Entropy is a central concept in thermodynamics. You’ll find plenty out there.

  5. stcordova: The major place I’ve seen entropy stated in bits is in places like TSZ and UD, not in textbook science literature.

    Statistical Thermophysics

    From the back cover:

    Introduces the ideas of information theory, surprise, entropy, probabilities, and the evolution in time

    Describes the intuitive relationship of these concepts to statistical thermophysics

    A used copy of this book costs only a few dollars.

  6. keiths,

    Since Sal has put himself in the awkward position of ignoring (or pretending to ignore) my comments, how about quoting this one to him?

    First help me understand the substance of your argument. So far I am not convinced that thermodynamic properties can be expressed in bits like binary bits 0 and 1. If you want to define entropy and missing information that is fine as an intellectual exercise. What you have not convinced me of is whether that definition has any use in science.

  7. colewd,

    So far I am not convinced that thermodynamic properties can be expressed in bits like binary bits 0 and 1.

    Do non-binary bits seem more promising to you?

  8. hi keiths,

    The second law of thermodynamics is based on the concept of entropy. You say it establishes an upper bound on the efficiency of heat engines.

    How can this upper bound be observer-dependent? Do Yolanda, Xavier, and Damon get different answers for this upper bound?

  9. Mung,

    The upper bound on efficiency isn’t observer-dependent. The expression for this efficiency, 1 – Tcold/Thot, doesn’t even have an entropy term in it — only temperatures.

    The derivation of that formula, however, depends on the fact that the net entropy change of the system is zero in the reversible (maximum efficiency) case. After a cycle is complete, the entropy of the hot reservoir has decreased, and the entropy of the cold reservoir has increased by an equal amount. The entropy of the system itself is unchanged.
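
    A quick numerical sketch (my own, with arbitrary reservoir temperatures) shows both claims at once: the Carnot efficiency contains only temperatures, and in the reversible case the entropy lost by the hot reservoir exactly matches the entropy gained by the cold one.

    ```python
    T_hot, T_cold = 600.0, 300.0     # assumed reservoir temperatures, K
    Q_hot = 1000.0                   # heat drawn from the hot reservoir per cycle, J

    efficiency = 1.0 - T_cold / T_hot     # Carnot limit: no entropy term, only temperatures
    W_out = efficiency * Q_hot            # work extracted per cycle
    Q_cold = Q_hot - W_out                # heat rejected to the cold reservoir

    dS_hot = -Q_hot / T_hot               # entropy change of the hot reservoir
    dS_cold = Q_cold / T_cold             # entropy change of the cold reservoir

    print(f"efficiency = {efficiency:.2f}")                      # 0.50
    print(f"net entropy change = {dS_hot + dS_cold:+.2e} J/K")   # 0 in the reversible case
    ```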

  10. There is no claim that the information entropy represents the correct expression for the thermodynamic entropy of a system that is not in equilibrium (or that it does not). The units are chosen to be those of thermodynamic entropy, since the two are equal at equilibrium. The information entropy is regarded as a measure of our ignorance of the system…

    – Robertson p. 467

  11. keiths,

    Do non-binary bits seem more promising to you?

    Yes, but that is not what Bekenstein is describing when he was calculating the entropy of a black hole. If the bit takes the form of a Tensor then you are now moving toward Sal’s description. Three positions and three momentum variables.

  12. colewd: Yes, but that is not what Bekenstein is describing when he was calculating the entropy of a black hole.

    He wasn’t calculating thermodynamic entropy.

  13. Mung,

    From Wiki

    In 1972, Bekenstein was the first to suggest that black holes should have a well-defined entropy. He wrote that a black hole’s entropy was proportional to its (the black hole’s) event horizon. Bekenstein also formulated the generalized second law of thermodynamics, black hole thermodynamics, for systems including black holes. Both contributions were affirmed when Stephen Hawking proposed the existence of Hawking radiation two years later. Hawking had initially opposed Bekenstein’s idea on the grounds that a black hole could not radiate energy and therefore could not have entropy.[9][10] However, in 1974, Hawking performed a lengthy calculation that convinced him that particles do indeed emit from black holes. Today this is known as Bekenstein-Hawking radiation. Bekenstein’s doctoral adviser, John Archibald Wheeler, also worked with him to develop the no-hair theorem, a reference to Wheeler’s saying that “black holes have no hair,” in the early 1970s.[11] Bekenstein was the first physicist to postulate such a theorem. His suggestion was proven to be unstable, but it was influential in the development of the field

  14. DNA_Jock: What would the value be if the coins were made of ununtrium, rather than copper?

    Such coins would be worthless. value = 0.

  15. DNA_Jock: What would the value be if the coins were made of ununtrium, rather than copper?

    Mung:

    Such coins would be worthless. value = 0.

    Oh, I think they’d have quite the scarcity value…
    OTOH, I doubt that they would qualify as collectible
    😉

    Still, I’d be curious as to how Sal would calculate their entropy…

  16. Mung:
    Do you think Sal knows that Clausius doesn’t always work?

    I think it’s pretty clear from his claim that
    “Claussius -> Boltzmann -> Shannon uncertainty (aka missing information)”
    that Sal has no clue whatsoever.

  17. “Certainly different people have different amounts of ignorance. The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities Xi which define its thermodynamic state.”

    – E.T. Jaynes

    I still maintain that DamonEntropy is not thermodynamic entropy and that Jaynes can rightfully be cited in support of that position and against the position that DamonEntropy is thermodynamic entropy.

    IOW, I agree with walto. I disagree with keiths.

  18. “There seems to be a breakdown in communication that leads sincere and thoughtful people to disagree, even when they try to understand each other.”

    – Harry S. Robertson

    heh

  19. “The reactions of some readers to my use of the word ‘subjective’ in these articles was astonishing….There is something patently ridiculous in the sight of a grown man recoiling in horror from something so harmless as a three-syllable word. ‘Subjective’ must surely be the most effective scare word yet invented. Yet it was used in what still seems a valid sense: ‘depending on the observer’.”

    – E.T. Jaynes

    So I still say I don’t see a difference between ‘observer-dependent’ and ‘subjective.’

  20. Mung,

    “Certainly different people have different amounts of ignorance. The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities Xi which define its thermodynamic state.”

    Based on this, can you describe a scenario where someone’s ignorance is less than 100%?

  21. colewd,

    FWIW, I was introduced to the Bekenstein entropy the last day of class after we all submitted our take home final. The professor was pointing out it was something we wouldn’t be tested on. The students sat in class as a learning experience and courtesy to our professor.

    Unlike the entropies of Clausius and Boltzmann, which are based on heat and classical mechanics (and the revision of Boltzmann that includes quantum mechanics), the Bekenstein entropy is based on general relativity.

    http://scholarpedia.org/article/Bekenstein-Hawking_entropy

    colewd:

    So far I am not convinced that thermodynamic properties can be expressed in bits like binary bits 0 and 1. If you want to define entropy and missing information that is fine as an intellectual exercise. What you have not convinced me of is whether that definition has any use in science.

    Let Keith compute the entropy change of a melting 20 gram ice cube using his information theory as a starting place rather than using energy dispersal parameters of heat and temperature which he so disparages!

    How about I make it easier: how about he starts from Boltzmann microstates? Let him show how he calculates the number of microstates without the energy dispersal parameters of heat and temperature, working instead from first principles of kinetic energy of each water molecule. 🙄

    Kind of hard for him to do that since he needs to sneak in the parameters that specify internal energy, like, temperature!

    If he did that, he’d be almost convincing, but he can’t even do that!

    So far I am not convinced that thermodynamic properties can be expressed in bits like binary bits 0 and 1.

    One can do so for most practical systems only after calculating entropy in the traditional way and then using the conversion factors I’ve linked to, namely dividing the answer in J/K by kB and then dividing by ln(2).

    I did so in the examples above.

    Keiths, for all his wailing, can’t even do a basic calculation without energy dispersal data of heat and temperature, like, say, for a melting ice cube. He keeps framing thermodynamic entropy in terms of information theory, but he can’t actually begin to do an entropy estimate with information theory as a starting point. He has to resort to energy dispersal data in many cases first (heat and temperature).

    In contrast, I’ve provided many examples of calculating entropy from a variety of approaches for solids, liquids, and gases.

    You’ll note, in almost all cases, implicitly the energy dispersal data is specified by:

    Temperature
    Number of Particles
    Volume
    Heat

    In the case of melting ice or heating copper coins in the solid phase, one only needs Temperature and Heat data.

    In no case did I compute entropy from information theory alone, and neither can Keiths, but he won’t admit he can’t compute entropy from information theory alone.
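
    The conversion described above (divide the J/K figure by kB, then by ln 2) is easy to check in a few lines; here is a sketch using the 20 g ice cube, with an assumed latent heat of fusion of 334 J/g.

    ```python
    import math

    k_B = 1.380649e-23           # Boltzmann's constant, J/K
    q = 334.0 * 20.0             # heat absorbed by 20 g of melting ice, J (assumed latent heat)
    T = 273.15                   # K

    delta_S = q / T                              # Clausius entropy change, ~24.5 J/K
    delta_S_bits = delta_S / k_B / math.log(2)   # divide by kB, then by ln 2

    print(f"delta-S = {delta_S:.2f} J/K ≈ {delta_S_bits:.2e} bits")   # ~2.6e24 bits
    ```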

  22. colewd: Based on this can you describe a scenario where someones ignorance is less than 100%?

    Sorry, I don’t understand the question. Or perhaps I should say that I don’t see how the Jaynes quote gives reason to think that knowing the values of the macroscopic quantities Xi which define its thermodynamic state counts as 100% ignorance.

    The point is not that they have no knowledge, but rather that it is their sole knowledge.

  23. stcordova: Let Keith compute the entropy change of a melting 20 gram ice cube using his information theory as a starting place rather than using energy dispersal parameters of heat and temperature which he so disparages!

    This is just ignorant and false. It’s ignorant because Salvador has keiths on ignore and so can’t see what keiths says, and it’s false because keiths has specifically denied Salvador’s claims about his views on thermodynamics.

    ETA: It’s also an absurd challenge because Salvador fails to specify the thermodynamic system, its initial and final equilibrium states, and its constraints.

  24. stcordova,

    Thanks for the link.

    If A stands for the surface area of a black hole (area of the event horizon), then the black hole entropy, in dimensionless form, is given by

    S_BH = A / (4 L_P^2) = c^3 A / (4 G ℏ),   (1)

    where L_P stands for the Planck length (Gℏ/c^3)^(1/2), while G, ℏ and c denote, respectively, Newton’s gravity constant, the Planck-Dirac constant (h/(2π)) and the speed of light. Of course, if the entropy in the usual (chemist’s) form is required, the above should be multiplied by Boltzmann’s constant k.

    For the spherically symmetric and stationary, or Schwarzschild, black hole (see Schwarzschild metric), the only parameter is the black hole’s mass M, the horizon’s radius is r_h = 2GM/c^2, and its area is naturally given by 4π r_h^2, or

    A = 16π (GM/c^2)^2.   (2)

    Note that a one-solar mass Schwarzschild black hole has an horizon area of the same order as the municipal area of Atlanta or Chicago. Its entropy is about 4×10^77, which is about twenty orders of magnitude larger than the thermodynamic entropy of the sun. This observation underscores the fact that one should not think of black hole entropy as the entropy that fell into the black hole when it was formed.

    For the most general type of stationary black hole, the Kerr-Newman black hole (rotating black hole), the hole’s parameters are mass M, electric charge Q and angular momentum J, and the horizon is no longer spherical. Nevertheless, in the popular Boyer-Lindquist coordinates {t, r, θ, φ} (see Misner, Thorne and Wheeler 1973 or Kerr-Newman metric) it lies at the fixed radial coordinate

    If we look at this analysis, you see that, based on Bekenstein’s computation, the entropy of a black hole of one solar mass is 20 orders of magnitude larger than that of the sun. This does not seem logical at all. Thoughts?
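
    For what it’s worth, plugging the quoted formulas into a short script (my own check, using standard values of the constants) gives a dimensionless entropy of order 10^77 for a one-solar-mass black hole, the same ballpark as the figure quoted above.

    ```python
    import math

    G = 6.674e-11         # gravitational constant, m^3 kg^-1 s^-2
    c = 2.998e8           # speed of light, m/s
    hbar = 1.055e-34      # reduced Planck constant, J s
    M = 1.989e30          # one solar mass, kg

    A = 16 * math.pi * (G * M / c**2)**2    # horizon area from Eq. (2), ~1.1e8 m^2
    L_P2 = hbar * G / c**3                  # Planck length squared, m^2

    S_BH = A / (4 * L_P2)                   # dimensionless Bekenstein-Hawking entropy, Eq. (1)
    print(f"A = {A:.2e} m^2, S_BH ~ {S_BH:.1e}")   # of order 1e77
    ```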

  25. Mung,

    Sorry, I don’t understand the question. Or perhaps I should say that I don’t see how the Jaynes quote gives reason to think that knowing the values of the macroscopic quantities Xi which define its thermodynamic state counts as 100% ignorance.

    How do you ever really know the true conditions of the micro state except what we learn from measuring the macro state? If this is true then 100% ignorance is the only possible condition.

  26. colewd:

    So far I am not convinced that thermodynamic properties can be expressed in bits like binary bits 0 and 1.

    keiths:

    Do non-binary bits seem more promising to you?

    colewd:

    Yes…

    The laughter you hear is coming from folks who understand that bits — binary digits — are binary by definition.

    If the bit takes the form of a Tensor then you are now moving toward Sal’s description. Three positions and three momentum variables.

    You are confusing the representation of entropy with the representation of the microstate. They are not the same, by any stretch of the imagination.

  27. Mung,

    This is just ignorant and false. It’s ignorant because Salvador has keiths on ignore and so can’t see what keiths says, and it’s false because keiths has specifically denied Salvador’s claims about his views on thermodynamics.

    If Sal actually wants to convince anyone that he’s right, he’ll need to respond to his opponents’ arguments. Putting people on ignore — or pretending to — is self-defeating. It just demonstrates his lack of confidence.

    It’s no coincidence that Sal put me on ignore just when I asked him to respond to my six points against dispersalism. Interestingly, colewd seems to have no more faith in Sal than Sal does. Otherwise he wouldn’t have hesitated to quote my comment so that Sal could respond.

  28. colewd, to Mung:

    How do you ever really know the true conditions of the micro state except what we learn from measuring the macro state? If this is true then 100% ignorance is the only possible condition.

    colewd,

    Mung is explaining that we learn something about the microstate when we measure macroscopic variables like temperature. We still don’t know the exact microstate, but we’ve narrowed down the possibilities by making a measurement.

  29. Salvador’s behavior in this thread bothers me, but I keep reminding myself that it also creates a record. If this were Salvador@UD the posts by keiths and DNA_Jock would be altered by Sal or deleted by Sal. Salvador has me on Ignore because I’m a troll.

    Well, now we know just what that analysis of his is worth. Go Sal!

  30. colewd: How do you ever really know the true conditions of the micro state except what we learn from measuring the macro state?

    We’re talking about a reduction in the uncertainty of the microstate. And we can reduce our uncertainty. But the absolute precise exact one-and-only microstate can probably not ever be known by any human due to quantum indeterminacy.

    The system is dynamic and constantly changing, even at/near equilibrium.

    But the fact that we can’t precisely determine something doesn’t mean we can’t narrow things down.

    What the information-theory approach and the Boltzmann approach offer is that we are talking probabilities, and that a statistical approach is close to the reality of the situation.

    colewd: If this is true then 100% ignorance is the only possible condition.

    If you’re saying that we can never know the exact microstate I tend to agree. I don’t see that as 100% ignorance. I think that the claim that if we are not 100% certain then we must be 100% ignorant is a false dichotomy.

    Take the analogy of throwing a pair of dice. We cannot be 100% certain that a seven will appear when the dice are thrown, nor can we be 100% certain that a four will not appear when the dice are thrown. I offer to play a game of dice with you where every time a seven is thrown you pay me 10 dollars and every time a four is thrown I pay you 18 dollars. Would you play this game?
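
    The arithmetic behind that offer (a quick sketch, assuming fair dice): a seven comes up 6 ways out of 36 and a four only 3 ways, so the player accepting the bet loses about 17 cents per throw on average.

    ```python
    from fractions import Fraction

    p_seven = Fraction(6, 36)   # (1,6),(2,5),(3,4),(4,3),(5,2),(6,1)
    p_four = Fraction(3, 36)    # (1,3),(2,2),(3,1)

    # From the perspective of the player offered the game:
    # pay $10 whenever a seven is thrown, receive $18 whenever a four is thrown.
    expected_value = -10 * p_seven + 18 * p_four
    print(expected_value, float(expected_value))   # -1/6, about -$0.17 per throw
    ```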

  31. Mung,

    To use ‘$’ in a comment, you can type $ .

    ETA: Haha… Now a bug in the plugin is preventing me from spelling out the code. It expands ampersand expressions recursively!

    Type & followed by #36; to get the ‘$’ in a comment.

  32. Mung:

    “Certainly different people have different amounts of ignorance. The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities Xi which define its thermodynamic state.”

    – E.T. Jaynes

    I still maintain that DamonEntropy is not thermodynamic entropy and that Jaynes can rightfully be cited in support of that position and against the position that DamonEntropy is thermodynamic entropy.

    IOW, I agree with walto. I disagree with keiths.

    The mistake you and walto are making is in thinking that macroscopic variables are the only “admissible” information when calculating entropy. They aren’t. Any information that helps narrow down the possible microstates is admissible.

    In the card deck example, “odds before evens” counts as a “macroscopic” description of the deck’s state, and knowing that the deck has “odds before evens” reduces the entropy from what it would be if all you knew was that the deck had been shuffled.

    However, macroscopic descriptions are not the only admissible ones when calculating “card deck entropy”. Suppose I shuffle the cards thoroughly and randomly so that all possible orderings are equiprobable. At this point there are 120 possible orderings — 120 possible microstates, in other words. All orderings are possible, and entropy is at a maximum.

    I look at the first card and see that it is the ‘4’. How many possible microstates are there now? Only 24. The ‘4’ is now known, and there are only 4! (4 x 3 x 2 x 1) ways of completing the sequence. The entropy has been reduced.

    Now I look at the second card and see that it’s the ‘1’. How many possible microstates are there now? Only 6. The ‘4’ and the ‘1’ are now known, and there are only 3! (3 x 2 x 1) ways of completing the sequence. The entropy has been reduced again.

    Suppose I keep going until I’ve turned over all the cards. At that point, like Damon, I know the exact sequence. There is only one possible microstate, and the entropy is therefore zero.

    It all makes sense: as I look at cards, one by one, I gain information about the state of the deck. As I gain information, the amount of missing information decreases. As the amount of missing information decreases, the entropy decreases, because entropy is a measure of the missing information. When I know the exact microstate — the exact sequence of cards — the entropy is zero.

    Anything that increases our information about the microstate reduces the entropy, and this is just as true of thermodynamic entropy as it is of “card deck entropy.”
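
    The bookkeeping in the five-card example above can be reproduced in a few lines (a sketch, measuring the missing information in bits as log2 of the orderings still possible):

    ```python
    import math

    for cards_seen in range(6):                       # reveal 0, 1, ..., 5 cards of a 5-card deck
        orderings = math.factorial(5 - cards_seen)    # 120, 24, 6, 2, 1, 1
        entropy_bits = math.log2(orderings)           # missing information about the sequence
        print(f"{cards_seen} cards seen: {orderings:3d} possible orderings, "
              f"{entropy_bits:.2f} bits of entropy")
    ```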

  33. &36; priceless

    One of these days I should learn how to use latex for some other purpose.

    ETA: $

  34. keiths: The mistake you and walto are making is in thinking that macroscopic variables are the only “admissible” information when calculating entropy. They aren’t. Any information that helps narrow down the possible microstates is admissible.

    You consistently conflate Shannon entropy and thermodynamic entropy. When calculating thermodynamic entropy, the entropy is defined in terms of the macrostate variables. For example S(E,V,N).

    Any other “entropy” is not thermodynamic entropy.

  35. Mung,

    “The reactions of some readers to my use of the word ‘subjective’ in these articles was astonishing….There is something patently ridiculous in the sight of a grown man recoiling in horror from something so harmless as a three-syllable word. ‘Subjective’ must surely be the most effective scare word yet invented. Yet it was used in what still seems a valid sense: ‘depending on the observer’.”

    – E.T. Jaynes

    So I still say I don’t see a difference between ‘observer-dependent’ and ‘subjective.’

    Look at the full context of the Jaynes quote from the previous comment:

    Certainly different people have different amounts of ignorance. The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities Xi which define its thermodynamic state. This is a completely ‘objective’ quantity, in the sense that it is a function only of the Xi, and does not depend on anybody’s personality. There is no reason why it cannot be measured in the laboratory.

    [Emphasis added]

    The choice of the Xi is observer-dependent. Once they have been chosen, however, the entropy calculation is objective.

    Yolanda’s Xi include the isotopic concentrations. Xavier’s Xi do not. Both are legitimate macrostates leading to legitimate entropy values. Entropy is observer-dependent but not subjective.
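
    The standard textbook illustration of this point is the entropy of mixing (a sketch under ideal-gas assumptions; the specific Xavier/Yolanda setup from earlier in the thread is only paraphrased here): an observer whose Xi include the isotopic concentrations assigns a positive mixing entropy, while an observer whose Xi do not distinguish the isotopes assigns zero.

    ```python
    import math

    R = 8.314            # gas constant, J/(mol K)
    n = 2.0              # total moles after mixing (assumed)
    x1, x2 = 0.5, 0.5    # mole fractions of the two isotopes (assumed)

    # Yolanda's macrostate distinguishes the isotopes, so mixing raises her entropy value.
    dS_yolanda = -n * R * (x1 * math.log(x1) + x2 * math.log(x2))

    # Xavier's macrostate variables ignore isotopic composition; for him the "mixing"
    # of identical gases changes nothing measurable, so his entropy change is zero.
    dS_xavier = 0.0

    print(f"Yolanda: {dS_yolanda:.2f} J/K, Xavier: {dS_xavier:.2f} J/K")   # ~11.5 vs 0
    ```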

  36. keiths: Anything that increases our information about the microstate reduces the entropy, and this is just as true of thermodynamic entropy as it is of “card deck entropy.”

    Apart from the macrostate variables of thermodynamics, what other knowledge (information) of the microstate is available to us mere mortals?

    We are limited to assigning probabilities and dealing with well-known probability distributions. Our only access to the system is macroscopic.

  37. keiths: Yolanda’s Xi include the isotopic concentrations. Xavier’s Xi do not. Both are legitimate macrostates leading to legitimate entropy values. Entropy is observer-dependent but not subjective.

    🙂

    It’s subjective in the sense that it is observer-dependent. If that were not the case two different observers would not arrive at two different values.

    Which of the two, Yolanda or Xavier, should revise their calculation in light of new information which they previously lacked, and why? Or why not?

  38. Mung,

    You consistently conflate Shannon entropy and thermodynamic entropy.

    Thermodynamic entropy is a Shannon entropy.

    When calculating thermodynamic entropy, the entropy is defined in terms of the macrostate variables. For example S(E,V,N).

    No, it isn’t. You don’t see E, V, or N in

    S = kB log W,

    and you don’t see them in the Gibbs equation either.

    In the case of Boltzmann, E, V, and N exert their influence via W. Whatever W is — and it can range all the way from one up to some unfathomably large value — determines the entropy. Macroscopic variables affect the value of W, but to argue that W can only be based on macroscopic variables is silly and arbitrary.
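
    One concrete way to see E, V, and N “exerting their influence via W” is the Sackur-Tetrode result for a monatomic ideal gas, which is just S = kB ln W evaluated for that system. A sketch using argon near standard conditions (my example, not keiths’):

    ```python
    import math

    k_B = 1.380649e-23        # J/K
    h = 6.62607e-34           # Planck constant, J s
    N_A = 6.02214e23          # Avogadro's number

    T = 298.15                # K
    P = 101325.0              # Pa
    N = N_A                   # one mole of argon atoms
    m = 39.948 * 1.66054e-27  # mass of one argon atom, kg
    V = N * k_B * T / P       # ideal-gas volume, m^3

    # Sackur-Tetrode: S = N kB [ ln( (V/N) (2 pi m kB T / h^2)^(3/2) ) + 5/2 ]
    # E (via T), V, and N enter only through the microstate count W.
    thermal = (2 * math.pi * m * k_B * T / h**2) ** 1.5
    S = N * k_B * (math.log((V / N) * thermal) + 2.5)

    print(f"S ≈ {S:.1f} J/K per mole of argon")   # ~155 J/(mol K), near the tabulated value
    ```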

  39. keiths: Thermodynamic entropy is a Shannon entropy.

    When Ben-Naim says that thermodynamic entropy is a special case of SMI, I understand what he means because he explains what he means when he says that thermodynamic entropy is a special case of SMI.

    When you say that “thermodynamic entropy is a Shannon entropy” I have no idea what you mean because you do not explain what you mean.

    Let’s assume that not all Shannon entropy is thermodynamic entropy. How do you explain the difference? How is thermodynamic entropy different from Shannon entropy?

    Let me explain further. Ben-Naim asserts that SMI can be defined for any probability distribution (e.g., your ‘deck of cards’ examples). But thermodynamic entropy is not defined for just any probability distribution. Not every probability distribution is relevant to thermodynamics.

    Your response?

  40. keiths: and you don’t see them in the Gibbs equation either.

    Please elaborate on this statement.

    https://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)#Gibbs_entropy_formula

    Each type of statistical ensemble (micro-canonical, canonical, grand-canonical, etc.) describes a different configuration of the system’s exchanges with the outside, varying from a completely isolated system to a system that can exchange one or more quantities with a reservoir, like energy, volume or molecules. In every ensemble, the equilibrium configuration of the system is dictated by the maximization of the entropy of the union of the system and its reservoir, according to the second law of thermodynamics (see the statistical mechanics article).
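
    For reference, the Gibbs entropy formula that article is describing is S = −kB Σ pi ln pi, a sum over microstate probabilities; a tiny sketch (with made-up probabilities) shows that, as keiths says, no E, V, or N appears in it explicitly.

    ```python
    import math

    k_B = 1.380649e-23        # J/K

    p = [0.4, 0.3, 0.2, 0.1]  # illustrative microstate probabilities (must sum to 1)

    # Gibbs entropy formula: S = -k_B * sum_i p_i ln p_i
    S = -k_B * sum(p_i * math.log(p_i) for p_i in p)
    print(f"S = {S:.3e} J/K")
    ```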

  41. Mung,

    One of these days I should learn how to use latex for some other purpose.

    The ‘&’ followed by ‘#36;’ trick is html, not latex, so you can use it wherever html is accepted. No latex plugin needed.

    In fact, it’s only because of the latex plugin that you need the trick in the first place.

  42. Mung,

    When you say that “thermodynamic entropy is a Shannon entropy” I have no idea what you mean because you do not explain what you mean.

    I assumed that you understood what a Shannon entropy is. If not, you could look it up.

    Let’s assume that not all Shannon entropy is thermodynamic entropy. How do you explain the difference? How is thermodynamic entropy different from Shannon entropy?

    Simple. A thermodynamic entropy is an entropy in which the microstates are thermodynamic microstates (for example, an ideal gas microstate that specifies the position and momentum of each molecule). An observer possesses a certain amount of information about the exact microstate of the system. The gap that exists between that information and the information required to pin down the exact microstate is the entropy.

    Let me explain further. Ben-Naim asserts that SMI can be defined for any probability distribution (e.g., your ‘deck of cards’ examples). But thermodynamic entropy is not defined for just any probability distribution. Not every probability distribution is relevant to thermodynamics.

    Thermodynamic entropy is defined for any epistemic probability distribution over thermodynamic microstates, just as “card deck entropy” is defined for any epistemic probability distribution over card deck microstates.

    Remember, the system is really in just one microstate at a time. The uncertainty is all in our heads — that is, the probability distribution is epistemic. Any experiment that changes the probability distribution changes the entropy. We gain information, the probability distribution changes, the entropy decreases.

    Whether the information is gained via “macroscopic” or “microscopic” observations is irrelevant. Any information that helps us zero in on the exact microstate is perfectly fine.

    And if we happen to discover the exact microstate, as Damon does in the thermodynamic case and as we can do in the card deck scenario, then the entropy is zero.

  43. Mung,

    Apart from the macrostate variables of thermodynamics, what other knowledge (information) of the microstate is available to us mere mortals?

    We are limited to assigning probabilities and dealing with well-known probability distributions. Our only access to the system is macroscopic.

    You are very close to getting it.

    The reason thermodynamic entropy calculations are usually based on macroscopic variables is not because other information is illegitimate. It’s because other information is damn hard to get!

    Contrast that to the card deck example, where “microscopic” information is easily available. We can all be Damons in the card deck world, as I explained above:

    The mistake you and walto are making is in thinking that macroscopic variables are the only “admissible” information when calculating entropy. They aren’t. Any information that helps narrow down the possible microstates is admissible.

    In the card deck example, “odds before evens” counts as a “macroscopic” description of the deck’s state, and knowing that the deck has “odds before evens” reduces the entropy from what it would be if all you knew was that the deck had been shuffled.

    However, macroscopic descriptions are not the only admissible ones when calculating “card deck entropy”. Suppose I shuffle the cards thoroughly and randomly so that all possible orderings are equiprobable. At this point there are 120 possible orderings — 120 possible microstates, in other words. All orderings are possible, and entropy is at a maximum.

    I look at the first card and see that it is the ‘4’. How many possible microstates are there now? Only 24. The ‘4’ is now known, and there are only 4! (4 x 3 x 2 x 1) ways of completing the sequence. The entropy has been reduced.

    Now I look at the second card and see that it’s the ‘1’. How many possible microstates are there now? Only 6. The ‘4’ and the ‘1’ are now known, and there are only 3! (3 x 2 x 1) ways of completing the sequence. The entropy has been reduced again.

    Suppose I keep going until I’ve turned over all the cards. At that point, like Damon, I know the exact sequence. There is only one possible microstate, and the entropy is therefore zero.

    It all makes sense: as I look at cards, one by one, I gain information about the state of the deck. As I gain information, the amount of missing information decreases. As the amount of missing information decreases, the entropy decreases, because entropy is a measure of the missing information. When I know the exact microstate — the exact sequence of cards — the entropy is zero.

    Anything that increases our information about the microstate reduces the entropy, and this is just as true of thermodynamic entropy as it is of “card deck entropy.”

    There is no rule against “microscopic” information in the card deck scenario, and likewise there is no rule against microscopic information in case of thermodynamic entropy. It’s just that microscopic information is far easier to get in the card deck case than in the thermodynamic case. Scientists use the information they have, and typically the only information they have is macroscopic.

  44. keiths:

    Yolanda’s Xi include the isotopic concentrations. Xavier’s Xi do not. Both are legitimate macrostates leading to legitimate entropy values. Entropy is observer-dependent but not subjective.

    Mung:

    It’s subjective in the sense that it is observer-dependent.

    Calling it ‘subjective’ just invites misunderstanding, which is exactly what Jaynes experienced:

    The reactions of some readers to my use of the word ‘subjective’ in these articles was astonishing….There is something patently ridiculous in the sight of a grown man recoiling in horror from something so harmless as a three-syllable word. ‘Subjective’ must surely be the most effective scare word yet invented. Yet it was used in what still seems a valid sense: ‘depending on the observer’.

    I anticipated the problem and avoided it. Jaynes had to learn the hard way.

    Which of the two, Yolanda or Xavier, should revise their calculation in light of new information which they previously lacked, and why? Or why not?

    It depends on what they are trying to achieve. They both calculated the entropy correctly, so if that is all they were trying to accomplish, then they’ve achieved the goal and nothing more needs to be done.

    Entropy can vary with time, due both to the evolution of the system and changes in the observer’s epistemic status. If the goal is to keep the entropy value up to date, then of course it needs to be recalculated when relevant changes occur.

  45. Mung,

    Take the analogy of throwing a pair of dice. We cannot be 100% certain that a seven will appear when the dice are thrown, nor can we be 100% certain that a four will not appear when the dice are thrown. I offer to play a game of dice with you where every time a seven is thrown you pay me 10 dollars and every time a four is thrown I pay you 18 dollars. Would you play this game?

    Thanks. Can you show me how to estimate the probability of a given micro state?
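
    One standard answer to that closing question, which neither party spells out here: for a system in equilibrium with a heat bath at temperature T, the canonical ensemble assigns microstate i the probability pi = exp(−Ei/kBT)/Z. A sketch with made-up energy levels:

    ```python
    import math

    k_B = 1.380649e-23               # J/K
    T = 300.0                        # assumed bath temperature, K

    energies = [0.0, 1e-21, 2e-21]   # hypothetical microstate energies, J

    # Canonical ensemble: p_i = exp(-E_i / k_B T) / Z
    weights = [math.exp(-E / (k_B * T)) for E in energies]
    Z = sum(weights)                           # partition function
    probabilities = [w / Z for w in weights]

    for E, p in zip(energies, probabilities):
        print(f"E = {E:.1e} J  ->  p = {p:.3f}")
    ```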
