In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem with the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue in his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (a small shoe size suggests we’re dealing with a toddler, and therefore a non-reader), yet shoe size is not reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger and Moran’s books aren’t on that list. Their books, however, did make the list of biochemistry texts judged by some Australians as decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates
log = the natural logarithm
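
To make the formula concrete, here is a minimal Python sketch (the microstate count W below is an arbitrary number chosen purely for illustration; dividing by Boltzmann’s constant and by ln 2 converts J/K into bits):

import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K

def boltzmann_entropy(W):
    """S = k ln W, with W the number of equally probable microstates."""
    return k_B * math.log(W)

def entropy_in_bits(S):
    """Convert a thermodynamic entropy in J/K to bits: divide by k and by ln 2."""
    return S / (k_B * math.log(2))

W = 2**100                     # purely illustrative microstate count
S = boltzmann_entropy(W)
print(S)                       # ~9.57e-22 J/K
print(entropy_in_bits(S))      # ~100 bits, i.e. log2(W)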

Also there is Clausius:

delta-S = Integral (dq_rev / T)

where
delta-S = change in entropy
dq_rev = inexact differential of q (heat exchanged reversibly)
T = absolute temperature
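
As a quick worked example of the Clausius formula (a back-of-the-envelope sketch using standard textbook values for water: enthalpy of fusion about 6.01 kJ/mol, melting point 273.15 K), the entropy change for reversibly melting one mole of ice reduces to q_rev/T, since the temperature stays constant during the phase change:

# Entropy change for reversibly melting one mole of ice at its melting point.
# Because T is constant during the phase change, Integral(dq_rev/T) reduces
# to q_rev / T.
q_rev = 6010.0   # J/mol, approximate enthalpy of fusion of water
T = 273.15       # K, melting point of ice

delta_S = q_rev / T
print(delta_S)   # ~22.0 J/(mol*K)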

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. walto,

    You quoted Jaynes a couple of days ago.

    I was surprised:

    It’s odd that you’d cite Jaynes on this, since his quote undermines the argument you’ve been making.

    Here’s the quote with some additional context. It comes from Jaynes’ paper Gibbs vs Boltzmann Entropies:

    VI. THE “ANTHROPOMORPHIC” NATURE OF ENTROPY

    After the above insistence that any demonstration of the second law must involve the entropy as measured experimentally, it may come as a shock to realize that, nevertheless, thermodynamics knows of no such notion as the “entropy of a physical system.” Thermodynamics does have the concept of the entropy of a thermodynamic system; but a given physical system corresponds to many different thermodynamic systems…

    Consider, for example, a crystal of Rochelle salt. For one set of experiments on it, we work with temperature, pressure, and volume. The entropy can be expressed as some function S_e(T, P). For another set of experiments on the same crystal, we work with temperature, the component e_xy of the strain tensor, and the component P_z of electric polarization; the entropy as found in these experiments is a function S_e(T, e_xy, P_z). It is clearly meaningless to ask, “What is the entropy of the crystal?” unless we first specify the set of parameters which define its thermodynamic state.

    One might reply that in each of the experiments cited, we have used only part of the degrees of freedom of the system, and there is a “true” entropy which is a function of all these parameters simultaneously. However, we can always introduce as many new degrees of freedom as we please…

    Our crystal is now a thermodynamic system of over 1000 degrees of freedom; but we still believe that the laws of thermodynamics would hold. So, the entropy must be a function of over 1000 independent variables. There is no end to this search for the ultimate “true” entropy until we have reached the point where we control the location of each atom independently. But just at that point the notion of entropy collapses, and we are no longer talking about thermodynamics!

    From this we see that entropy is an anthropomorphic concept, not only in the well-known statistical sense that it measures the extent of human ignorance as to the microstate. Even at the purely phenomenological level, entropy is an anthropomorphic concept. For it is a property, not of the physical system, but of the particular experiments you or I choose to perform on it.

  2. walto,

    That quote is a pretty devastating argument against your view and for mine.

    Your mistake was to bet the farm on this single sentence:

    But just at that point the notion of entropy collapses, and we are no longer talking about thermodynamics!

    That’s an overstatement by Jaynes. Thermodynamics doesn’t cease when the entropy becomes zero — that’s what happens when a system is at absolute zero, after all. Likewise, statistics doesn’t cease when a probability becomes one. Those are just special cases of thermodynamics and statistics, respectively.

    The entire quote shows why your position is untenable. There is no single “true” entropy of a physical system. It depends on the experiments chosen by the observer; that is, it depends on the observer’s choice of macrostate. This remains true even when all of the macrostate parameters come from Walto’s List of Approved Thermodynamic Variables.

    There’s yet another lesson here about arguments from authority: Don’t try to make them if you don’t understand what your chosen authority is saying. It can backfire badly on you, as it did here.

  3. Apologies to commenters upthread to whom I owe replies. A lurker has very kindly supplied me with material (including Ben-Naim’s Entropy Demystified.) It seems more useful to work through that than continue here.

  4. keiths: That’s an overstatement by Jaynes.

    I get it, I shouldn’t have used the quote because you disagree with it. Thanks for the tip.

  5. Alan Fox:
    Apologies to commenters upthread to whom I owe replies. A lurker has very kindly supplied me with material (including Ben-Naim’s Entropy Demystified.) It seems more useful to work through that than continue here.

    I agree. He’s another expert who agrees with keiths right up to the point where he doesn’t. keiths just can’t get that Damon is a perfect reductio of his position, but as Jaynes and Ben-Naim do understand this point (and I think Jock does too), it probably makes sense to ignore keiths’ view on this–as on so many other–issues.

    Oh, and we should also keep in mind that posting remarks by authorities is inappropriate, unless they are Gell-Mann, or somebody else keiths can find who might agree with him about something (at least if nobody looks too closely).

    Einstein–Not to be cited
    Jaynes–Not to be cited
    Ben-Naim–Not to be cited
    Lambert–Not to be cited
    Gell-Mann–Appropriate to cite

    Got it.

  6. Regarding observer dependence, I will argue that we have to be careful: it does not mean subjective. It means something more like “convention” or “idealized instrument” dependence.

    Thermometers are idealized instruments for measuring average energy per particle (with some qualification). In the case of a monatomic ideal gas, temperature can be expressed in terms of kinetic energy per particle, i.e. in joules (or, more typically, in electron volts).

    So if we carry out entropy measurements with thermometers, the uncertainty in the measurement is a matter of principle, not subjectivity. If I report an average energy per particle of, say, 5.5 million electron volts (8.81 x 10^-13 joules), I have no details about the exact energy of each atom. Hence we have uncertainty as to which microstate the entire system is in.

    Suppose we had the following table of the atoms in 2 configurations or “microstates”.

    The atoms are labeled A, B, C, D, E, F, G, H, I and J. Each state lists the energy in millions of electron volts.

    Note that each microstate has an average energy per atom of 5.5 million electron volts even though the individual energies of the atoms differ. The thermometer will only register an average of 5.5 million electron volts per particle; it will have uncertainty about the actual energy of each particle. This uncertainty about the exact energies and other exact details constitutes entropy (the short sketch after the tables below makes this concrete).


    microstate 1

    A 1
    B 2
    C 3
    D 4
    E 5
    F 6
    G 7
    H 8
    I 9
    J 10

    vs.


    microstate 2

    A 10
    B 9
    C 8
    D 7
    E 6
    F 5
    G 4
    H 3
    I 2
    J 1
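
    Here is a minimal Python sketch of the point, using the toy numbers from the tables above (treating every assignment of these ten energy values to the ten atoms as equally probable is purely an illustrative assumption):

    import math

    # Toy example: ten atoms, energies in millions of electron volts (MeV).
    atoms = list("ABCDEFGHIJ")
    microstate_1 = dict(zip(atoms, [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]))
    microstate_2 = dict(zip(atoms, [10, 9, 8, 7, 6, 5, 4, 3, 2, 1]))

    # The thermometer sees only the average energy per particle.
    avg_1 = sum(microstate_1.values()) / len(microstate_1)
    avg_2 = sum(microstate_2.values()) / len(microstate_2)
    print(avg_1, avg_2)   # 5.5 and 5.5: indistinguishable to the thermometer

    # If all 10! ways of assigning these ten energy values to the ten atoms
    # are treated as equally probable microstates, the thermometer's missing
    # information, in bits, is log2(10!).
    W = math.factorial(10)
    print(W)              # 3628800 microstates
    print(math.log2(W))   # ~21.8 bits of missing information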

    I should note this is a toy example. Real distributions of energy will obey:
    https://en.wikipedia.org/wiki/Maxwell%E2%80%93Boltzmann_distribution

    But the idea is that at any moment the distribution of energy over the atoms will change, and each re-distribution is a microstate. Our thermometer, however, will be oblivious to the exact details of the position and momentum of each particle at each instant.

    Now if we could violate the Heisenberg uncertainty principle, obtain exact position and momentum data at every instant, and further predict the trajectory of each particle with infinite accuracy, we would have zero entropy, since Q-reversible would be zero in principle. This was the infamous Maxwell’s demon paradox.

    I reported on a famous paper that showed entropy would be zero if we were omniscient (note the author Leo Szilard sitting by Einstein):

    Landmark 1929 Physics Paper: On the Decrease of Entropy in a Thermodynamic System by the Intervention of Intelligent Beings

    That’s probably why Jaynes said that if we define entropy so finely that we have all the details of the system (omniscience), thermodynamic entropy effectively collapses.

    If one is omniscient, there is no uncertainty, and hence no possible increase in knowledge, hence no entropy. But this is bordering on a Captain Obvious conclusion.

  7. My point is a bit different, Sal. The Shannon entropy is still calculable at the limit. It’s zero. But it’s not thermodynamics anymore. Not all information is relevant. That’s what I think Jaynes et al are saying. Thermodynamics involves a particular set of variables; and one may take as axiomatic both that thermodynamic entropy is zero at zero kelvin and likely to be increasing otherwise. Both of those may fail for Damon. Not because thermodynamic entropy doesn’t measure information, but because it doesn’t measure ALL information. What info does it measure? As Einstein said, it depends on one’s physical theory. Damon’s info is not constrained: it’s like jamming statistics books with dicta regarding the sweatiness of the coin tosser’s palms. It’s neither that such info is false nor that it has no effect on the probability of the results. It’s that it’s not what’s being talked about. It’s like bringing up the effect of prayer on hurricane developments.

  8. The Shannon entropy is still calculable at the limit.

    I was unaware of this. You’re going into territory that’s outside of my knowledge base. Thanks for pointing this out.

  9. walto,

    Oh, and we should also keep in mind that posting remarks by authorities is inappropriate, unless they are Gell-Mann, or somebody else keiths can find who might agree with him about something (at least if nobody looks too closely).

    Einstein–Not to be cited
    Jaynes–Not to be cited
    Ben-Naim–Not to be cited
    Lambert–Not to be cited
    Gell-Mann–Appropriate to cite

    Got it.

    That’s pitiful, walto.

    The problem isn’t with citations, but with bogus arguments from authority. Particularly in this case, where you didn’t understand what the authority was arguing and failed to notice that he was undermining your position.

  10. keiths:
    walto,

    That’s pitiful, walto.

    The problem isn’t with citations, but with bogus arguments from authority.Particularly in this case, where you didn’t understand what the authority was arguing and failed to notice that he was undermining your position.

    Got it–they’re bogus just in case they disagree with you. And, btw, it’s clear that Jaynes does. At the limit it’s NOT thermodynamics.

    To put it in Jock’s terms–Maxwell demon–Yes; Laplace demon (i.e. yours)–No.

  11. Try to be honest about this for a change, keiths. That Jaynes quote agrees with me, not you.

    Fortunately, you don’t care about such trivialities as whether your views comport with those of acknowledged experts and will continue to repeat them and attack those who disagree with them (in ANY respect!) over and over until the end of days.

    In that sense, it IS ‘deepity woo.’

  12. walto,

    Got it–they’re bogus just in case they disagree with you.

    Um, no.

    They’re bogus if they depend on the assumption that your authority cannot be wrong. Citing someone to make a point is perfectly fine, but you need to be able to defend that point.

    Here you cited an authority without even understanding what he was saying. Jaynes didn’t buttress your argument; he dismantled it. There is no single “true” entropy of a physical system:

    After the above insistence that any demonstration of the second law must involve the entropy as measured experimentally, it may come as a shock to realize that, nevertheless, thermodynamics knows of no such notion as the “entropy of a physical system.” …

    From this we see that entropy is an anthropomorphic concept, not only in the well-known statistical sense that it measures the extent of human ignorance as to the microstate. Even at the purely phenomenological level, entropy is an anthropomorphic concept. For it is a property, not of the physical system, but of the particular experiments you or I choose to perform on it.

    Entropy is observer-dependent.

  13. Wrong, keiths.

    I agree with Jaynes’ quote in its entirety, including his claim that its measurement is observer-dependent. I also agree with him that at the Damonic limit it is not thermodynamics. You don’t.

    I think Jaynes is right and you are wrong. (Same for Einstein and Ben-Naim.)

  14. Sal,

    If one is omniscient, there is no uncertainty, and hence no possible increase in knowledge, hence no entropy. But this is bordering on a Captain Obvious conclusion.

    Obvious to you and me, but walto somehow thinks it’s a reductio:

    keiths just can’t get that Damon is a perfect reductio of his position…

    walto,

    If it’s a reductio, where is the absurdum?

    Xavier, Yolanda and Damon see different entropies because for each of them, entropy is a measure of the missing information between their particular macrostate and the exact microstate.

    In Damon’s case, there is no missing information. The entropy is therefore zero.

    Where’s the absurdum?

  15. keiths: Where’s the absurdum?

    Asked and answered about fifty times. It’s absurd to claim that that calculation is of THERMODYNAMIC entropy. And why it’s important to you that it should be is, if not hilarious, at least wicked weird.

    For Damon, entropy measurements are ALWAYS zero. The claim that those are thermodynamic estimates is absurd.

  16. walto,

    It’s absurd to claim that that calculation is of THERMODYNAMIC entropy.

    So a thermodynamic entropy of 0.00…01 bit is fine, but zero is beyond the pale?

    Would you say the same of statistics? That a probability of .99…99 is fine, but a probability of 1.0 is beyond the pale?

    You’re placing an undue amount of importance on that last .00…01, don’t you think?

  17. walto,

    For Damon, entropy measurements are ALWAYS zero. The claim that those are thermodynamic estimates is absurd.

    They aren’t estimates. For Damon, the entropy really is zero.

    After fighting it tooth and nail, you now accept that entropy is a measure of missing information. Why do you think that the missing information can’t be zero in Damon’s case? He knows the exact microstate.

    Are you proposing that we redefine entropy to be “a measure of the missing information, but only if it’s not zero”?

  18. walto,

    I agree with Jaynes’ quote in its entirety, including his claim that its measurement is observer-dependent.

    He doesn’t merely claim that the measurement is observer-dependent. He says that entropy itself is observer-dependent:

    After the above insistence that any demonstration of the second law must involve the entropy as measured experimentally, it may come as a shock to realize that, nevertheless, thermodynamics knows of no such notion as the “entropy of a physical system”…

    From this we see that entropy is an anthropomorphic concept, not only in the well-known statistical sense that it measures the extent of human ignorance as to the microstate. Even at the purely phenomenological level, entropy is an anthropomorphic concept. For it is a property, not of the physical system, but of the particular experiments you or I choose to perform on it.

    You’ve already reversed yourself on dispersalism and the missing information view. Why not take this final step, reverse yourself again, and embrace the observer-dependence of entropy?

  19. keiths: After fighting it tooth and nail, you now accept that entropy is a measure of missing information. Why do you think that the missing information can’t be zero in Damon’s case? He knows the exact microstate.

    Are you proposing that we redefine entropy to be “a measure of the missing information, but only if it’s not zero”?

    No no no no no. There’s no missing info for Damon, who knows everything about everything; it’s not the .00001 biz, which has nothing to do with anything I’ve written; Damon’s estimate is perfectly accurate and the entropy he calculates is exactly zero. But, as Jaynes and others point out, it’s not thermodynamic entropy. To be relevant to thermodynamics, information must be constrained by the science. E.g., thermodynamic entropy must be zero at zero kelvin, but nothing requires Damonic entropy to be zero at that temp. Only certain variables (having to do with energy, volume, heat, etc.) matter.

    Go back and read my posts again more slowly. I’m tired of repeating this stuff.

  20. keiths: You’ve already reversed yourself on dispersalism and the missing information view. Why not take this final step, reverse yourself again, and embrace the observer-dependence of entropy?

    I already have. As I’ve indicated at least twice, I was convinced by Adami and others of this some time ago. (Your posts and Jock’s played a role too.) What YOU need to do is take the FIRST step to the recognition that Shannon entropy is not identical to thermodynamic entropy. Then everybody will agree and we can all go home and have the PBJs I promised.

  21. keiths:

    You’ve already reversed yourself on dispersalism and the missing information view. Why not take this final step, reverse yourself again, and embrace the observer-dependence of entropy?

    walto:

    I already have. As I’ve indicated at least twice, I was convinced by Adami and others of this some time ago. (Your posts and Jock’s played a role too.)

    You accepted the missing information view, but not observer-dependence. The following comment of yours came after your last mention of Adami:

    One other thing worth noting: This is a false quadiotomy regarding thermodynamic entropy:

    1. Xavier is right when he calculates it.
    2. Yolanda is right when she calculates it.
    3. Damon is right using all the information in the universe about everything and assuming whatever he doesn’t know about every microstate is zero.
    4. We can say precisely what the correct one is.

    We get closer as our equipment gets better, but, because of the limitations of measurement abilities, we can never get it precisely “right.”

    You were still claiming that there was a single “right” answer at that point.

    Just to be absolutely clear: Do you now accept that entropy itself, and not just its measurement, is observer-dependent? Do you accept that Xavier and Yolanda are both assessing entropy (and entropy change) correctly, despite coming up with different answers?

    What YOU need to do is take the FIRST step to the recognition that Shannon entropy is not identical to thermodynamic entropy.

    I’ve never claimed that they are identical. Thermodynamic entropy is a particular kind of Shannon entropy, but there are many kinds of Shannon entropy that are not thermodynamic.

    Damon’s entropy is thermodynamic entropy. It is a measure of the gap in information between his thermodynamic macrostate and the exact thermodynamic microstate of the system. That gap is zero, so Damon sees a thermodynamic entropy of zero.

  22. keiths,

    Xavier, Yolanda and Damon see different entropies because for each of them, entropy is a measure of the missing information between their particular macro state and the exact micro state.

    Can you reconcile this with the second law of thermodynamics?

  23. colewd,

    Can you reconcile this with the second law of thermodynamics?

    Yes. The second law simply says that the entropy of an isolated system must either remain the same or increase. That’s true for Xavier, Yolanda, and even Damon.

    For Damon the entropy is always zero, so it falls into the “remains the same” case.

  24. keiths,

    Yes. The second law simply says that the entropy of an isolated system must either remain the same or increase. That’s true for Xavier, Yolanda, and even Damon.

    Can you state this with an example without using the word entropy?

  25. keiths: You were still claiming that there was a single “right” answer at that point.

    Just to be absolutely clear: Do you now accept that entropy itself, and not just its measurement, is observer-dependent? Do you accept that Xavier and Yolanda are both assessing entropy (and entropy change) correctly, despite coming up with different answers?

    Depends what you mean by “observer-dependent.” I still think there’s a sense in which Yolanda and Xavier are not “both right.” Their calculations may both be correct, however, because they’re measuring their own ignorance. But it’s also misleading to say they’re both right, unless we explain what we mean. If, for example, either calculates a thermodynamic entropy that is violative of any of the laws, I’d say they will have been wrong about the thermodynamics of the macrostate, regardless of the states of their own limitations.

    Again, it’s like two weight measurements. They may both be “correct” as far as they go, but one is more ACCURATE than the other; thus, in another sense closer to being “correct.”

    When it comes to thermodynamics, Yolanda is “getting there” as compared with Xavier; Damon may have passed the mark. The correct measure is of OUR ignorance, science’s–not yours or mine–up to some ideal, unachievable limit (which is not Damon’s, because he’s left thermodynamics and entered Laplaceville).

  26. stcordova: The thermodynamic entropy in J/K can be converted to bits by simply dividing by Boltzmann’s constant and then converting the natural log measure to log-base-2 measure.

    And bits are the unit of measure for what, Salvador?

    You either see the connection between entropy and information or you don’t. You don’t get to have it both ways. For as you also stated in the OP:

    As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

    My answer: Nowhere!

  27. keiths: …but no one has produced even a single scenario in which the “missing information” interpretation fails.

    The “missing information” interpretation fails in at least two cases:

    1.) When there is no “missing information.”
    2.) When there is no observer.

  28. keiths: There is no single “true” entropy of a physical system. It depends on the experiments chosen by the observer; that is, it depends on the observer’s choice of macrostate.

    Jaynes: After the above insistence that any demonstration of the second law must involve the entropy as measured experimentally, it may come as a shock to realize that, nevertheless, thermodynamics knows of no such notion as the “entropy of a physical system.”

  29. Walto, I have to ask why you are convinced there is a single correct value for entropy.

    From its inception, the concept of entropy has been operationally defined. The “Law” has been a formula, not a theory. This is an excellent example of a theory being a higher-level concept than a law.

  30. keiths: You’re placing an undue amount of importance on that last .00…01, don’t you think?

    Is that a picobit? I wasn’t aware that there could be fractions of a bit. Learn something new every day.

  31. petrushka: Walto, I have to ask why you are convinced there is a single correct value for entropy.

    I think there are, as Jaynes says, many different correct values for entropy. This depends on what variables are considered relevant, what is being looked for. But if we decide what parameters are of interest, settle, even for a moment on one particular set we’re interested in, there aren’t 10000000 different correct answers, depending on who’s looking for it. Entropy is (entropies are, really) not subjective. In other words, there is entropy1, entropy2,….entropyn. And a correct value for each, up to some level of accuracy.

    There is but one entropy for keiths’ Damon, however. So you probably meant to direct your question to him.

  32. I don’t have a problem with keiths’ interpretation.

    Unless he tells me I’m wrong, his definition is operational. I’m trying to understand why you object to that.

  33. It leaves the term “information” unconstrained and so, useless.

    I doubt you’ve read many of the posts here, petrushka, or you wouldn’t be asking this. I really don’t want to have to repeat them. keiths enjoys reposting what he’s already said several hundred times; I don’t.

  34. walto: It leaves the term “information” unconstrained and so, useless.

    Any definition of information that isn’t operational is useless.

    True, I haven’t read every post. What is the point of listening to a stuck record (applies to all sides in this discussion) for days on end?

    I asked you a rather simple and limited question.

  35. walto: Damon’s is not operational.

    Nor realizable. But thought experiments reveal problems with concepts.

    Relativity, both special and general, use thought experiments to illustrate the inadequacy of classical understandings. Short formulas can be perfectly acceptable to engineers and are usually easier to implement, and yet fail in various scenarios.

  36. petrushka: Any definition of information that isn’t operational is useless.

    petrushka: walto: Damon’s is not operational.

    Nor realizable. But thought experiments reveal problems with concepts.

    Relativity, both special and general, use thought experiments to illustrate the inadequacy of classical understandings. Short formulas can be perfectly acceptable to engineers and are usually easier to implement, and yet fail in various scenarios.

    So….is that non-operational definition useless or not useless?

  37. keiths: The checking account balance is analogous to the total entropy of the universe.

    So you agree that the total entropy of the universe can decrease, even to the point of having a zero entropy balance?

    If the checking account balance is analogous to the total entropy of the universe, what is the deposit analogous to? Mass/energy entering the universe from outside?

    In my thinking, your bank account was supposed to be analogous to a thermodynamic system. As such, the balance is a property of the account. Now if you say the balance is analogous to the entropy of the universe, then it seems to me to follow that the account itself is the universe.

    This means you are claiming that the universe itself is a thermodynamic system. Can you defend that claim?

  38. walto: So….is that non-operational definition useless or not useless?

    What non-operational definition? Thought experiments are not definitions. In the cases I’m discussing, they are extrapolations of formulas to extreme conditions. A concept that doesn’t work at extreme conditions may be useful, but incomplete.

  39. Ok, I have no idea what you’re saying now. You wrote this:

    petrushka: I don’t have a problem with keiths’ interpretation.

    Unless he tells me I’m wrong, his definition is operational.

    I indicated that I don’t think his definition of information is operational. But if, as you write, HE has to tell you this, then I have no idea why I’m responding at all.

    May you enjoy your intercourse with him more than any of the rest of us has!

  40. I don’t know who you include in “us”.

    I’m not interested in what Sal says, or FMM.

    I’m interested in what DNA_Jock says, and I pay attention to the responses to mung.

  41. Walto:

    I think there are, as Jaynes says, many different correct values for entropy. This depends on what variables are considered relevant, what is being looked for. But if we decide what parameters are of interest, settle, even for a moment on one particular set we’re interested in, there aren’t 10000000 different correct answers, depending on who’s looking for it. Entropy is (entropies are, really) not subjective. In other words, there is entropy1, entropy2,….entropyn. And a correct value for each, up to some level of accuracy.

    Agree.

    There is usually a default entropy assumed, like the one of interest when calculating the Clausius entropy, which is tied to the Clausius formulation of thermodynamics, where the major variables are temperature, heat, and quantities of substance.

    Standard Molar Entropies that chemists and engineers use obviously have some implicit definition of which entropy they are talking about. I’ve used those in many of my calculations.

  42. What is the point of listening to a stuck record (applies to all sides in this discussion) for days on end?

    I’ve given many examples of entropy calculations for gases and for the phase change from solid to liquid. I provided the angles from Clausius, Sackur-Tetrode, Boltzmann, and Shannon. I described microstates in terms of X, Y, Z, P_x, P_y, P_z.

    Hopefully you actually learned a little about calculating entropy for college-level quizzes and exams.

    I can’t say I learned anything from Keiths, Mung, or DNA_jock, and hence put them on my ignore list 500 comments ago.

    I did enjoy reading Walto and colewd’s comments.

  43. I think I’ve learned something from everybody who has posted on this thread…except maybe petrushka.

  44. walto,

    I think there are, as Jaynes says, many different correct values for entropy. This depends on what variables are considered relevant, what is being looked for. But if we decide what parameters are of interest, settle, even for a moment on one particular set we’re interested in, there aren’t 10000000 different correct answers, depending on who’s looking for it. Entropy is (entropies are, really) not subjective.

    Right. That’s why I’ve been careful to use “observer-dependent” and not “subjective” when describing entropy in this thread.

    The choice of macrostate is a function of the observer and the observer’s information. Once the macrostate has been established, the entropy follows objectively. Xavier’s entropy is objectively correct for his choice of macrostate. Ditto for Yolanda and Damon.

    The observer dependence comes in via the selection of the macrostate, not via the calculation of entropy from the selected macrostate. That’s why there can be tables of standard molar entropies, as Sal points out, and why there can be right and wrong answers to questions involving entropy on a thermodynamics exam.

    Entropy is observer-dependent because it depends on the observer’s choice of macrostate. As Jaynes says:

    Even at the purely phenomenological level, entropy is an anthropomorphic concept. For it is a property, not of the physical system, but of the particular experiments you or I choose to perform on it.
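
    To make the macrostate-dependence concrete, here is a toy Python sketch (the scenario and numbers are hypothetical, not anything Xavier or Yolanda actually measured). Suppose one observer’s macrostate records only the usual bulk variables for N gas molecules in a box, while a second observer’s macrostate additionally records which half of the box each molecule occupies. The second macrostate is compatible with a factor of 2^N fewer microstates, so its entropy is lower by exactly N*k*ln 2, i.e. one bit per molecule:

    import math

    k_B = 1.380649e-23  # Boltzmann's constant, J/K

    def entropy_reduction(N):
        """Entropy difference between the two macrostates: knowing which half
        of the box each of N molecules occupies removes one bit per molecule,
        i.e. N * k * ln 2 in thermodynamic units."""
        delta_S_joules_per_kelvin = N * k_B * math.log(2)
        delta_S_bits = N
        return delta_S_joules_per_kelvin, delta_S_bits

    # Hypothetical example: a mole's worth of molecules.
    print(entropy_reduction(6.022e23))   # (~5.76 J/K, ~6.0e23 bits)

    Both observers get objectively correct entropies for their own macrostates; they simply differ, which is exactly the observer-dependence at issue.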
