In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system, entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue in his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (small shoe size suggests we’re dealing with a toddler, hence a non-reader), but no one would define reading ability as shoe size. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger’s and Moran’s books aren’t on that list. Their books, however, did make the list of biochem books judged by some Australians as decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates
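
To make the formula concrete, here is a minimal Python sketch (the microstate count W below is just a toy illustration, not a real physical system):

import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    # S = k ln W for a macrostate with W equally probable microstates
    return k_B * math.log(W)

# Toy example: 100 independent two-state particles give W = 2^100 microstates
print(boltzmann_entropy(2**100))  # about 9.6e-22 J/K
print(boltzmann_entropy(1))       # exactly 0 -- one microstate means zero entropy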

Also there is Clausius:

delta-S = Integral (dq/T)

where
delta-S = change in entropy
dq = inexact differential of q (heat)
T = absolute temperature
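
To see the Clausius formula in action, here is a minimal Python sketch for the textbook case of ice melting at its melting point, where T is constant during the phase change so the integral collapses to q/T (the 334 J/g latent heat of fusion is the standard value; the 100 g of ice is just an assumed example):

# Clausius: delta-S = Integral(dq/T) = q/T when T is constant (a phase change)
mass_g = 100.0      # assumed amount of ice, grams
L_fusion = 334.0    # latent heat of fusion of water, J/g (standard value)
T_melt = 273.15     # melting temperature, kelvin

q = mass_g * L_fusion    # heat absorbed by the ice, joules
delta_S = q / T_melt     # entropy change, J/K
print(delta_S)           # about 122 J/K -- and no mention of disorder anywhere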

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. keiths: Says the guy who just mentioned Shannon entropy.

    Given what I have said previously in this thread about Shannon entropy, what’s the problem? Should we just ignore my stated position?

  2. The world has moved on, Mung. Shannon entropy is entropy. So is thermodynamic entropy.

    Get used to it.

    Eigen is right:

    An entropy can be assigned to any probability distribution.
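
    For what it’s worth, here is a minimal Python sketch of what “an entropy can be assigned to any probability distribution” means in practice (the distributions are made up for illustration):

    import math

    def shannon_entropy(probs):
        # H = -sum p*log2(p), in bits; zero-probability terms contribute nothing
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_entropy([0.5, 0.5]))    # 1.0 bit (fair coin)
    print(shannon_entropy([0.9, 0.1]))    # ~0.47 bits (biased coin)
    print(shannon_entropy([0.25] * 4))    # 2.0 bits (uniform over four outcomes)
    print(shannon_entropy([1.0]))         # 0.0 bits (no uncertainty at all)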

  3. keiths: He didn’t choose the microstate. Read it again, noting the words in bold:

    He chose to determine the exact microstate. His macrostate has an ensemble of one single microstate.

    So there was only one exact microstate, regardless of his choice to determine the exact microstate?

    I must admit, I am still confused.

    That Damon had a choice implies that there were alternative microstates.

    That Damon chose to determine the exact microstate implies that Damon did not know the exact microstate.

    How Damon’s choice to determine the microstate affected the actual physics remains a mystery.

    keiths, you were doing so well in this thread. Don’t let it all unravel.

  4. keiths: The world has moved on, Mung. Shannon entropy is entropy. So is thermodynamic entropy.

    Shannon entropy is not entropy. Shannon entropy is not thermodynamic entropy. Thermodynamic entropy is entropy. The world moves on, regardless.

  5. Mung,

    So there was only one exact microstate, regardless of his choice to determine the exact microstate?

    Yes. A thermodynamic system is in one microstate at a time. You didn’t realize that?

    I must admit, I am still confused.

    I must admit, I am not surprised.

    That Damon had a choice implies that there were alternative microstates.

    No. The choice was whether to determine the exact microstate.

    That Damon chose to determine the exact microstate implies that Damon did not know the exact microstate.

    Yes, unless you believe that Laplacean demons can never opt out of knowing the exact microstate of every system they consider. In that case the macrostate would be forced on him rather than being chosen, but with the same result: an entropy of zero.

    How Damon’s choice to determine the microstate affected the actual physics remains a mystery.

    It didn’t affect the physics. Where do you get the strange idea that it did?

  6. keiths:

    Think of it this way: Every macrostate is associated with a set of possible microstates. When you establish a macrostate, you’re essentially saying “I know that the actual microstate must be one of these M microstates.” The less information you have about the system, the bigger M is. The more information you have, the smaller M is.

    If you’re Damon the Demon, then M is 1. There’s only one epistemically possible microstate, so the entropy is zero.

    Mung:

    X[avier] and Y[olanda] disagree.

    Epistemic possibility, like epistemic probability, is observer-relative. Just like entropy.

  7. keiths: The choice was whether to determine the exact microstate.

    According to you, there was only one possible microstate. Else Damon the Demon would be left to determine the actual microstate just like anyone else, by asking yes/no questions.

  8. Mung,

    According to you, there was only one possible microstate.

    One metaphysically possible microstate. Epistemic possibility is another kettle of fish entirely — and an observer-dependent one.

  9. keiths: One metaphysically possible microstate. Epistemic possibility is another kettle of fish entirely — and an observer-dependent one.

    Where’s Damon?

  10. Jesus, Mung.

    When Damon exercises his observational superpowers, the number of epistemically possible microstates drops to one, because he determines the actual microstate. Epistemic possibility matches metaphysical possibility at that point.

    I explained it already:

    Think of it this way: Every macrostate is associated with a set of possible microstates. When you establish a macrostate, you’re essentially saying “I know that the actual microstate must be one of these M microstates.” The less information you have about the system, the bigger M is. The more information you have, the smaller M is.

    If you’re Damon the Demon, then M is 1. There’s only one epistemically possible microstate, so the entropy is zero.

    For Xavier and Yolanda, M is much greater than 1. There are many epistemically possible microstates, so the entropy is nonzero.
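
    A minimal sketch of that arithmetic, assuming equally probable microstates so the missing information is just log2 of M (the particular M values for the three observers are made up for illustration):

    import math

    def missing_information_bits(M):
        # entropy, in bits, of a macrostate with M equally likely microstates
        return math.log2(M)

    print(missing_information_bits(2**80))  # Xavier: coarse macrostate, 80 bits missing
    print(missing_information_bits(2**40))  # Yolanda: finer macrostate, 40 bits missing
    print(missing_information_bits(1))      # Damon: exact microstate known, 0 bits missing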

  11. keiths: For Xavier and Yolanda, M is much greater than 1. There are many epistemically possible microstates, so the entropy is nonzero.

    I missed the derivation of ‘M’.

    For Damon, there are many epistemically possible microstates, so the entropy is nonzero.

  12. Mung,

    I missed the derivation of ‘M’.

    There is no single formula. It depends on the system and what the observer knows about it. See if you can figure out how I did it in the “odds before evens” card deck case, where M was equal to 12.

    For Damon, there are many epistemically possible microstates, so the entropy is nonzero.

    No. Once he has determined the exact microstate at time t, it is the only epistemically possible state at time t for him.

  13. Hi colewd:

    I really think you have a great handle on the issues here. What’s interesting is that physicists like Bekenstein, Penrose and Hawking used the Shannon/Boltzmann information to support their theories of cosmology. Without real experimental backup their conclusions become very suspect. An example is Bekenstein’s calculation of the entropy of a black hole.

    Hi,

    Thanks for the kind words.

    I should mention something that is fairly important for historical reasons.

    The Boltzmann and Gibbs formulation of statistical mechanics was originally Newtonian (the proper word is “classical”), not quantum mechanical nor relativistic. Even today, many Boltzmann-like formulations of statistical mechanics are first described in classical terms rather than quantum mechanical terms. The more accurate description is of course quantum mechanical at the molecular level, but classical approaches are still used for many molecular interactions.

    There are relativistic versions that were pioneered by Bekenstein, Hawking, and astrophysicists trying to describe entropy in situations where general relativity plays a role. To my knowledge, since we haven’t been able to formulate a unified theory that includes quantum mechanics (very small scale), general relativity (large scale) and special relativity (high speed), I don’t personally know of any statistical mechanics that incorporates quantum mechanics and relativity simultaneously.

    Much of the statistical mechanics is classical and quantum (as needed).

    So this puts a twist on much of this discussion. There are two major schools of defining entropy: the most dominant from a practical standpoint is Clausius, while the most comprehensive is Boltzmann, though with qualifications. We might classify the entropy approaches as:

    1. Clausius

    2. Boltzmann/Gibbs under Newtonian (Classical) mechanics, which is the original Boltzmann

    3. Boltzmann under quantum mechanics, which is a modernization of Boltzmann

    4. Boltzmann under general relativity, yet another modernization of Boltzmann for the sake of studying black holes, cosmology and astrophysics.

    Add to all this, Shannon adapted Boltzmann’s equations to Shannon’s version of information theory. Oh, that’s another thing, there are multiple definitions of information theory, but let’s not open that can of worms right now.

  14. I’ll give a qualitative description of entropy under Boltzmann and information theory and try to relate it to Clausius.

    The Clausius approach to thermodynamics was very practical and less theoretical. The desire was to make better steam engines for things like trains and boats or whatever. It is amazing that we were able to build so many things with so little of the theoretical knowledge we have today.

    I just found this on Wiki:

    https://en.wikipedia.org/wiki/Steam_engine

    As the development of steam engines progressed through the 18th century, various attempts were made to apply them to road and railway use.[35] In 1784, William Murdoch, a Scottish inventor, built a prototype steam road locomotive.[36] An early working model of a steam rail locomotive was designed and constructed by steamboat pioneer John Fitch in the United States probably during the 1780s or 1790s.[37] His steam locomotive used interior bladed wheels guided by rails or tracks.

    The first full-scale working railway steam locomotive was built by Richard Trevithick in the United Kingdom and, on 21 February 1804, the world’s first railway journey took place as Trevithick’s unnamed steam locomotive hauled a train along the tramway from the Pen-y-darren ironworks, near Merthyr Tydfil to Abercynon in south Wales.[35][38][39] The design incorporated a number of important innovations that included using high-pressure steam which reduced the weight of the engine and increased its efficiency. Trevithick visited the Newcastle area later in 1804 and the colliery railways in north-east England became the leading centre for experimentation and development of steam locomotives.[40]

    Clausius published his ideas of entropy in 1865, so this was after steam engines were already in use. It is noteworthy that Clausius was a student (I don’t know if a proponent) of the now-discredited caloric theory of heat, which postulated that heat was a fluid that flowed between objects. Clausius’s definition of entropy works with caloric heat theory. See:

    https://en.wikipedia.org/wiki/Caloric_theory

    Rumford’s experiment inspired the work of James Prescott Joule and others towards the middle of the 19th century. In 1850, Rudolf Clausius published a paper showing that the two theories were indeed compatible, as long as the calorists’ principle of the conservation of heat was replaced by a principle of conservation of energy. In this way, the caloric theory was absorbed into the annals of physics, and evolved into modern thermodynamics, in which heat may formally be put equivalent to kinetic energy of some particles (atoms, molecules) of the substance.

    So as far as Clausius was concerned, and for the applications entropy was used for, the existence of atoms vs. caloric heat was not immediately relevant!

    In contrast, Boltzmann and Gibbs believed that matter was made of atoms, and the atomic view was often greeted with extreme hostility. They postulated that if matter is made of atoms and/or molecules, and if those atoms/molecules behaved like billiard balls under classical (Newtonian) mechanics, then the phenomena of heat and temperature could be described as the statistical net result of the mechanical action of buzzillions of these particles.

    So at the root of their theories is the statistical properties of the mechanical behavior of a large ensemble of particles, hence the phrase STATISTICAL MECHANICS.

    The important connection that Boltzmann and Gibbs made to physics is relating the statistical properties of a large ensemble of atomic/molecular particles to what we experience as heat and temperature.

    Up until that point, thermodynamics (as in heat and temperature) seemed a separate field from classical mechanics, but Boltzmann and Gibbs bridged the disciplines and helped unify thermodynamics with Newtonian mechanics.

    Now, a caveat: the advent of quantum mechanics, photonics, etc. puts a wrinkle in all this. But at the time, this particular unification of the Clausius heat/energy approach with Newtonian mechanics was on par with Einstein relating mass to energy and the speed of light:

    E = m c^2

    What Boltzmann and Gibbs did was relate statistical mechanics to Clausius’s entropy. I can state it somewhat formally:

    Clausius version of entropy:

    delta-S = Integral (dq/T)

    Boltzmann version of entropy:

    S = k ln W

    which implies under Boltzmann’s definition

    delta-S = k (ln W_final - ln W_initial)

    Unifying relation (under certain circumstances) of Clausius and Boltzmann:

    delta-S = Integral (dq/T) = k (ln W_final) - k (ln W_initial)

    So I could in principle solve the melting ice cube problem either under Clausius or Boltzmann, but trying to apply Boltzmann to melting ice cubes is a nightmare.

    One can use Clausius or Boltzmann to calculate the entropy of ideal gases, but even then the Boltzmann approach, though doable, is a bit of a headache, especially if quantum mechanics is added.
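
    As a minimal sketch of how the two approaches line up on the simplest ideal-gas case, here is a Python calculation for a reversible isothermal doubling of the volume, done both the Clausius way and the Boltzmann-counting way (one mole is just an assumed amount):

    import math

    R   = 8.314462618    # gas constant, J/(mol K)
    k_B = 1.380649e-23   # Boltzmann's constant, J/K
    N_A = 6.02214076e23  # Avogadro's number, 1/mol

    n = 1.0              # assumed: one mole of ideal gas
    ratio = 2.0          # volume doubles at constant temperature

    # Clausius route: for a reversible isothermal expansion, q_rev = n R T ln(V2/V1),
    # so delta-S = q_rev/T = n R ln(V2/V1)
    dS_clausius = n * R * math.log(ratio)

    # Boltzmann route: each particle's positional microstates scale with V,
    # so W_final/W_initial = (V2/V1)^N and delta-S = k ln(W_final/W_initial)
    N = n * N_A
    dS_boltzmann = k_B * N * math.log(ratio)

    print(dS_clausius, dS_boltzmann)  # both about 5.76 J/K -- same number, different bookkeeping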

    I illustrated how this is done here using a formula worked out by Sackur and Tetrode from Boltzmann’s equation, but even then it’s like hitting your head with a brick when the Clausius approach (energy dispersal) is easier:

    http://theskepticalzone.com/wp/in-slight-defense-of-granville-sewell-a-lehninger-larry-moran-l-boltzmann/comment-page-8/#comment-145774

    It’s rather painful to derive Sackur-Tetrode from the first principles of Boltzmann and quantum mechanics, so if I showed the full derivation it would be majorly ugly. Yikes!
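
    For the curious, here is just the end result, the Sackur-Tetrode formula, evaluated in a short Python sketch for one mole of argon at 298.15 K and 1 atm (this is only the final formula plugged in, not the derivation; it lands close to the tabulated standard molar entropy of argon, roughly 155 J per mol per K):

    import math

    k_B = 1.380649e-23    # Boltzmann's constant, J/K
    h   = 6.62607015e-34  # Planck's constant, J s
    N_A = 6.02214076e23   # Avogadro's number, 1/mol
    amu = 1.66053907e-27  # atomic mass unit, kg

    T = 298.15            # temperature, K
    P = 101325.0          # pressure, Pa (1 atm)
    m = 39.948 * amu      # mass of one argon atom, kg

    V = N_A * k_B * T / P                            # molar volume of an ideal gas, m^3
    lam = h / math.sqrt(2 * math.pi * m * k_B * T)   # thermal de Broglie wavelength, m

    # Sackur-Tetrode: S = N k [ ln( V / (N lambda^3) ) + 5/2 ]
    S_molar = N_A * k_B * (math.log(V / (N_A * lam**3)) + 2.5)
    print(S_molar)  # about 154.8 J/(mol K)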

    But if we treat Clausius as a first principle, the calculations are easier for most engineering and chemical applications.

    Because there are such radically different approaches to measuring and calculating entropy, the result is a lot of confusion.

  15. keiths: Eigen is right. Entropy is the superset, and thermodynamic entropy is a subset. It’s just one kind of entropy.

    Mung: Eigen is wrong. SMI is the superset and thermodynamic entropy is a subset of SMI. Thermodynamic entropy is the only kind of entropy.

    keiths: Thermodynamic entropy came first historically, but that’s not a good reason to grant it exclusive rights to the word “entropy” without the qualifier.

    mung: Thermodynamic entropy came first historically, and that’s a good reason to grant it exclusive rights to the word “entropy,” how it is used in thermodynamics.

    This is what I meant when I said the issue is largely a quibble.

    I don’t think it is entirely, though. Mung’s posts have reflected the intuition I’ve expressed that my cat (objectively) has a particular entropy regardless of how it is specified or happens to be measured. How can this be if entropy is a quantity of missing information and information is relative to what some observer knows?

    The answer, I think, lies in what I’ve said (inarticulately grunted, actually) about “constraints.” There’s always an infinite amount of knowledge that may be obtained about any macrostate, as well as an infinite number of perspectives that may be considered relevant to the measurement of the missing information. I think all parties will agree that thermodynamic entropy restricts the information that is considered relevant to such matters as temperature and energy. (For example, one may or may not know how much Beverly enjoys some particular macrostate, or how much that state resembles some macrostate on Jupiter, but neither of those is relevant here, so they’re not considered in the calculation of thermodynamic entropy.) So far, I think there will be agreement from all parties. The Boltzmann and Gibbs definitions don’t reflect an interest in ALL information–only in information relevant to the phenomenological features of the world that Clausius was writing about.

    Where there apparently remains disagreement is that I ALSO think that one may throw out some information that one such as Damon has that IS relevant to the arrangements of microstates, because I take some information to nevertheless be irrelevant to the thermodynamic calculation. Well, why isn’t it relevant? It certainly SEEMS to be, no?

    Put it this way. keiths has asked what the absolutely correct measurement is, if it is not measuring Xavier’s, Yolanda’s or Damon’s ignorance. Nobody can answer that, I think, because the tools that are used may be improved without limit, and still not reach anything like Damon’s zero calculation. I think this apparent paradox is resolved by recognizing that Damon is considering things that may not be relevant to a thermodynamic entropy calculation, even if his knowledge entails a particular position and momentum of every included particle.

    Suppose, for example, that Damon’s brother, Ramon, also calculates the statistical entropies of every macrostate as zero, simply because a number pops into his head that is always right. Both Ramon and Damon always get zero, no matter whether the cat they’re looking at is dead or alive. The problem is that 2LOT implies that the thermodynamic investigation MUST produce a higher entropy for the dead cat (as a closed system) unless we’re at zero Kelvin. That doesn’t mean that the thermodynamic investigation is not also an observer-relative quantification of missing information. I think those insisting on an objective result can concede that to the information theoretic view. It simply means that not ALL information can be relevant to the thermodynamics calculation. The chemist doesn’t need to care what number is popping into Ramon’s head at every moment. That’s not the sort of thing s/he’s interested in.

    That’s probably not much better than what I’ve said before, but maybe it gives some sense of what I mean by “constrained” or “hybrid.” In a word, I agree with much of what keiths has written on this thread, but I still think Damon’s answers provide a reductio of the claim that statistical entropy is identical to thermodynamic entropy. For Shannon et al., everything matters. But I think for Sal, Lambert, and the chemists, some apparently relevant information must be tossed out, because, e.g., the number of degrees of freedom of the bodies is not intended to be further restricted by conceivable laws of nature (or Ramon’s dumb luck).

    It’s like the consideration of a regular two headed coin. In our statistics books, we don’t find discussions of the various forces applying to the coin tosses. That stuff is irrelevant in that context: if it’s a fair coin, there are two possibilities, each with a probability of .5 of obtaining. Period. The nature of the discourse in the statistics text in this way “constrains” the information considered to be relevant. When we calculate probabilities for our stats profs, we intentionally IGNORE the barometric pressure and whether the coin thrower has sweaty palms.

    Similarly, IMO, while the CORRECT measurement of thermodynamic entropy can get (limitlessly) better than Yolanda’s estimate because there’s no limit to how much better her measuring equipment can get, except when the macrostate she’s considering is at absolute zero, her result, though approaching what I’d call the “correct” answer, will never be anything like Damon’s. And this is true no matter how awesome her equipment gets (even if she’s using the Thermocalculatron 3000!). This is not because Damon is in error, but because he’s measuring statistical entropy, which does not constrain the relevant information, and she’s tossing out information that is not relevant to what we might call the thermodynamic attitude to this question. Not only is she not measuring EVERYTHING; she’s not even measuring everything relevant to possible arrangements of microstates. She’s considering only some subset of information (laws and conditions) that produce those results. Her measurements get better and better–closer and closer to the correct answer–but they will not generally be anything like Damon’s or Ramon’s. Those guys’ answers will only be correct at zero Kelvin–because, in a sense, they know too much.

  16. walto: For Shannon et al., everything matters.

    Just one note. Even for Shannon information it’s only the probability distribution that matters. Even if you don’t know the distribution you can select the one that will maximize the SMI and use that.

    Maximum Entropy Principle

    This pdf has some cool links.
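
    A minimal sketch of that principle in the simplest setting (no constraint beyond the probabilities summing to 1): among all distributions over a fixed number of outcomes, the uniform one maximizes the SMI, so that is the one you pick when you know nothing else. The example distributions are made up:

    import math

    def smi(probs):
        # Shannon Measure of Information, in bits
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(smi([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits -- the maximum for four outcomes
    print(smi([0.40, 0.30, 0.20, 0.10]))  # ~1.85 bits, lower
    print(smi([0.70, 0.10, 0.10, 0.10]))  # ~1.36 bits, lower still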

  17. Mung: Just one note. Even for Shannon information it’s only the probability distribution that matters

    Right, but ANYTHING that affects the probability distribution matters. So, e.g., for Damon, there’s a zero pct. probability of anything happening that he’s not expecting.

  18. Here are some articles that I think set forth the constraints I have in mind for the statistical information to be thermodynamic:

    https://www.eng.fsu.edu/~dommelen/quantum/style_a/qsbv.html
    https://en.wikipedia.org/wiki/Thermodynamic_system
    https://en.wikipedia.org/wiki/List_of_thermodynamic_properties
    https://en.wikibooks.org/wiki/Engineering_Thermodynamics/Thermodynamic_Systems
    https://en.wikipedia.org/wiki/State_function

    A thermodynamic system is the material and radiative content of a macroscopic volume in space, that can be adequately described by thermodynamic state variables such as temperature, entropy, internal energy and pressure.
    **********************************************************
    List of state functions
    See also: List of thermodynamic properties

    The following are considered to be state functions in thermodynamics:

    Mass
    Energy (E)
    Enthalpy (H)
    Internal energy (U)
    Gibbs free energy (G)
    Helmholtz free energy (F)
    Exergy (B)
    Entropy (S)

    Pressure (P)
    Temperature (T)
    Volume (V)
    Chemical composition
    Specific volume (v) or its reciprocal Density (ρ)
    Fugacity
    Altitude
    Particle number (ni)

    So, what I’m trying to say is that, to the extent that anybody’s information extends beyond what can be expressed by assigning values to those variables, it’s not thermodynamic information. Shannon information is not constrained in that fashion. keith has already said that he agrees that thermodynamic entropy is a subset of Shannon entropy. But I think it also follows that when Damon says “zero”–he’s not talking about thermodynamic entropy at all.

  19. walto,

    Much of your confusion could be dispelled if you would just keep the following in mind:

    1. A system is in one microstate at any given moment.

    2. The macrostate is essentially a summary of what is known about which microstate the system is in.

    3. The macrostate rules out some microstates and renders others possible, with varying probabilities. In other words, it establishes a probability distribution over the microstates.

    4. The probability distribution is epistemic. The system is always in exactly one microstate, but we (except for Damon) don’t know which one. The probability distribution represents our uncertainty — our missing information.

    5. The entropy is essentially a distillation of that probability distribution — that missing information — into a single number.

    6. For an observer like Damon who knows the exact microstate, there is no missing information. He knows precisely which microstate the system is in, with a probability of 1. Therefore the entropy is zero for him.

    7. There is a range of possible observers with progressively more missing information than Damon. Those with small amounts of missing information will calculate small entropies; those with large amounts of missing information will calculate large entropies.

    8. There is no single “correct” value of entropy because entropy is not a property of the system by itself. It’s a measurement of the missing information, which depends on both the system and on the observer. It can vary from observer to observer.

    Later today, I’ll address your comments above in light of these points.

  20. keiths:
    walto,

    Much of your confusion could be dispelled if you would just keep the following in mind:

    1. A system is in one microstate at any given moment.

    2. The macrostate is essentially a summary of what is known about which microstate the system is in.

    3. The macrostate rules out some microstates and renders others possible, with varying probabilities. In other words, it establishes a probability distribution over the microstates.

    4. The probability distribution is epistemic. The system is always in exactly one microstate, but we (except for Damon) don’t know which one. The probability distribution represents our uncertainty — our missing information.

    5. The entropy is essentially a distillation of that probability distribution — that missing information — into a single number.

    6. For an observer like Damon who knows the exact microstate, there is no missing information. He knows precisely which microstate the system is in, with a probability of 1. Therefore the entropy is zero for him.

    7. There is a range of possible observers with progressively more missing information than Damon. Those with small amounts of missing information will calculate small entropies; those with large amounts of missing information will calculate large entropies.

    8. There is no single “correct” value of entropy because entropy is not a property of the system by itself. It’s a measurement of the missing information, which depends on both the system and on the observer. It can vary from observer to observer.

    Later today, I’ll address your comments above in light of these points.

    None of that has a single thing to do with anything I wrote. Now THAT’S confusion!

  21. keiths:
    That you don’t see the connection is an indication of your confusion.

    Stay tuned.

    Well it’s connected in the sense that it’s almost completely consistent with what I wrote, if that’s what you mean. I think you have some imaginary adversary you’re bent on vanquishing. Instead of your customary attacks on dispersalism, I think it might be helpful for you to summarize what you think our differences are.

  22. walto,

    Well it’s connected in the sense that it’s almost completely consistent with what I wrote, if that’s what you mean. I think you have some imaginary adversary you’re bent on vanquishing.

    You can’t be serious.

    You’ve disagreed with me on every single one of the following points:

    1. Entropy is not a measure of energy dispersal.
    2. Entropy is a measure of missing information regarding the microstate.
    3. Entropy is observer-dependent.
    4. Entropy is not a function of the system alone.
    5. Entropy is a function of the macrostate.
    6. The entropy is zero for an observer who knows the exact microstate.
    7. The second law is not violated for such an observer.
    8. Stipulating an observer-independent entropy forces it to be zero for everyone, rendering it useless for entropy comparisons.
    9. The ‘missing information’ interpretation of entropy works in all cases.

    I’m sure I’d find more if I were to reread the thread.

    Take responsibility for what you write, walto.

  23. walto,

    So, what I’m trying to say is that, to the extent that anybody’s information extends beyond what can be expressed by assigning values to those variables, it’s not thermodynamic information.

    You’re seriously arguing that information about a thermodynamic system’s microstate is not thermodynamic information?

    Besides, even if that were right, there could still be more than one macrostate, and therefore more than one entropy, for a given system. Think of Xavier and Yolanda. Their macrostates are different, but both are correct, and both satisfy your unnecessary constraints regarding “thermodynamic information”.

    keith has already said that he agrees that thermodynamic entropy is a subset of Shannon entropy. But I think it also follows that when Damon says “zero”–he’s not talking about thermodynamic entropy at all.

    I explained this to Mung earlier:

    Mung:

    By the way, I don’t think this demon who can see all is relevant to entropy, which only applies to thermodynamic systems, which are macroscopic. IOW, as soon as you do away with the macroscopic/microscopic distinction you’re no longer talking thermodynamic entropy.

    keiths:

    The demon is just the limiting case. There’s a continuum of possible observers with differing amounts of missing information. As the missing information decreases, so does the entropy. When it hits zero (with Damon and his fellow demons), thermodynamics reduces to ordinary physics. Statistical mechanics becomes just plain mechanics, in other words.

  24. walto:

    Mung’s posts have reflected the intuition I’ve expressed that my cat (objectively) has a particular entropy regardless of how it is specified or happens to be measured.

    Your cat isn’t in an equilibrium state. I hope.

    How can this be if entropy is a quantity of missing information and information is relative to what some observer knows?

    The answer, I think, lies in what I’ve said (inarticulately grunted, actually) about “constraints.” There’s always an infinite amount of knowledge that may be obtained about any macrostate, as well as an infinite number of perspectives that may be considered relevant to the measurement of the missing information. I think all parties will agree that thermodynamic entropy restricts the information that is considered relevant to such matters as temperature and energy. (For example, one may or may not know how much Beverly enjoys some particular macrostate, or how much that state resembles some macrostate on Jupiter, but neither of those is relevant here, so they’re not considered in the calculation of thermodynamic entropy.) So far, I think there will be agreement from all parties. The Boltzmann and Gibbs definitions don’t reflect an interest in ALL information–only in information relevant to the phenomenological features of the world that Clausius was writing about.

    Entropy is not just information, it’s the missing information between the macrostate and the exact microstate. Any information that narrows down the possible microstates is therefore relevant to entropy.

    Where there apparently remains disagreement is that I ALSO think that one may throw out some information that one such as Damon has that IS relevant to the arrangements of microstates, because I take some information to nevertheless be irrelevant to the thermodynamic calculation. Well, why isn’t it relevant? It certain SEEMS to be, no?

    The reason scientists focus on macroscopic variables like temperature and volume is not because any extra information is illegitimate — it’s because extra information is hard to get!

    Put it this way. keiths has asked what the absolutely correct measurement is, if it is not measuring Xavier’s, Yolanda’s or Damon’s ignorance. Nobody can answer that, I think, because the tools that are used may be improved without limit, and still not reach anything like Damon’s zero calculation. I think this apparent paradox is resolved by recognizing that Damon is considering things that may not be relevant to a thermodynamic entropy calculation, even if his knowledge entails a particular position and momentum of every included particle.

    He considers information that narrows down the possible microstates of a thermodynamic system to one. It’s relevant information about a thermodynamic system, so it’s relevant to entropy.

    Suppose, for example, that Damon’s brother, Ramon, also calculates the statistical entropies of every macrostate as zero, simply because a number pops into his head that is always right. Both Ramon and Damon always get zero, no matter whether the cat they’re looking at is dead or alive. The problem is that 2LOT implies that the thermodynamic investigation MUST produce a higher entropy for the dead cat (as a closed system) unless we’re at zero Kelvin.

    As I mentioned already, a living cat is not in equilibrium. Also, cats, whether dead or alive, are not closed systems. Setting all of that aside, you’re also assuming that death would invariably increase the entropy of a body. Not true, and the Second Law doesn’t require it.

    In a word, I agree with much of what keiths has written on this thread, but I still think Damon’s answers provide a reductio of the claim that statistical entropy is identical to thermodynamic entropy.

    I am not making that claim. Instead, I say that thermodynamic entropy is a subset of statistical entropy.

    Similarly, IMO, while the CORRECT measurement of thermodynamic entropy can get (limitlessly) better than Yolanda’s estimate because there’s no limit to how much better her measuring equipment can get, except when the macrostate she’s considering is at absolute zero, her result, though approaching what I’d call the “correct” answer, will never be anything like Damon’s.

    As Yolanda’s information gets better and better, her answer approaches Damon’s. More information allows her to narrow down the possible microstates. Eventually there’s only one left and the entropy becomes zero for Yolanda just as it is for Damon.

  25. keiths,
    I’ve said what I agree with and what I disagree with about your position as clearly as I can. You’ve both mischaracterized my thoughts (such as they are) repeatedly and repeated your own views (many times). For what purpose, I don’t know. You like seeing your remarks in print over and over, I guess.

    In the words of Michael Palin, that’s not what I paid for! The only new remark I can find in our last couple of posts that suggests you may have read what I wrote is this:

    You’re seriously arguing that information about a thermodynamic system’s microstate is not thermodynamic information?

    And the answer is yes. I argued that some of it isn’t–just as some of the information that’s relevant to the probability of a coin coming up heads or a 3D object moving in a certain direction isn’t statistics. That was, basically, the claim of my last couple of posts. I have also provided a list from thermodynamics sites of state variables that ARE relevant.

    I think the issue is largely a quibble, however, because I agree that it makes sense to consider thermodynamic entropy a subset of (informational) entropy. It’s thus a matter of very little importance in my opinion that Damon and Ramon are reductios when considering the thermodynamic subset. Their position remains in play in wider concepts of entropy.

    As it is extremely important to you that everyone agree with every iota of your posts, I’m sorry that you have found no one here to do that on this issue–as on the matter of ‘knowledge*.’ But I agree with the great majority of what you have said about thermodynamic entropy. I’m convinced, in particular, that it’s missing information of a certain kind. And I appreciate both your and Sal’s (and mung’s and jock’s) contributions on this thread.

    FWIW, I’ve also found Prof. Adami’s blogs on information and entropy helpful.

  26. keiths: There’s a continuum of possible observers with differing amounts of missing information. As the missing information decreases, so does the entropy.

    I would say that practically speaking the continuum ends just shy of Damon and that particular spot is the objective ideal.

    All the other perspectives are approximations of that ideal.

    Damon’s perspective is not available to us physical creatures even in theory.

    peace

  27. fifthmonarchyman,

    Whether or not you’re right about this (interesting) suggestion, for reasons already given, I’d restrict its application to some subset(s) of informational entropy that exclude the thermodynamic subset.

    Mike Elzinga: Asserting that entropy is about “lack of information” is a non-sequitur. Lack of information about what; you don’t know what microstate a system is in? How is that better than saying that the probabilities of states are distributed more uniformly? And have you not encountered Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein distributions? What is it that one doesn’t know?

    Mike Elzinga: [Thermodynamic entropy] is the logarithm of the number of energy microstates consistent with the macroscopic state of a thermodynamic system. The reason for the logarithm is because of how it changes with the total internal energy of the system and because it also makes the entropy extensive; namely it scales with the number of particles and with the volume of the system.

    Everything after that is figuring out how to enumerate those states. This is the really hard part. Counting may involve correlations that make the entropy of the parts of a system different from the entropy of the total. Those are technical details.

    But knowing this allows one to link together the variables that describe the macroscopic state of the system (variables such as empirical temperature, volume, pressure, magnetization, number of particles, etc.).

    Remember, entropy is simply a name given to a mathematical expression involving the quantity of heat and the temperature. This is not unusual; names are given to reoccurring mathematical expressions all the time. Clausius picked the name for the reasons he gave in that excerpt I showed above. There is no emotional baggage connected with the name.

    Knowing how it is calculated from the microscopic perspective of a system gives us a better handle on what is actually going on at that level. But this is no different from any other mathematical analysis of a physical system. It’s just another handle.

  28. fifth,

    I would say that practically speaking the continuum ends just shy of Damon and that particular spot is the objective ideal.

    All the other perspectives are approximations of that ideal.

    I’m guessing that’s because you want to reserve Damon’s spot for God. If so, then what’s so special about knowing the exact microstate of a thermodynamic system?

    How does it differ in principle from knowing the exact order of a deck of cards? Is the latter something you wish to deny to humans, also?

  29. keiths:

    You’re seriously arguing that information about a thermodynamic system’s microstate is not thermodynamic information?

    walto:

    And the answer is yes. I argued that some of it isn’t–just as some of the information that’s relevant to the probability of a coin coming up heads or a 3D object moving in a certain direction isn’t statistics.

    If the information is relevant to the probabilities, it’s statistical information. And if it’s relevant to the microstate probabilities of a thermodynamic system, it’s thermodynamic information.

    Do you seriously think that if scientists found a new way to gain information about a system’s microstate, they’d say “Oh, no, we can’t take advantage of this in our entropy calculations. It isn’t on walto’s List of Approved Thermodynamic Variables“?

  30. walto,

    You asked for a summary:

    I think it might be helpful for you to summarize what you think our differences are.

    I gave you one:

    You’ve disagreed with me on every single one of the following points:

    1. Entropy is not a measure of energy dispersal.
    2. Entropy is a measure of missing information regarding the microstate.
    3. Entropy is observer-dependent.
    4. Entropy is not a function of the system alone.
    5. Entropy is a function of the macrostate.
    6. The entropy is zero for an observer who knows the exact microstate.
    7. The second law is not violated for such an observer.
    8. Stipulating an observer-independent entropy forces it to be zero for everyone, rendering it useless for entropy comparisons.
    9. The ‘missing information’ interpretation of entropy works in all cases.

    I’m sure I’d find more if I were to reread the thread.

    You seem to have reversed your position on #1. As far as I can tell, you now agree that a) entropy is not a measure of energy dispersal, and b) that Lambert’s definition is therefore incorrect. (I’ve asked you directly, but you won’t give a direct answer.)

    What about the others? Do you still disagree with #2 through #9, or have you changed your mind on those, also?

  31. keiths: the information is relevant to the probabilities, it’s statistical information. And if it’s relevant to the microstate probabilities of a thermodynamic system, it’s thermodynamic information.

    I join Ben-Naim (2012) in disagreeing with you about that:

    [T]he thermodynamic entropy is the maximal value of the SMI [Shannon Measure of Information] when applied to certain distribution functions relevant to the thermodynamic system. (emphasis added)

    I think it’s clear that he meant ONLY when so applied. There’s a lot of information available or potentially available that is relevant to the microstates but is nevertheless not relevant to thermodynamic entropy.

    As FMM intimates, Damon isn’t just really, really good at thermodynamics. He’s got other stuff going on.

  32. keiths,

    I’m afraid that for a lot of your list I’d have to say, “yes and no.” That’s mostly because I think “entropy” is ambiguous. With question 1, I think I’d give something like that answer–for different reasons, though.

    One exception to my yeses and nos, however:

    8. makes no sense to me at all. I’ve never agreed with that, and I don’t think you have either.

  33. keiths: You’re seriously arguing that information about a thermodynamic system’s microstate is not thermodynamic information?

    I would argue that. Is “thermodynamic information” a special kind of information? What are its units?

    Also, you’re conflating “Shannon information” with information in its colloquial sense.

  34. Mung,

    You’re missing the point. It’s walto who is arguing that only certain information, carried by Official Thermodynamic Variables™ such as temperature, is admissible in determining thermodynamic entropy.

    I’m saying that any information that helps you narrow down the microstate is admissible.

    As I put it earlier:

    Do you seriously think that if scientists found a new way to gain information about a system’s microstate, they’d say “Oh, no, we can’t take advantage of this in our entropy calculations. It isn’t on walto’s List of Approved Thermodynamic Variables“?

  35. Mung,

    Also, you’re conflating “Shannon information” with information in its colloquial sense.

    Where?

  36. walto,

    I’m afraid that for a lot of your list I’d have to say, “yes and no.” That’s mostly because I think “entropy” is ambiguous.

    Don’t play dumb. You know perfectly well that I’m talking about thermodynamic entropy in those nine points. I’ll repeat my question: Do you still disagree with #2 through #9, or have you changed your mind on those, also?

    With question 1, I think I’d give something like that answer–for different reasons, though.

    What reasons are those?

    8. makes no sense to me at all. I’ve never agreed with that, and I don’t think you have either.

    Sure I have, and I’ve explained why.

    Your #3…

    3. that probabilities are “objective” in the sense of not being relative to anybody’s (or everybody’s) knowledge of facts;

    …can’t be satisfied unless you adopt an all-or-nothing probability distribution, as I described earlier:

    For that to be true, the probability distribution function would have to have an infinite spike (the Dirac delta function) at the exact microstate and be zero elsewhere.

    And if you adopt such a distribution, then you’re back to zero entropy for all systems in equilibrium.

    By demanding that entropy be observer-independent, you force it to be zero, rendering entropy comparisons useless.

    The missing information view doesn’t have that problem, because it allows entropy to be observer-dependent. Observer-dependence in turn allows for nonzero values of entropy.

    I ask you to toss a coin. You toss a coin. I ask you if the coin came up heads. You answer yes, the coin came up heads. You have provided me with some information, information in the colloquial sense of information that we are all familiar with.

    How much information did you give me? How much “missing information” was there? Can we quantify “an amount of information”? That’s what Shannon information is about.
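
    A one-line sketch of that quantification for the fair-coin case: the answer “heads” resolves one binary alternative, so it carries exactly one bit.

    import math

    p_heads = 0.5                    # fair coin
    info_bits = -math.log2(p_heads)  # surprisal of learning "heads"
    print(info_bits)                 # 1.0 bit -- the amount of missing information resolved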

  38. walto:

    I join Ben-Naim (2012) in disagreeing with you about that:

    And:

    So I guess you don’t really quite agree with Ben-Naim either. 🙁

    Your argument from Lambert’s authority failed. Are you really going to try to do the same thing with Ben-Naim, who disagrees with Lambert?

  39. Mung,

    I’m asking you where I’ve conflated Shannon information with information in the colloquial sense.

    Quote, please.

  40. keiths:
    walto:

    And:

    Your argument from Lambert’s authority failed.Are you really going to try to do the same thing with Ben-Naim, who disagrees with Lambert?

    Am I REALLY? Uh-oh. Is this another trap!?!

  41. Me: You’re conflating “Shannon information” with information in its colloquial sense.

    keiths: I’m asking you where I’ve conflated Shannon information with information in the colloquial sense.

    Dude, the quote was in the post you responded to. Seriously.

    You’re seriously arguing that information about a thermodynamic system’s microstate is not thermodynamic information?

    Are you actually claiming that the measure of entropy provided by using Shannon’s measure of information provides information about a thermodynamic system’s microstate?

    Missing information is information that is MISSING. Right?

    So are both your uses of “information” in the above quote information in the colloquial sense? Or are both using “information” in the Shannon sense?

    Because if you are using the term in the colloquial sense you are not using it in the Shannon sense, and if you are not using it in the Shannon sense you are not talking about entropy as a measure of missing information. And that is because Shannon information does not measure information in the colloquial sense of the word. You understand that, right?

  42. Self-made, self-sprung traps are the worst. Sal is still trying to get out of his. But he does seem to be spending some time away actually trying to learn more.

  43. Mung,

    Self-made, self-sprung traps are the worst. Sal is still trying to get out of his. But he does seem to be spending some time away actually trying to learn more.

    Face-saving isn’t the best motive, but if it spurs him to learn, it’s better than nothing.

  44. keiths:

    I’m asking you where I’ve conflated Shannon information with information in the colloquial sense.

    Mung:

    Dude, the quote was in the post you responded to. Seriously.

    Dude, there was no such conflation in that quote. Seriously.

    keiths:

    You’re seriously arguing that information about a thermodynamic system’s microstate is not thermodynamic information?

    Mung:

    Are you actually claiming that the measure of entropy provided by using Shannon’s measure of information provides information about a thermodynamic system’s microstate?

    No, I’m saying that any information that helps us narrow down a thermodynamic system’s microstate qualifies as thermodynamic information, and that walto’s argument that such information Must Be Excluded from entropy calculations is silly.

    Missing information is information that is MISSING. Right?

    Yes. Isn’t that amazing?

    So are both your uses of “information” in the above quote information in the colloquial sense? Or are both using “information” in the Shannon sense?

    Because if you are using the term in the colloquial sense you are not using it in the Shannon sense, and if you are not using it in the Shannon sense you are not talking about entropy.as a measure of missing information. And that is because Shannon information does not measure information in the colloquial sense of the word. You understand that, right?

    They aren’t mutually exclusive, Mung. You understand that, right?
