In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue in his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (a small shoe size suggests we’re dealing with a toddler, and therefore a non-reader), but shoe size is not reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So, in slight defense of Granville Sewell and numerous other creationists, from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus, who equate entropy with disorder: when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities and, unfortunately, one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger and Moran’s books aren’t on that list. Their books, however, did make the list of biochem books judged by some Australians as decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates
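
A minimal numerical sketch of that formula (assuming, purely for illustration, a toy system whose microstates we simply count; in Planck’s form, “log” is the natural logarithm):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """S = k ln W for a system with W equally probable microstates."""
    return K_B * math.log(W)

# Toy illustration: doubling the number of accessible microstates
# adds k*ln(2) ~ 9.57e-24 J/K to the entropy.
print(boltzmann_entropy(2) - boltzmann_entropy(1))
print(boltzmann_entropy(10**6))  # ~1.91e-22 J/K
```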

Also there is Clausius:

delta-S = Integral (dq/T)

where
delta-S = change in entropy
dq = inexact differential of q (heat)
T = absolute temperature
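
A minimal sketch of how the Clausius integral is evaluated in practice (assuming, for illustration, heat added reversibly to a substance with a constant heat capacity C, so dq = C dT; the heat capacity of liquid water is roughly 75.3 J/(mol K)):

```python
import math

def clausius_entropy_change(C, T1, T2, steps=100_000):
    """Numerically integrate dS = dq/T for reversible heating from T1 to T2,
    assuming dq = C dT with constant heat capacity C."""
    dT = (T2 - T1) / steps
    dS, T = 0.0, T1
    for _ in range(steps):
        dS += C * dT / (T + dT / 2)  # midpoint rule
        T += dT
    return dS

# Warming 1 mol of liquid water from 273.15 K to 373.15 K:
print(clausius_entropy_change(75.3, 273.15, 373.15))  # ~23.5 J/K
print(75.3 * math.log(373.15 / 273.15))               # closed form, same result
```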

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. You see, this is not helpful. Tell us what entropy is.

    Read the thread, Alan. We’ve discussed it again and again.

  2. keiths: Read the thread, Alan. We’ve discussed it again and again.

    Anyone else with a statement (comprehensible to a bright child) of what entropy is? Keiths demurs apparently.

  3. Alan Fox: You see, this is not helpful. Tell us what entropy is. Surely it is quicker than listing all the things that it isn’t.

    I have no credentials to comment, but that won’t stop me.

    Entropy appears to be the result of a calculation, based on observations.

    ETA: Forgive me for being hopelessly ignorant on this, but when you say something “is” you are implying there is an explanatory theory. I’m not aware that there is any theory behind thermodynamics. It is an observation. And a formula.

    Back to the regularly scheduled program.

  4. In the interests of full disclosure, I will note that my “light red v dark red” question DOES have an answer: I was asking it of Sal in order to see whether he understood anything behind all those equations he is so fond of spouting.
    I actually think that, appropriately framed, disorder can be a useful teaching analogy for entropy, and that likewise “dispersal” is a useful teaching analogy.
    But, as keiths notes, analogies can break down. Many trees and electrons have been wasted thanks to the reification of the disorder analogy; dispersal is responsible for rather less carnage.
    At the risk of starting a whole new argument, how I like to think about entropy is

    Entropy is the arrow of time: if a process is reversible, delta-S is zero, otherwise it increases as time ticks forwards.

    which is NOT a useful teaching analogy…

  5. Correct me if I’m wrong, but the arrow of time is an observation rather than a theory. I’m not aware of any physical theory that requires time to move in one direction.

    Perhaps I’m a victim of woo here.

  6. DNA_Jock:
    In the interests of full disclosure, I will note that my “light red v dark red” question DOES have an answer: I was asking it of Sal in order to see whether he understood anything behind all those equations he is so fond of spouting.
    I actually think that, appropriately framed, disorder can be a useful teaching analogy for entropy, and that likewise “dispersal” is a useful teaching analogy.
    But, as keiths notes, analogies can break down. Many trees and electrons have been wasted thanks to the reification of the disorder analogy; dispersal is responsible for rather less carnage.
    At the risk of starting a whole new argument, how I like to think about entropy is

    Entropy is the arrow of time: if a process is reversible, delta-S is zero, otherwise it increases as time ticks forwards.

    which is NOT a useful teaching analogy…

    Interesting. Do you think that is translatable into an ignorance-measurement interpretation? Do you agree with me that Damon constitutes a reductio of the ignorance theory or with keiths that it does not? Thanks.

  7. walto,

    I agree with keiths that it does not.
    For Damon, the entropy before the gas expansion is zero.
    After the gas expansion, Damon now sees entropy of N kB ln(2), the same change that everyone else sees.
    However, Damon can become Damon-prime by determining which half each atom is in. He cannot do this without increasing the entropy of the rest of the universe (which includes himself) by at least that amount, so no 2LoT violation has occurred. This is the resolution to the Maxwell’s demon paradox.
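
    A minimal numerical sketch of that N kB ln(2) figure (assuming an ideal gas whose volume doubles at constant temperature, so each molecule has twice as many positions available):

    ```python
    import math

    K_B = 1.380649e-23   # Boltzmann's constant, J/K
    N_A = 6.02214076e23  # Avogadro's number

    def free_expansion_entropy(N, volume_ratio=2.0):
        """delta-S = N k ln(volume_ratio) for N ideal-gas molecules expanding
        into volume_ratio times the original volume at constant temperature."""
        return N * K_B * math.log(volume_ratio)

    # One mole doubling its volume: delta-S = R ln 2, about 5.76 J/K.
    print(free_expansion_entropy(N_A))
    ```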

  8. DNA_Jock,

    Thanks for that, but I remain confused (could be a permanent condition). I wonder if you would explain a bit more slowly. You write,

    DNA_Jock: After the gas expansion, Damon now sees entropy of N kB ln(2), the same change that everyone else sees.

    How is that assessment consistent with a position that maintains that Damon always (correctly) calculates an entropy of zero because of his complete understanding of every microstate? Have you worked out this problem by making two Damons?

  9. Walto,

    Sort of. Before the expansion, Damon has complete information. After the expansion, there’s an additional N bits of information, which old-Damon does not know. Even Damon cannot “magically” acquire this information. Damon has to do some work [heh] in order to update his knowledge base.
    If you were to define Damon as an entity that always correctly calculates…, then HIS entropy is going up when the gas expands.
    Hence the name, Damon the Demon…

  10. Alan Fox: Anyone else with a statement (comprehensible to a bright child) of what entropy is?

    Entropy is a special case of the Shannon measure of information.

    keiths is correct that this has been posted previously in the thread. Read my quotes from Ben-Naim.

  11. DNA_Jock:
    Walto,

    Sort of. Before the expansion, Damon has complete information. After the expansion, there’s an additional N bits of information, which old-Damon does not know. Even Damon cannot “magically” acquire this information. Damon has to do some work [heh] in order to update his knowledge base.
    If you were to define Damon as an entity that always correctly calculates…, then HIS entropy is going up when the gas expands.
    Hence the name, Damon the Demon…

    What prevents one from specifying a Damon as Laplace would, i.e., an individual that for any time t has all the info about every microstate at t?

    FWlittleIW, I’m inclined to agree with Ben-Naim (and Mung?) that the time’s arrow conception might not be consistent with the paucity-of-info conception.

  12. keiths: Case 2: The gases are indistinguishable…

    Let’s start with case two. In case two the probability distribution is the same in the before and after case. No change in the probability distribution means no change in SMI. There has been no change in “uncertainty”, no change in “missing information,” and no change in the number of yes/no questions needed to describe the system. No change in entropy.

    Since the number of possible microstates is the same, the entropy remains unchanged.

    The probability distribution remains unchanged. The entropy remains unchanged. ETA: The SMI remains unchanged.

    keiths: Case 1: The gases are distinguishable…

    How would you describe the initial states in terms of a probability distribution? It’s through the probability distribution that the connection is made to the Shannon measure of information.

    “Ways” is just another word for “microstates” here, so the number of possible microstates increases dramatically from “Before” to “After”.

    Would you agree that the probability distribution has become more uniform?

    Since W gets dramatically bigger when going from Before to After, the entropy S also increases…

    But it’s not really about W getting dramatically bigger, it’s about the probability distribution changing.

    What made the difference? The fact that in Case 1, the gases were distinguishable, while in Case 2, they were not.

    What made the difference between the two is the change in the probability distribution.

    my 0.02

  13. Mung: But it’s not really about W getting dramatically bigger, it’s about the probability distribution changing.

    I think keiths sees those as the same thing. But I ask you, does the number of degrees of freedom of a rigid body in a three-dimensional space change based on whether we know how this body is going to move? Does the “possibility space” of a coin change if one knows it will turn up tails?

  14. walto: But I ask you, does the number of degrees of freedom of a rigid body in a three-dimensional space change based on whether we know how this body is going to move?

    I could be way wrong here, but I don’t think the “degrees of freedom” of individual particles are relevant. I imagine they are averaged out and ignored.

    Does the “possibility space” of a coin change if one knows it will turn up tails?

    In my opinion, no. Unless it is a two-headed coin. 🙂

  15. https://en.wikipedia.org/wiki/Degrees_of_freedom_(physics_and_chemistry)

    In physics, a degree of freedom is an independent physical parameter in the formal description of the state of a physical system. The set of all dimensions of a system is known as a phase space, and degrees of freedom are sometimes referred to as its dimensions.

    ETA:

    In statistical mechanics, a degree of freedom is a single scalar number describing the microstate of a system. The specification of all microstates of a system is a point in the system’s phase space.

  16. Mung,

    For an ideal gas, the microstates at a given temperature are equiprobable. That’s why probabilities don’t appear explicitly in the Boltzmann equation:

    S = kb ln W

  17. walto:

    Do you agree with me that Damon constitutes a reductio of the ignorance theory or with keiths that it does not?

    DNA_Jock:

    I agree with keiths that it does not.
    For Damon, the entropy before the gas expansion is zero.

    Right.

    After the gas expansion, Damon now sees entropy of N kB ln(2), the same change that everyone else sees.

    That part I disagree with. Since Damon is a Laplacean demon, he can take his exact knowledge of the microstate and “simulate” it forward in time. If he knows the current microstate exactly, then he knows the future microstates exactly. The entropy is always zero from his perspective.

    However, Damon can become Damon-prime by determining which half each atom is in.

    Knowing the microstate of the system requires more than just knowing which side of the chamber each molecule is in. You need to know its position and momentum, and of course knowing the former tells you which side of the chamber it’s in.

  18. Now, we are ready to answer the question posed in the title [Does Entropy Change with Time?] of this section. The answer is: Definitely not! Of course, this question is meaningless if we do not specify the system we are dealing with. It is unfortunate that most authors of popular-science books will tell you that “entropy always increases.” This is a meaningless statement. We must first describe the system for which we ask about the entropy. A more meaningful question is: Given a thermodynamic system, i.e. a system of N (a very large number) particles in a box of volume V and having a fixed energy E, is its entropy a function of time? The answer is: No!

    The Briefest History of Time. p. 117

  19. Joe,

    Some final comments on the gas-mixing examples.

    My conclusion was

    So there’s no difference in energy dispersal between Case 1 and Case 2, but there is a difference in entropy. They can’t be the same thing.

    Entropy is not a measure of energy dispersal.

    This raises a question. Lambert is aware of the issue; he knows that entropy increases when the gases are distinguishable, but not otherwise. How does he reconcile this with the energy dispersal view of entropy? If the change in dispersal is the same in Case 1 and Case 2 — zero — then how can Lambert continue to believe that entropy is a measure of dispersal?

    Here’s his rationalization:

    The motional energy of the molecules of each component is more dispersed in a solution than is the motional energy of those molecules in the component’s pure state.

    [Emphasis added]

    That’s true, but notice that he has equivocated. Before, he claimed that entropy was just a measure of energy dispersal…

    Entropy change is measured by the dispersal of energy: how much is spread out in a process, or how widely dispersed it becomes — always at a specific temperature.

    …and now he’s saying that the entropy is a measure of the dispersal of energy of each component, considered separately and then combined.

    If that were the case, we should be able to designate any two components, measure the change in their individual energy dispersals, and add them together to get the entropy increase for the entire system.

    It doesn’t work.

    Let’s say that before the partition is removed, we designate the molecules on the left side as one component and the molecules on the right side as the other component. After the partition is removed, the molecules from each component spread to the other side. Their energy is dispersed. Therefore, the system’s entropy must have increased, right? Wrong. If the two components consist of molecules of the same gas — that’s Case 2 — then there is no entropy increase, as everyone, including Lambert, agrees. It’s only when the gases are distinguishable — Case 1 — that entropy increases.

    This immediately creates problems for Lambert and the dispersalists. I’ll address those in my next comment.

  20. You might ask “What’s the problem with taking ‘components’ to mean ‘distinguishable components’?”

    1. In ‘deciding’ whether energy should disperse, the laws of physics don’t ‘pay attention’ to whether the gases are distinguishable, any more than they pay attention to the numbers on a pair of colliding billiard balls. Physical energy dispersal does not care about distinguishability.

    2. What does ‘distinguishable’ really mean? Distinguishable to whom? In my thought experiment, Yolanda could distinguish between the two isotopes — the two components — because she had the right equipment. Lacking that equipment, Xavier couldn’t tell the difference. It was all the same ‘component’ to him.

    Lambert offers no principled criterion for determining whether a component is ‘really’ distinguishable, as opposed to being distinguishable to this or that observer.

    3. If he tried to argue that ‘distinguishable’ just means ‘distinguishable in principle’, then he would run into the ‘Damon the Demon problem’. A Laplacean demon like Damon can not only distinguish one gas from another or one isotope from another — he can distinguish every molecule from every other, because each molecule has a distinct position and momentum. Needless to say, entropy calculations based on the distinguishability of every particle will not give the same results that physicists and chemists get for Case 1 and Case 2.

    The dispersalist view of entropy just doesn’t work.

  21. Finally, note that all of those problems vanish when you adopt the “missing information” view of entropy.

    Distinguishability is observer-dependent, macrostates are observer-dependent, missing information is observer-dependent, and entropy is observer-dependent.

    It also nicely solves DNA_Jock’s puzzle for Sal:

    Let’s repeat this experiment a million times, but with balls that are dark red and light red. But in each of the experiments, we vary the hue of the balls ever so slightly.
    The entropy of mixing does not change at all, it is always 10.85 J/K, except when the balls are exactly the same color. At this one moment, the entropy of mixing suddenly jumps to ZERO.
    The question I have been asking you repeatedly, and you have (as usual) not even tried to answer, is WHY is this the case under your “spreading” definition? WHY is there this sudden plummet in the mixing entropy when we go from red balls with ever-so-slightly darker red balls to indistinguishable balls. You’ll notice that it follows automatically from keiths’s definition. Your definition seems sadly lacking in this regard.

    And note that the entropy doesn’t vanish only when the hues are exactly the same. It also vanishes when the hues are too close to be distinguished by the observer in question. That can vary from observer to observer.

    To take an extreme case, a blind observer (who detects the balls by means other than vision) won’t be able to distinguish them regardless of hue. To such an observer, the entropy of the system never increases.

    The underlying physics is always the same. Any physical energy dispersal is always the same. The differences are in distinguishability, and distinguishability is observer-relative.

    Entropy is a measure of missing information, not of energy dispersal.
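
    A rough numerical sketch of that distinguishability point (assuming the standard ideal-gas treatment and, purely for illustration, one mole of gas on each side of the partition; the 10.85 J/K in DNA_Jock’s ball example presumably reflects his particular setup):

    ```python
    import math

    R = 8.314  # gas constant, J/(mol*K)

    def mixing_entropy(n1, n2, distinguishable):
        """Ideal-gas entropy of mixing for n1 and n2 moles at the same T and P.
        If the gases are indistinguishable, removing the partition changes
        nothing macroscopically and the mixing entropy is zero."""
        if not distinguishable:
            return 0.0
        n = n1 + n2
        x1, x2 = n1 / n, n2 / n
        return -n * R * (x1 * math.log(x1) + x2 * math.log(x2))

    print(mixing_entropy(1.0, 1.0, distinguishable=True))   # ~11.5 J/K (Case 1)
    print(mixing_entropy(1.0, 1.0, distinguishable=False))  # 0.0      (Case 2)
    ```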

  22. walto,

    In light of all of the above, do you see your mistake now? Dispersalism just isn’t tenable.

  23. keiths:
    walto,

    In light of all of the above, do you see your mistake now? Dispersalism just isn’t tenable.

    Huh?

    Was any of that supposed to be responsive to any of my posts? Did you answer any of my questions? Were you addressing someone else maybe? Or are you just off in your own world again?

    Also, do you now see that your 5 and 6 are question-begging? If not, I’m not sure there’s much sense in us continuing. That’s a pretty elementary logical point.

  24. walto,

    You and Sal, following the lead of Lambert, Elzinga, et al, have embraced dispersalism. I’ve shown you why dispersalism cannot be correct. You’ve been unable to refute my arguments.

    Given that, why are you still a dispersalist?

    Whether I’ve answered any particular question of yours does not matter unless you can refute my arguments against dispersalism.

    I haven’t seen any evidence that you can. So again, why are you still a dispersalist? Why not change your position in response to evidence and argument?

  25. Points 5 and 6 are not question begging, walto.

    Everyone, including Lambert and the dispersalists, accepts the Boltzmann and Gibbs equations. They also accept the relevance of distinguishability. I’ve explained this to you already.

    And even if those points actually were question begging, you haven’t refuted the other four, any one of which is sufficient to defeat dispersalism.

    Why are you still a dispersalist?

  26. Joe Felsenstein:

    Is there a case where the one decreases but the other increases?

    If you’re talking about order vs. entropy, yes.

    From:

    Entropy and Disorder from a creationist book endorsed by a Nobel Laureate

    “We noted earlier that entropy can be correlated-but not identified-with disorder. And we said, moreover, that this correlation is valid in only three cases-ideal gases, isotope mixtures, and crystals near zero degrees Kelvin. The truth of the matter is illustrated by considering the two chemically inert gases, helium, and argon.(7) In our mind’s eye we imagine two balloons, one filled with helium and the other with argon. First, we lower the temperature of both balloons to zero degrees Kelvin. This makes all the gas molecules stop moving in either balloon. Next, we get the molecules moving by heating both balloons to 300 degrees Kelvin (room temperature). Were we to mathematically calculate the increase in entropy, we would find that it was 20 percent higher in the argon balloon than in the helium balloon (154 v. 127 joules per mole per degree Kelvin). But since helium molecules are ten times lighter than argon molecules, they are moving three times faster and thus are more disordered. Here, then, is an example where higher entropy is accompanied by lower disorder, thereby demonstrating that we cannot identify one with the other.
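
    The helium and argon figures in that passage can be checked against the Sackur–Tetrode equation for the molar entropy of a monatomic ideal gas. A minimal sketch (assuming 298 K and 1 bar, which reproduces essentially the quoted 127 and 154 joules per mole per kelvin):

    ```python
    import math

    K_B = 1.380649e-23    # Boltzmann's constant, J/K
    H   = 6.62607015e-34  # Planck's constant, J*s
    R   = 8.314           # gas constant, J/(mol*K)
    AMU = 1.66053907e-27  # atomic mass unit, kg

    def sackur_tetrode(mass_amu, T=298.15, P=1.0e5):
        """Molar entropy of a monatomic ideal gas (Sackur-Tetrode equation)."""
        m = mass_amu * AMU
        v = K_B * T / P                                   # volume per molecule
        lam3 = (H**2 / (2 * math.pi * m * K_B * T))**1.5  # thermal de Broglie wavelength, cubed
        return R * (math.log(v / lam3) + 2.5)

    print(sackur_tetrode(4.0026))   # helium, ~126 J/(mol*K)
    print(sackur_tetrode(39.948))   # argon,  ~155 J/(mol*K)
    ```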

    Alan Fox asked “what is entropy”:

    For the fixed volume (isochoric case), entropy is a bit of an abstract mathematical concept describing the energy dispersed in a system at a given temperature.

    Example: it takes 6660 joules of energy to melt a 20 g ice cube at 273 kelvin, so the change in entropy is:

    delta-S = delta-q/T = 6660 J / 273 K = 24.4 J/K

    Simple!

    We dispersed 6660 Joules at 273 Kelvin into an ice cube.

    Chemists use entropy change to estimate how likely a chemical reaction is to go forward.

  27. Below is a basic example of a chemistry student calculating the entropy change for a chemical reaction; the entropy change is then used to compute another number (delta-G) that determines whether the reaction tends to be spontaneous.

    This is tedious, but I show it in contrast to the ignorance/woo approach to entropy.

    https://www.chem.tamu.edu/class/majors/tutorialnotefiles/gibbs.htm

    Entropy thus has some relevance to the origin-of-life question and to how spontaneously life might generate itself from various chemical contexts.
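
    A minimal sketch of the kind of delta-G bookkeeping that tutorial walks through (the reaction values below are hypothetical placeholders, not taken from the linked page):

    ```python
    def gibbs_free_energy(delta_H, delta_S, T):
        """delta-G = delta-H - T * delta-S; a negative delta-G means the
        reaction tends to be spontaneous at temperature T."""
        return delta_H - T * delta_S

    # Hypothetical example: an exothermic reaction whose entropy decreases.
    dH = -92_000.0  # J/mol
    dS = -200.0     # J/(mol*K)
    for T in (298.0, 500.0, 1000.0):
        dG = gibbs_free_energy(dH, dS, T)
        print(T, dG, "spontaneous" if dG < 0 else "non-spontaneous")
    ```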

  28. In the case of the expanding gas, if the pink gas molecules expanded and were then recompressed into the left chamber, as they originally were, energy would have to be expended to do this, and that creates heat.

    The expansion at the same temperature does not add or subtract heat, but the compression will. Compression reverses the process of expansion, and it is like injecting heat into the gas (the gas gets hotter as it is compressed).

    The Clausius integral is Q-reversible!

    So Keiths complained where is the energy dispersal in gas expansion. I gave several answers, but in terms of Clausius we look at the reversible heat energy (the heat energy involved in reversing the process).

    To keep the process isothermal during compression, we have to keep dispersing heat out of the gas. So the entropy change is negative during isothermal recompression, which implies that the expansion has an entropy change of the same magnitude but with a positive sign.

    So again the energy dispersal view is shown to be a correct description.
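
    A minimal sketch of that sign symmetry (assuming a reversible isothermal ideal-gas process, for which q_rev = nRT ln(V2/V1) and therefore delta-S = q_rev/T = nR ln(V2/V1)):

    ```python
    import math

    R = 8.314  # gas constant, J/(mol*K)

    def isothermal_entropy_change(n, V1, V2):
        """delta-S = n R ln(V2/V1) for a reversible isothermal ideal-gas volume
        change; expansion gives a positive value, recompression the negative."""
        return n * R * math.log(V2 / V1)

    print(isothermal_entropy_change(1.0, 1.0, 2.0))  # expansion:     +5.76 J/K
    print(isothermal_entropy_change(1.0, 2.0, 1.0))  # recompression: -5.76 J/K
    ```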

  29. Sal,

    So Keiths complained where is the energy dispersal in gas expansion.

    No, I didn’t.

    If you want to know what I said, read my comments. Your paraphrases are hopeless.

  30. Sal,

    So again the energy dispersal view is shown to be a correct description.

    No. As I explained earlier, you’re mistaking correlation for identity:

    The error of the dispersalists is that they’ve mistaken correlation for identity. There is a strong correlation: entropy increases are often accompanied by energy dispersal. The dispersalists have leaped to the conclusion that entropy is energy dispersal.

    It isn’t, for the reasons given above.

    If dispersalism were actually correct, you would be able to refute my six points, explain why Yolanda and Xavier calculate different entropy values in my thought experiment, and give a coherent dispersalist answer to DNA_Jock’s question about distinguishability.

    You’ve done none of those things, and neither has walto.

  31. Mung: Entropy is a special case of the Shannon measure of information.

    OK.

    Let me ask a couple of questions. A stream of digital signals (or coin tosses) represent 0 or 1. So a continuous stream of 1s or 0s contains no information? They can also be compressed to a brief verbal description. They have no Shannon entropy? A truly random, non-repeating sequence of digital signals has maximum entropy? It cannot be compressed so the map has to be the territory. A digital output from a continuously monitoring camera set up to move and capture an ever-changing view will also be unpredictable and incompressible. The random sequence has no information while the video output can be displayed as an image on a screen. Yet the Shannon entropy is the same, no?

    Final question for the moment:

    What is the relationship between Clausius entropy, Boltzmann entropy and Shannon entropy? Superficial or fundamental?

    Anyone can answer!

  32. I see there are some responses to me upthread, for which thanks. RL means no time till later in the week to respond.

  33. keiths: There is a strong correlation: entropy increases are often accompanied by energy dispersal

    How do you account for that strong correlation?

  34. Alan Fox: Let me ask a couple of questions. A stream of digital signals (or coin tosses) represent 0 or 1. So a continuous stream of 1s or 0s contains no information? They can also be compressed to a brief verbal description. They have no Shannon entropy? A truly random, non-repeating sequence of digital signals has maximum entropy? It cannot be compressed so the map has to be the territory. A digital output from a continuously monitoring camera set up to move and capture an ever-changing view will also be unpredictable and incompressible. The random sequence has no information while the video output can be displayed as an image on a screen. Yet the Shannon entropy is the same, no?

    Final question for the moment:

    What is the relationship between Clausius entropy, Boltzmann entropy and Shannon entropy? Superficial or fundamental?

    Excellent questions, Alan. Basically what I’ve been asking, much more inarticulately, this entire thread.

    The responses I have received have been largely of this form:

    “Entropy” has but one meaning, viz. lack of information of the microstates. Anybody who says that anything else could be entropy is simply wrong.

    And then I have gotten a lot of arguments (ranging from OK by my lights to pretty obviously question-begging) about why it can’t be a measurement of the dispersal of energy.

    I’ve been asked (begged really) to either admit that the ignorance definition above is correct or defend the dispersal view. And then I’ve also been asked by posters to butt out because two guys who have criticized Sal and me seem like they might have taken a graduate course or two in this subject. (I note, however, that those two critics seem not to agree with each other on these issues anymore.)

    Joe, too, has asked why the term can’t be, as it seems to be, equivocal. After all, it does not seem like the chemistry students in Sal’s example are measuring quantities of ignorance, and it also seems a reductio of that position that it entails that for one who knows everything, the entropy of every macrostate is zero. But Joe too, has been treated only with arguments against the dispersal theory.

    In other words, an interesting question has been turned into the typical TSZ shit show, even though, in this thread at least, there has been pretty much zero God plumping.

    Anyhow, your questions above are extremely pertinent, both to the “more than one meaning” suggestion, and to questions about how the actually USED equations can follow from Boltzmann’s equation without some of the physical science concepts having been pumped in somewhere. I’m not sure, but Jock, who had suggested an arrow of time interpretation, may now see some of the sleight of hand that is necessary to insist that entropy can be NOTHING BUT an assessment of missing information from one’s perspective.

  35. stcordova: Alan Fox asked “what is entropy”:

    For the fixed volume (isochoric case), entropy is a bit of an abstract mathematical concept describing the energy dispersed in a system at a given temperature.

    It would appear then, that what entropy is changes depending on the individual case. According to Sal.

    Or, it’s a bit of an abstract mathematical concept.

    Does that help, Alan?

  36. Salvador can do entropy calculations. Therefore, Salvador must know what entropy is. Well, no. That doesn’t at all follow.

    Then, when someone gives him entropy calculations to perform that conflict with his dispersal thesis, he can’t do the calculations. Does anyone else find that odd?

  37. The dispersion process is reversible? What happens to entropy when energy “undisperses”?

  38. Mung,

    Mung, my sense is that Sal’s definition is shifting because the concept is both ambiguous and, in some sense, determined by its history and Boltzmann “ownership”, but he’s been trying to produce a single definition that handles all the cases. OTOH, keiths’ is absolutely unshifting only because he refuses to consider scenarios in which it makes no sense, or the possibility that it is only specific, entirely non-observer-dependent information that matters if we settle on an ignorance theory. There are good reasons for his refusal to answer a number of the questions he’s been asked. He’s a smart guy, and doesn’t care about the issues nearly as much as he does about being right.

  39. Alan asked some questions a few comments upthread.

    A stream of digital signals (or coin tosses) represent 0 or 1. So a continuous stream of 1s or 0s contains no information?

    Whether or not the sequence conveys information is not germane to the mathematical theory. People often confuse colloquial information with Shannon’s measure of information. Shannon’s measure does not measure colloquial information. It is “a measure of information” in a restricted, mathematical sense.

    So when we talk about entropy and speak in terms of missing information we are not talking about the same thing as “my dog ate my paper” and the information in it went missing.

    They can also be compressed to a brief verbal description. They have no Shannon entropy?

    I try not to use the term Shannon entropy, because people confuse that with thermodynamic entropy. They are not the same. As for the Shannon measure, it can be defined on any probability distribution. So if you have something for which probability distribution can be given then there is a Shannon measure of information for it.

    Any “verbal description” is not information in the Shannon sense, and “compression” is also not relevant.

  40. Mung: I try not to use the term Shannon entropy, because people confuse that with thermodynamic entropy.

    Your ally here has insisted all thread that “entropy” has but one meaning, viz., missing information.

  41. walto: Mung, my sense is that …

    I do think keiths and DNA_jock have raised serious enough questions to doubt the “energy dispersal” description of what entropy measures. That along with my quotes from Ben-Naim.

    My main interest has been in showing that Sal is also wrong in the OP when he claims there is no information aspect to entropy. Aside from that I’ve also been trying to show why entropy is not subjective.

    I’m still a bit puzzled about how something that is “observer-dependent” is not subjective. I wonder if Einstein struggled with that. Perhaps the way to think of it is that the equations work for any observer. Perhaps it would be better to say “information-dependent” rather than observer-dependent.

    It hasn’t been quite clear to me what thermodynamic system it is that the three are looking at. To me, if we are going to speak of entropy, we need to be talking about a specific thermodynamic system for which we can determine the thermodynamic variables. Calculating entropy on the fly for anything and everything like Sal does doesn’t cut it for me. =p

  42. walto: Your ally here has insisted all thread that “entropy” has but one meaning, viz., missing information.

    Entropy may be “the amount of missing information” but it does not follow that all “missing information” is entropy. I hope keiths agrees with that.

    For me entropy has a specific meaning in a specific context, and the context is a thermodynamic system in an equilibrium state. You may be able to calculate the SMI at different points in a process, but that doesn’t mean you’re measuring the entropy.

  43. Alan Fox: A truly random, non-repeating sequence of digital signals has maximum entropy?

    In the Shannon sense of entropy, yes. In the thermodynamic sense, no. What connects the two concepts is the probability distribution. When you have a uniform probability distribution you have the maximum uncertainty. This is true whether you’re tossing coins, or dice, or identifying the location/momentum of particles in a thermodynamic system.
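
    A minimal sketch of that point (assuming the standard Shannon measure H = -sum(p * log2 p) over a discrete distribution): the uniform distribution gives the largest value, and any departure from uniformity lowers it.

    ```python
    import math

    def shannon_measure(probs):
        """Shannon measure of information, in bits: H = -sum(p * log2(p))."""
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(shannon_measure([0.25, 0.25, 0.25, 0.25]))  # uniform: 2.0 bits (maximum for 4 outcomes)
    print(shannon_measure([0.7, 0.1, 0.1, 0.1]))      # skewed: ~1.36 bits
    print(shannon_measure([1.0, 0.0, 0.0, 0.0]))      # certain outcome: 0 bits
    ```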

  44. walto: Your ally here has insisted all thread that “entropy” has but one meaning, viz., missing information.

    My understanding is that the missing information covers all cases.

    It seems to me that as scientific definitions get more precise and less ambiguous, they become less intuitive, less amenable to metaphor, and more difficult to understand without some advanced math.

  45. Mung: Entropy may be “the amount of missing information” but it does not follow that all “missing information” is entropy.

    Suppose (this is just a hypothetical!) that the only relevant missing information were to involve energy dispersal with respect to a non-observer-relative perspective. What I mean is that the amount that can be missing from any perspective is defined in such a way that it must, e.g., vary directly with temperature. And the definition entails that there can be no perspective according to which the missing information can be zero unless certain physical properties obtain. (That’s to eliminate the Damon reductio.)

    If that were the case, wouldn’t this whole brouhaha just be a quibble? Is it strictly a measure of missing info? Yes. Is the measurement also necessarily related to energy dispersal? Yes.

  46. Mung:

    I try not to use the term Shannon entropy, because people confuse that with thermodynamic entropy. They are not the same.

    walto:

    Your ally here has insisted all thread that “entropy” has but one meaning, viz., missing information.

    I’ve been much more precise than that. For example:

    Entropy is a measure of missing information — the gap between the information associated with the macrostate and the information associated with the microstate.

    And:

    The missing information is the gap between what you know about the system — the macrostate — and its actual microstate at the moment. The size of that gap is the entropy, and so what the Second Law says is that the gap will (almost) always either remain the same or increase in size, with enormous probability.

    And:

    The missing knowledge is the difference between knowing the macrostate and knowing the microstate. When the temperature is absolute zero, there is no difference, and thus no missing knowledge, because there is only one possible microstate for that macrostate.

    …and so on, throughout the thread.

  47. walto,

    Suppose (this is just a hypothetical!) that the only relevant missing information were to involve energy dispersal with respect to a non-observer-relative perspective. What I mean is that the amount that can be missing from any perspective is defined in such a way that it must, e.g., vary directly with temperature.

    You’re asking for the impossible: a missing-information-based definition of entropy in which every potential observer, actual or hypothetical, possesses the same incomplete information about the system’s detailed state — no more, and no less.

    That will never be the case as long as observers have the option of taking — or not taking — particular measurements. Suppose everyone knows the temperature and volume of a system containing an ideal gas, except for one observer who declines to measure the volume. S/he will have less information than the other observers.

    And the definition entails that there can be no perspective according to which the missing information can be zero unless certain physical properties obtain. (That’s to eliminate the Damon reductio.)

    There is no “Damon reductio”. That Damon sees an entropy of zero is perfectly reasonable and not absurd at all, and it doesn’t engender a Second Law violation, contrary to your earlier assertion.

  48. petrushka,

    My understanding is that the missing information [interpretation] covers all cases.

    I can’t think of a counterexample, and Sal and walto certainly haven’t come up with any.

    It’s odd. They’ve been shown that the dispersal interpretation of entropy cannot be correct. Entropy has the wrong units, it doesn’t covary with the energy dispersal, energy dispersal can’t be determined from the entropy alone, and so on. They’ve also been given a specific example in which the dispersal interpretation fails.

    They face all of that evidence against dispersalism, but they cannot come up with any evidence against the missing information interpretation. They also haven’t presented a single case in which the missing information interpretation fails.

    Yet both of them are still dispersalists. It’s profoundly irrational.
