In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem with the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue as he criticizes ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Compare: shoe size is correlated with reading ability (small shoe size suggests we’re dealing with a toddler, and therefore a non-reader), yet shoe size is not a measure of reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or Ludwig Boltzmann himself, a founder of statistical mechanics, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger’s and Moran’s books aren’t on that list. Their books, however, did make the list of biochemistry books judged by some Australian researchers to be decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates
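
For the numerically inclined, here is a minimal Python sketch of what that formula computes (toy numbers, purely for illustration; “log” here is the natural logarithm):

import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    # S = k ln W, where W is the number of microstates
    # consistent with the macrostate
    return K_B * math.log(W)

print(boltzmann_entropy(1e20))  # a hypothetical microstate count; gives ~6.4e-22 J/K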

Also there is Clausius:

delta-S = Integral (dq_rev / T)

where
delta-S = change in entropy
dq_rev = inexact differential of heat transferred reversibly
T = absolute temperature
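
For a concrete number, here is the standard constant-heat-capacity case (illustrative values only): reversibly heating an object of heat capacity C from T1 to T2 gives delta-S = C ln(T2/T1). Notice that nothing in the calculation involves order, disorder, or information; only heat and temperature appear.

import math

def entropy_change_heating(C, T1, T2):
    # Clausius entropy change for reversible heating at constant
    # heat capacity C: delta-S = Integral(dq_rev/T) = C ln(T2/T1)
    return C * math.log(T2 / T1)

# Roughly 1 kg of liquid water (C ~ 4184 J/K) heated from 300 K to 350 K:
print(entropy_change_heating(4184.0, 300.0, 350.0))  # ~645 J/K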

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. keiths: The deep irony is that Lambert, whose mission is to root out the “entropy is disorder” misconception, has replaced it with another misconception.

    What we ought to do is show that entropy is a measure of disorder, at least as much so as it is a measure of spreading, lol.

    We need to have someone defend entropy is a measure of disorder! I bet I can find lots of texts that say it is.

  2. keiths:
    walto:

    Emphatically no.

    The thermodynamic entropy of the chemists is the same as the thermodynamic entropy of the “missing informationists”, and it’s governed by the same equations. There’s no need for a new name.

    The problem is not that two concepts are “competing” for the same name. It’s simply that Lambert, Elzinga, Sal, and everyone else who buys into the idea that entropy is a measure of energy dispersal is mistaken.

    The deep irony is that Lambert, whose mission is to root out the “entropy is disorder” misconception, has replaced it with another misconception.

    Can you answer any of my last few questions to you? Thanks. (I’ve given up on the earlier ones.)

  3. walto: Can you explain the necessary connection between missing information and the expectation that in isolated systems entropy is always increasing? Why should that be the case? Or is the 2nd Law just wrong?

    No, the second law is not wrong.

    As you move from a non-uniform distribution to a uniform distribution, it takes more yes/no questions to pin down the actual state (in other words, the missing information is increasing). The “entropy” is maximized when all states are equally probable (a uniform distribution).
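
    To make that concrete, here is a minimal sketch (toy probabilities, not taken from the simulations linked below) of the missing information, counted in yes/no questions (bits), for a non-uniform versus a uniform distribution over four states:

    import math

    def missing_information_bits(probs):
        # Shannon entropy in bits: the average number of yes/no
        # questions needed to pin down the actual state
        return -sum(p * math.log2(p) for p in probs if p > 0)

    print(missing_information_bits([0.7, 0.1, 0.1, 0.1]))      # ~1.36 bits (non-uniform)
    print(missing_information_bits([0.25, 0.25, 0.25, 0.25]))  # 2.0 bits (uniform, the maximum)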

    Check out the simulations here:

    http://ariehbennaim.com/books/discover.html

    ETA: more simulations

    http://ariehbennaim.com/simulations/index.htm

  4. The error of the dispersalists is that they’ve mistaken correlation for identity. There is a strong correlation: entropy increases are often accompanied by energy dispersal. The dispersalists have leaped to the conclusion that entropy is energy dispersal.

    It’s not, and now may be a good time to review some of the reasons:

    1. Entropy has the wrong units. Dispersalists and informationists agree that the units of thermodynamic entropy are joules per kelvin (in the SI system) and bits or nats in any system of units where temperature is defined in terms of energy per particle. It’s just that the dispersalists fail to notice that those are the wrong units for expressing energy dispersal.

    2. You cannot figure out the change in energy dispersal from the change in entropy alone. If entropy were a measure of energy dispersal, you’d be able to do that.

    3. The exact same ΔS (change in entropy) value can correspond to different ΔD (change in dispersal) values. They aren’t the same thing. Entropy is not a measure of energy dispersal.

    4. Entropy can change when there is no change in energy dispersal at all. We’ve talked about a simple mixing case where this happens. If entropy changes in a case where energy dispersal does not change, they aren’t the same thing.

    5. Entropy change in the gas mixing case depends on the distinguishability of particles — the fact that the observer can tell ideal gas A from ideal gas B. Yet the underlying physics does not “care” about distinguishability — the motion of the particles is the same whether or not they are distinguishable. If the motion of the particles is the same, then energy dispersal is the same.

    The entropy change depends on distinguishability, so it cannot be a measure of energy dispersal.
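
    To put a rough number on point 5, here is the standard ideal-gas mixing result sketched with illustrative values (not a quote from anyone above): the entropy of mixing two distinguishable ideal gases is delta-S = -nR(xA ln xA + xB ln xB), and it is exactly zero if the “two” gases are identical, even though the particle motion, and hence any energy dispersal, is the same in both cases.

    import math

    R = 8.314  # gas constant, J/(mol K)

    def mixing_entropy(n_total, x_a):
        # Entropy of mixing for two distinguishable ideal gases with
        # mole fractions x_a and x_b = 1 - x_a
        x_b = 1.0 - x_a
        return -n_total * R * (x_a * math.log(x_a) + x_b * math.log(x_b))

    print(mixing_entropy(2.0, 0.5))  # ~11.5 J/K for 1 mol of A mixed with 1 mol of B
    # If A and B are the same gas, there is no mixing and delta-S = 0.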

  5. And in case it isn’t obvious, none of the five issues above are a problem when entropy is viewed as a measure of missing information.

  6. Another one for the list:

    6. Entropy depends on the choice of macrostate. Energy dispersal does not.

    The way energy disperses in a system is dependent on the sequence of microstates it “visits” in the phase space. That sequence depends only on the physics of the system, not on the choice of macrostate by the observer.

    In the Xavier/Yolanda example, Yolanda possesses equipment that allows her to distinguish between the two isotopes of gas X, which we called X0 and X1.

    If she chooses to use her equipment, she comes up with a different macrostate than Xavier and calculates a different entropy value. If she declines to use the equipment, she and Xavier come up with the same macrostate and get the same entropy value. Her choice has no effect on the actual physics, and thus no effect on the actual energy dispersal. Yet her choice does affect the entropy value she measures, and profoundly so.

    She is not measuring energy dispersal. She is measuring the information gap between her macrostate and the actual microstate.

  7. I wonder why Sal thinks “spreading” is better than “disorder.” Does the energy spread out in a more orderly way in spreading? How is it that the spreading out does not contribute to the disorder, such that as one increases so does the other?

    What’s the difference?

  8. walto,

    keiths asks, which is the REAL chance of such-and-such happening, if it is not that calculated by Xavier, Yolanda, or Damon.

    No, I pointed out that Xavier, Yolanda, and Damon the Demon each calculate a different value of entropy despite observing the same system. That makes perfect sense on the “entropy as missing information” view, because each of the three is missing a different amount of information about the system’s microstate. It makes no sense at all on the “entropy as energy dispersal” view, because energy dispersal is determined by the physics, not by the observers, and is thus the same for all three.

    If there is a single correct value for the entropy, then at least two of the observers must be wrong. You seem to think that all three are wrong: you reject Xavier’s and Yolanda’s values because they are in worse “epistemic positions” than Damon; yet you also reject Damon’s value of zero, despite the fact that he is in a perfect epistemic position.

    What is the single correct value, then, and how do you go about determining it?

    My answer: there is no single correct value. Xavier, Yolanda, and Damon all calculate the entropy correctly. It is different for each of them because each of them lacks a different amount of information about the system’s microstate. That is, the information gap between macrostate and microstate differs for each of them.

  9. The second descriptor of entropy, “spreading,” was probably suggested by Guggenheim (1949). Guggenheim started with the Boltzmann definition of entropy in the form

    S(E) = k log Omega(E)

    where k is a constant and “Omega(E)” denotes the number of accessible independent quantum states of energy E for a closed system.” This is the correct definition of Omega.

    Later in his article, Guggenheim discusses the process of heat flow in which an increase in entropy represents “an increase in the number of states, more briefly an increase of accessibility or spread.”

    The first part of the quotation is correct, namely the change in entropy in the spontaneous process of heat flow represents an increase in the number of accessible states. The most important word in this sentence is “number.” In the second part of the sentence, this word is deleted and what remains is “increase of accessibility or spread.” Obviously, this is a deficient description of the change in Omega.

    – Ben-Naim (2012).

  10. walto,

    Can you explain the necessary connection between missing information and the expectation that in isolated systems entropy is always increasing? Why should that be the case? Or is the 2nd Law just wrong?

    Sure. It’s a consequence of physics together with probability.

    First, let me note that the Second Law, properly stated, is a probabilistic law. It doesn’t actually forbid entropy from decreasing; it’s just that decreases in entropy are astronomically unlikely in all but the tiniest systems.

    The laws of physics are reversible, as far as we know, so if you could get an isolated system into exactly the right initial state, it could happily shed entropy for years. It’s just enormously unlikely that such an initial state would ever occur by chance.

    Now back to the missing information. The missing information is the gap between what you know about the system — the macrostate — and its actual microstate at the moment. The size of that gap is the entropy, and what the Second Law says is that, with enormous probability, the gap will either remain the same or increase in size.

    Why does the gap increase? We can get into more detail if needed, but the quick answer is that as the system evolves (from our epistemically limited point of view), it goes through a sequence of macrostates, and the later macrostates correspond to more microstates than the earlier ones.

    Toy example:

    A system evolves through macrostates M1, M2, and M3, corresponding to 2, 24, and 307 microstates respectively. In M1 we can pin the microstate down pretty well: it has to be one or the other of the two possible microstates. In M2 we can only pin it down to one of 24, and in M3 it’s one of 307. The information gap between the macrostate and the microstate has increased over time. That’s the second law in action.
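
    In “missing information” terms, with the same toy numbers: pinning down the microstate takes log2(W) yes/no questions, so the gap grows from 1 bit to roughly 4.6 bits to roughly 8.3 bits as the system moves from M1 to M3.

    import math

    for label, W in [("M1", 2), ("M2", 24), ("M3", 307)]:
        # log2(W) = bits needed to single out the actual microstate
        print(label, round(math.log2(W), 2), "bits")  # M1 1.0, M2 4.58, M3 8.26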

  11. keiths:
    walto,

    No, I pointed out that Xavier, Yolanda, and Damon the Demon each calculate a different value of entropy despite observing the same system. That makes perfect sense on the “entropy as missing information” view, because each of the three is missing a different amount of information about the system’s microstate. It makes no sense at all on the “entropy as energy dispersal” view, because energy dispersal is determined by the physics, not by the observers, and is thus the same for all three.

    If there is a single correct value for the entropy, then at least two of the observers must be wrong. You seem to think that all three are wrong: you reject Xavier’s and Yolanda’s values because they are in worse “epistemic positions” than Damon; yet you also reject Damon’s value of zero, despite the fact that he is in a perfect epistemic position.

    What is the single correct value, then, and how do you go about determining it?

    My answer: there is no single correct value. Xavier, Yolanda, and Damon all calculate the entropy correctly. It is different for each of them because each of them lacks a different amount of information about the system’s microstate. That is, the information gap between macrostate and microstate differs for each of them.

    That’s pretty much what I said. Except for your opening No.

    I defended that picture of the Boltzmann concept–in spite of your weird reluctance to answer any of my last dozen or so questions.

  12. keiths: A system evolves through macrostates M1, M2, and M3, corresponding to 2, 24, and 307 microstates respectively. In M1 we can pin the microstate down pretty well: it has to be one or the other of the two possible microstates. In M2 we can only pin it down to one of 24, and in M3 it’s one of 307.

    Why does M3 correspond to 307 microstates rather than something like 17? Is there a reason for this or is it just how it is?

    peace

  13. colewd:

    I personally am confused by the subject and hope this discussion will yield more clarity. I think discussing the third law which describes entropy at absolute zero may help.

    Let’s revisit our earlier exchange.

    colewd:

    When does entropy equal zero?

    keiths:

    When the observer has complete knowledge of the microstate (see the example of Damon the Demon, earlier in the thread). In that case, there is only one possible microstate, and so W = 1.

    Plugging into Boltzmann’s formula, you get

    S = k ln W = k ln (1) = 0.

    colewd:

    I would answer at a temperature of absolute zero.

    It amounts to the same thing. There’s only one possible microstate at absolute zero. (In classical thermodynamics, anyway, and ignoring “frustrated materials”. Google the latter if you care. I’m not sure about quantum thermodynamics, where there’s residual motion in atoms even at absolute zero. Does that count as multiple microstates for entropy purposes? I don’t know the answer. If it does, that would suggest that entropy can never be zero in quantum thermo. On the other hand, I’ve read that entropy can actually be negative in quantum systems, so would zero be impossible? I just don’t know. )

    colewd:

    Interesting disconnect, and it may support Einstein’s comments about S = k ln W.

    There’s no disconnect. Entropy becomes zero when you reduce the number of epistemically possible microstates to one. Damon does it by learning the exact microstate that the system is in. Lowering the temperature to absolute zero does it by reducing the number of physically possible microstates to one, so that the number of epistemically possible microstates is one for all observers.

    Either way, ln(1) = 0, so the entropy is zero.

  14. walto,

    That’s pretty much what I said.

    No, you said this:

    keiths asks, which is the REAL chance of such-and-such happening, if it is not that calculated by Xavier, Yolanda, or Damon.

    Entropy is not “the REAL chance of such-and-such happening”. I’m disappointed that you haven’t even learned that from this thread.

  15. keiths: Entropy is not “the REAL chance of such-and-such happening”. I’m disappointed that you haven’t even learned that from this thread.

    I know what I wrote. I’m sorry you didn’t understand that by the REAL chance, I meant some observer-independent probability of the microstate in question obtaining. I’d have thought that would be obvious; especially as I was agreeing with your position (on Boltzmann’s concept) in that post. But alas.

    Now how about answering some of my questions instead of this slipping and jabbing? I know it’s your thing, but still….

  16. keiths, While you’re at it I have another question, re your response to colewd above. Why must something being at a temperature of absolute zero imply anything about missing knowledge? Is the idea that one COULD know that the possibilities are limited by that lack of heat? If that’s all that’s required then I’d think we’d be back with Damon on every estimate. For one COULD know what Damon knows, right?

  17. While the argument continues, let me comment on what role entropy calculations play in arguments against evolution. Basically, none, no matter what definition of entropy anyone uses.

    Granville Sewell’s “X-entropy” equations simply model dispersion of matter of a chemical element or compound. He uses this to argue that concentrations of (say) carbon in a living form violate the predictions of those equations. But the equations have no terms for interactions with other chemicals (nor any for electrostatic attraction or repulsion, or for effects of gravity). So the carbon in his model can’t be oxidized into carbon dioxide or made into sugars by photosynthesis, or have any interesting chemistry happen to it. Light can’t hit plants, photosynthesis can’t make energetic compounds, herbivores can’t eat that, carnivores can’t eat them, and bacteria can’t decompose them. Even plants can’t grow.

    We’ve been over this ground repeatedly. People have pointed out that the evaporation of water from a glass of salt water will result in crystals forming in the bottom of the glass. Except not in Sewell’s equations, which predict that the sodium and the chlorine keep diffusing indefinitely outwards. Basically none of the interesting processes that contribute to evolution are allowed in his equations, or even the processes that contribute to the growth of individuals.

    So no matter how the debate over the definition of entropy turns out, it will not validate creationist second-law arguments.

  18. walto,

    I know what I wrote. I’m sorry you didn’t understand that by the REAL chance, I meant some observer-independent probability of the microstate in question obtaining.

    I know what you meant, and I was trying to correct your confusion. Let me try another way.

    The microstate in question is the actual microstate, and the observer-independent probability of it obtaining is 1. If you insist on an observer-independent entropy, the answer you get is that entropy is zero, all the time, regardless of observer. That’s useless.

    I keep explaining this to you, but it obviously isn’t sinking in. That’s why my question is important:

    If there is a single correct value for the entropy, then at least two of the observers must be wrong. You seem to think that all three are wrong: you reject Xavier’s and Yolanda’s values because they are in worse “epistemic positions” than Damon; yet you also reject Damon’s value of zero, despite the fact that he is in a perfect epistemic position.

    What is the single correct value, then, and how do you go about determining it?

    If you would actually try to answer the question, rather than finding excuses to avoid it, the light might dawn: there is no single correct value of entropy, because there is no single correct macrostate.

  19. OMG. How many times would you like me to agree with you on the question of whether the ‘probability’ of the occurrence of this or that microstate being actual is relative to the observer’s information. Would three more times do it, or do you need a blow job?

    Now. Can you not answer any of my questions, or what?

  20. walto,

    Now how about answering some of my questions instead of this slipping and jabbing? I know it’s your thing, but still….

    I’ve answered a ton of your questions in this thread. I am not obligated to answer every single one. I’m doing my best to help you and other readers understand this stuff, and that goal is not well served if I respond to every question of yours as if it were equally relevant and important.

    Keep in mind that your own confusions and limitations are not universal. I’m trying to make comments that are valuable to other readers as well as you.

    The “angry old man” shtick is getting very old. Could you drop it for the duration of this thread, at least?

    I know you’re frustrated by your confusion and your inability to grasp this stuff, but is lashing out at your teacher really a productive way of dealing with that frustration?

  21. OK. I’m going to infer from that that you can’t answer any of them, and that my inference that this fight is just about who gets the name ‘entropy’ is right–or at least you have no responses to it.

    My questions about the derivability of these equations are obviously key to the truth of my contention. But it’s like your sole interest in these matters is proving this or that person wrong. The actual issues are of no apparent interest to you at all. Weird way to live. But who am I to judge?

  22. walto,

    OMG. How many times would you like me to agree with you on the question of whether the ‘probability’ of the occurrence of this or that microstate being actual is relative to the observer’s information.

    Read your comment again:

    I know what I wrote. I’m sorry you didn’t understand that by the REAL chance, I meant some observer-independent probability of the microstate in question obtaining.

    Which is it, observer-independent or observer-relative? Make up your mind.

    Would three more times do it, or do you need a blow job?

    [emphasis added]

    If you would spend less time giving blowjobs and more time thinking about entropy, you might make some progress.

    I’ve posed a question that you should be able to answer, if your position is actually correct:

    If there is a single correct value for the entropy, then at least two of the observers must be wrong. You seem to think that all three are wrong: you reject Xavier’s and Yolanda’s values because they are in worse “epistemic positions” than Damon; yet you also reject Damon’s value of zero, despite the fact that he is in a perfect epistemic position.

    What is the single correct value, then, and how do you go about determining it?

    If you can’t answer that question — and it’s pretty clear that you can’t — then there is something wrong with your position.

    If you would actually try to answer the question, it might dawn on you that there is no single correct value of entropy because there is no single correct macrostate.

  23. walto,

    keiths, While you’re at it I have another question, re your response to colewd above. Why must something being at a temperature of absolute zero imply anything about missing knowledge?

    The missing knowledge is the difference between knowing the macrostate and knowing the microstate. When the temperature is absolute zero, there is no difference, and thus no missing knowledge, because there is only one possible microstate for that macrostate.

    Is the idea that one COULD know that the possibilities are limited by that lack of heat?

    No, it’s that one does know that the possibilities are limited by the lack of heat. As temperature goes down, the number of possible microstates decreases until there is just one left (with some exceptions, like the “frustrated materials” I mentioned above).

    If that’s all that’s required then I’d think we’d be back with Damon on every estimate. For one COULD know what Damon knows, right?

    Only if you had his superpowers or something comparable. At a given normal temperature there are zillions of possible microstates. Damon knows which one is the actual microstate. You and I know that the actual microstate must be among those zillions, but we don’t know which one it is. There’s missing information, so the entropy is nonzero for us.

    For Damon the entropy is zero, because he knows the exact microstate. He has singled it out from among the zillions.

  24. keiths: No, it’s that one does know that the possibilities are limited by the lack of heat. As temperature goes down, the number of possible microstates decreases until there is just one left (with some exceptions, like the “frustrated materials” I mentioned above).

    Who knows that? Everyone? Has everyone always known that? You must mean that this information is available, that it COULD be, or COULD HAVE BEEN known.

    But, as I mentioned above, so could whatever Damon knows be known by anyone: the info is there. Thus, either the number of microstates is a function of actual knowledge or it is a function of the temperature (which may or may not actually be known), and, thus POTENTIAL knowledge–based on the availability, rather than the reception of the information.
    If the latter, every microstate has the same entropy–whatever it’s possible for Damon to calculate.

    I agree with you that that is wrong. What you don’t get is that it demonstrates a problem for moving from an objective feature of the world to a Boltzmann-type entropy. It suggests there are TWO competing concepts.

  25. walto,

    Wow.

    Wow, what? I’m trying to discuss thermodynamics, and you’re bringing up blowjobs, as if you were Donald Trump. Next thing I know you’ll be grabbing me by the entropy.

    Grow up, walto. Your frustration is due to your own limitations. I am not responsible for those.

  26. fifth,

    Why does M3 correspond to 307 microstates rather than something like 17? Is there a reason for this or is it just how it is?

    Those numbers are made up, of course, but if you’re asking why later macrostates correspond to greater numbers of microstates, it’s because, roughly speaking, microstates are epistemically equiprobable, so as time goes on and the system evolves, you’re more likely to end up in a microstate that corresponds to a macrostate with a large ensemble.

    None of what I’m saying here is specific to the “entropy as missing information” idea. It’s standard thermodynamics, so there’s plenty of relevant stuff on the Web and in books.

  27. Mung,

    But even Lambert says the probability distributions of thermodynamics are objective.

    That’s right. For a given macrostate, the probability distributions are determined solely by the physics.

    The observer-dependence comes in via the choice of macrostate.

  28. Mung: But even Lambert says the probability distributions of thermodynamics are objective.

    https://en.wikipedia.org/wiki/Boltzmann_distribution

    https://en.wikipedia.org/wiki/Maxwell%E2%80%93Boltzmann_distribution

    One of those links says this:

    The distribution shows that states with lower energy will always have a higher probability of being occupied than the states with higher energy.

    So the question seems to me to be how this principle can be consistent with an ignorance theory of entropy. Do coins have two possibilities whatever one knows, or–if we know enough–does the number of possible states drop to one?
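
    For reference, the Boltzmann distribution from the first link is easy to sketch numerically (toy energies and temperature, illustrative only): each probability is proportional to exp(-E/kT), so the lower-energy state always gets the larger share.

    import math

    def boltzmann_probs(energies, T):
        # p_i proportional to exp(-E_i / (k T)), normalized by the partition function
        k = 1.380649e-23  # Boltzmann's constant, J/K
        weights = [math.exp(-E / (k * T)) for E in energies]
        Z = sum(weights)
        return [w / Z for w in weights]

    kT = 1.380649e-23 * 300.0
    print(boltzmann_probs([0.0, kT], 300.0))  # ~[0.73, 0.27]: the lower-energy state is more probable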

  29. walto:

    Why must something being at a temperature of absolute zero imply anything about missing knowledge?

    keiths:

    The missing knowledge is the difference between knowing the macrostate and knowing the microstate. When the temperature is absolute zero, there is no difference, and thus no missing knowledge, because there is only one possible microstate for that macrostate.

    walto:

    Is the idea that one COULD know that the possibilities are limited by that lack of heat?

    keiths:

    No, it’s that one does know that the possibilities are limited by the lack of heat. As temperature goes down, the number of possible microstates decreases until there is just one left (with some exceptions, like the “frustrated materials” I mentioned above).

    walto:

    Who knows that? Everyone? Has everyone always known that?

    Good grief, walto. If the observer doesn’t know physics, or doesn’t at least possess the correct equations and enough savvy to plug in the correct parameters, then of course s/he can’t calculate the entropy correctly.

    To get from the macrostate to the corresponding ensemble of microstates requires physics.

    Either the number of microstates is a function of actual knowledge or it is a function of the temperature (which may or may not actually be known), and, thus POTENTIAL knowledge–based on the availability, rather than the reception of the information.

    That’s a bit garbled, so I’m not sure what you’re claiming. The number of microstates depends on the macrostate chosen. Specifying or measuring the temperature is one way of specifying a macrostate. Specifying or measuring the temperature and the volume is another. Specifying or measuring the temperature and the volume and the isotopic concentrations is another. Determining the exact microstate, as Damon the Demon does, is yet another.

    They’re all valid, and each gives rise to a different entropy value.

    If the latter, every microstate has the same entropy–whatever it’s possible for Damon to calculate.

    As I keep telling you, entropy is a function of the macrostate, not the microstate. That’s really, really important, so please internalize it.

  30. keiths:

    The observer-dependence comes in via the choice of macrostate.

    walto:

    Can you explain what that means?

    That’s what I was getting at here:

    The number of microstates depends on the macrostate chosen. Specifying or measuring the temperature is one way of specifying a macrostate. Specifying or measuring the temperature and the volume is another. Specifying or measuring the temperature and the volume and the isotopic concentrations is another. Determining the exact microstate, as Damon the Demon does, is yet another.

    They’re all valid, and each gives rise to a different entropy value.

  31. Keiths seems eager to go down the deepity woo route of describing entropy.

    I’ve tried to show more practical derivations of entropy for things like an ice cube, a block of copper, a container of gas.

    The ability to derive a quantity for entropy helps analyze how to improve the design of machines that have to convert heat energy to mechanical or electrical energy.

    Here is a blurb from a wiki entry.

    Though the wiki entry is technical, I think it has a better picture of how thermodynamics is used in the real world versus Keiths deepity observer-dependent woo approach.

    https://en.wikipedia.org/wiki/Rankine_cycle

    The Rankine cycle is a model that is used to predict the performance of steam turbine systems. The Rankine cycle is an idealized thermodynamic cycle of a heat engine that converts heat into mechanical work. The heat is supplied externally to a closed loop, which usually uses water as the working fluid. It is named after William John Macquorn Rankine, a Scottish polymath and Glasgow University professor.
    ….
    In an ideal Rankine cycle the pump and turbine would be isentropic, i.e., the pump and turbine would generate no entropy and hence maximize the net work output.
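
    As a small illustration of how an isentropic (constant-entropy) step is used in such analyses (an ideal-gas sketch with illustrative numbers; real Rankine-cycle work relies on steam tables rather than this relation): holding entropy constant in a reversible adiabatic compression gives T2 = T1 * (P2/P1)^((gamma - 1)/gamma).

    def isentropic_outlet_temperature(T1, P1, P2, gamma=1.4):
        # Ideal-gas isentropic relation: T2/T1 = (P2/P1)**((gamma - 1)/gamma)
        return T1 * (P2 / P1) ** ((gamma - 1.0) / gamma)

    # Air at 300 K compressed isentropically from 1 bar to 10 bar:
    print(isentropic_outlet_temperature(300.0, 1.0e5, 1.0e6))  # ~579 K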

  32. Sal,

    The equations of thermodynamics are the same whether you embrace the “energy dispersal” or “missing information” interpretation of entropy. When you write things like this…

    Though the wiki entry is technical, I think it has a better picture of how thermodynamics is used in the real world versus Keiths deepity observer-dependent woo approach.

    …you’re just blowing smoke. An informationist can design engines just as well as a dispersalist. The equations are the same either way.

    The difference between informationists and dispersalists is that dispersalists have a flawed conceptual understanding of entropy.

  33. From earlier in the thread, six reasons why the dispersalists are wrong:

    1. Entropy has the wrong units. Dispersalists and informationists agree that the units of thermodynamic entropy are joules per kelvin (in the SI system) and bits or nats in any system of units where temperature is defined in terms of energy per particle. It’s just that the dispersalists fail to notice that those are the wrong units for expressing energy dispersal.

    2. You cannot figure out the change in energy dispersal from the change in entropy alone. If entropy were a measure of energy dispersal, you’d be able to do that.

    3. The exact same ΔS (change in entropy) value can correspond to different ΔD (change in dispersal) values. They aren’t the same thing. Entropy is not a measure of energy dispersal.

    4. Entropy can change when there is no change in energy dispersal at all. We’ve talked about a simple mixing case where this happens. If entropy changes in a case where energy dispersal does not change, they aren’t the same thing.

    5. Entropy change in the gas mixing case depends on the distinguishability of particles — the fact that the observer can tell ideal gas A from ideal gas B. Yet the underlying physics does not “care” about distinguishability — the motion of the particles is the same whether or not they are distinguishable. If the motion of the particles is the same, then energy dispersal is the same.

    The entropy change depends on distinguishability, so it cannot be a measure of energy dispersal.

    6. Entropy depends on the choice of macrostate. Energy dispersal does not.

    The way energy disperses in a system is dependent on the sequence of microstates it “visits” in the phase space. That sequence depends only on the physics of the system, not on the choice of macrostate by the observer.

    In the Xavier/Yolanda example, Yolanda possesses equipment that allows her to distinguish between the two isotopes of gas X, which we called X0 and X1.

    If she chooses to use her equipment, she comes up with a different macrostate than Xavier and calculates a different entropy value. If she declines to use the equipment, she and Xavier come up with the same macrostate and get the same entropy value. Her choice has no effect on the actual physics, and thus no effect on the actual energy dispersal. Yet her choice does affect the entropy value she measures, and profoundly so.

    She is not measuring energy dispersal. She is measuring the information gap between her macrostate and the actual microstate.

  34. stcordova: Keiths seems eager to go down the deepity woo route of describing entropy.

    LoL. If that were true he’d be trying to calculate the entropy of 500 fair coins, all heads. Or of an airplane, or of a dead rat.

    Or arguing that because particles move about, energy moves about with them, therefore energy is dispersed, therefore entropy is the spreading about of energy, like butter on toast.

  35. stcordova: The ability to derive a quantity for entropy helps analyze how to improve the design of machines that have to convert heat energy to mechanical or electrical energy.

    Yet what you have failed to do is show that the information theory approach makes it impossible to derive a quantity for entropy, or gives the wrong answer when applied. Seems a rather big oversight.

  36. In this book the author approaches statistical mechanics in the uniquely consistent and unified way made possible by information theory – something that has not been done in as much detail or in book form before. Information theory is a considerably later intellectual development than statistical mechanics, but it [information theory] has proved to be very powerful in clarifying the concepts of statistical mechanics and in uniting its various branches to one another and to the rest of physics.

    Katz, Amnon. Principles of Statistical Mechanics: The Information Theory Approach. 1967.

    The criterion for acceptance of new ideas in physics is the experimental verification of new results which they may predict. Any conceptual and philosophical clarification that the new idea may bring about is considered merely as a benefit of secondary importance. For this reason, ideas that predict no new verifiable results, but only offer clarification of previous concepts, are themselves considered of secondary importance. Their acceptance or rejection remains a matter of personal inclination.

    The information theory approach to statistical mechanics has not, to my knowledge, led to any concrete new results. For this reason it has remained “controversial,” the controversy being a matter of preference. Let me say a few words to explain my own bias in this matter.

    Information theory makes statistical mechanics a complete theory by setting up rules for asking any question. It contributes nothing, however, to the technical problem of getting the answer.

    – Amnon Katz

  37. This is also worth repeating:

    The error of the dispersalists is that they’ve mistaken correlation for identity. There is a strong correlation: entropy increases are often accompanied by energy dispersal. The dispersalists have leaped to the conclusion that entropy is energy dispersal.

    It isn’t, for the reasons given above.

  38. Salvador has no basis for mocking anyone over “missing information.” He’s the one that is missing relevant information.

    Katz, Amnon. Principles of Statistical Mechanics: The Information Theory Approach. 1967.

    Chapter 2.3: Missing information when a probability distribution is given.

    Chapter 3: Statistical Mechanics
    Chapter 3.4. Classical statistics
    Chapter 3.5. The missing information
    Chapter 3.7. Quantum statistics
    Chapter 3.8. The missing information

    Chapter 4.3: Maximum missing information – classical
    Chapter 4.4: Maximum missing information – quantum mechanical

  39. Mung,

    Right. Even the “disorderists” get the right answers when they do entropy calculations. The equations are the same for disorderists, dispersalists, and informationists.

    The differences are differences in the interpretation of entropy and in conceptual clarity and consistency, as my six points show.

  40. walto,

    It’s been a pleasure having you in this discussion.

    I’m not really following much anymore what Keiths is saying, so for the mean time I’m going to put him on my ignore list.

    If you have some question for me, I’d be happy to give it my best shot however.

    Take care.

  41. 7. Identical Particles
    7.1. Identical particles
    7.2. A classical ideal gas of identical particles

    Another prominent difference between distinguishable and identical particles appears in the expression for the missing information (entropy) I: (2.8)

    The value of I before the connection was … (2.11) … which is exactly the value of I after the connection. No information was lost. For distinguishable particles we should have … and this differs from the value of I after the connection …

    These results are to be expected. For distinguishable particles there is more information in the statement that particles a = 1,…,N1 are in the subvolume V1 and particles a = N1 + 1,…,N1 + N2 are in the subvolume V2 than in the mere statement that N1 + N2 particles are somewhere in the volume V1 + V2.
    But for identical particles no such distinctions exist. An increase in missing information of the above form deltaI occurs when one mixes gases of different kinds …

    – Amnon Katz

    There you have an information theory (“missing information” ) explanation of why in one case there is a change in “entropy” and in the other there is not.
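
    A crude combinatorial way to see the same point (toy numbers, not Katz’s equations): for distinguishable particles, the statement of exactly which particles sit in which subvolume is worth log2 C(N1+N2, N1) extra bits; for identical particles there is no such statement to make, so removing the partition adds no missing information.

    import math

    def extra_bits_from_labels(n1, n2):
        # Ways to choose which of the n1 + n2 labeled (distinguishable)
        # particles are in subvolume V1; log2 of that count is the extra
        # missing information. For identical particles it is zero.
        return math.log2(math.comb(n1 + n2, n1))

    print(extra_bits_from_labels(50, 50))  # ~96 bits for 50 + 50 distinguishable particles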

  42. stcordova:
    walto,

    It’s been a pleasure having you in this discussion.

    I’m not really following much anymore what Keiths is saying, so for the mean time I’m going to put him on my ignore list.

    If you have some question for me, I’d be happy to give it my best shot however.

    Take care.

    Thanks–I may take you up on that at some point. Old friend Bruce S. sent me some material to read on this issue and I’m sure not to understand it all.

    Anyhow, I learned quite a bit here, but still don’t know much.

  43. Sal,

    I’m not really following much anymore what Keiths is saying, so for the mean time I’m going to put him on my ignore list.

    Heh. Right, and I’m sure it has nothing to do with the fact that I’ve raised six points against dispersalism that you can’t refute.

    Here’s what you were saying yesterday:

    You want some more punishment Keiths, show up here again, because I’m enjoying dishing it out.

    Where has all that bravado gone? It seems to have… dispersed.

  44. stcordova: I’m not really following much anymore what Keiths is saying, so for the mean time I’m going to put him on my ignore list.

    LoL. So now keiths is a troll like Mung is a troll. Therefore ignore.

  45. Mung, after quoting Katz:

    There you have an information theory (“missing information” ) explanation of why in one case there is a change in “entropy” and in the other there is not.

    Yep. It makes perfect sense in terms of missing information, but no sense at all in terms of the energy dispersal, which remains exactly the same whether the particles are distinguishable or not.
