In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem with the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully Dan Styer almost comes to the rescue as he criticized ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (a small shoe size suggests a toddler, and therefore a non-reader), but shoe size is not reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain how Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger’s and Moran’s books aren’t on that list. Their books, however, did make the list of biochem books judged by some Australians as decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates
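
For readers who want to see the formula in action, here is a minimal numerical sketch (my own illustration, not from any textbook cited above); the six-sided die and the two-state toy system are made up purely for illustration, and the log is the natural logarithm.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """S = k ln W, where W is the number of microstates (log is the natural log)."""
    return k_B * math.log(W)

# A six-sided die with 6 equally probable "microstates":
print(boltzmann_entropy(6))          # ~2.47e-23 J/K

# A toy system of N independent two-state units has W = 2**N microstates,
# so S = N * k_B * ln(2); computed directly to avoid forming the huge integer 2**N.
N = 1_000_000
print(N * k_B * math.log(2))         # ~9.6e-18 J/K
```

Nothing about order or disorder appears anywhere in the calculation; only a count of microstates does.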

Also there is Clausius:

ΔS = ∫ dq_rev/T  (the integral taken over a reversible path)

where
ΔS = change in entropy
dq_rev = inexact differential of q (heat) exchanged reversibly
T = absolute temperature
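
Likewise for the Clausius form, here is a hedged sketch of a simple case: reversible heating of a sample whose specific heat is assumed constant over the temperature range, so the integral reduces to ΔS = m·c·ln(T2/T1). The copper value of roughly 0.385 J/(g·K) is a typical room-temperature figure, not a constant, and the numbers are illustrative only.

```python
import math

def clausius_delta_S(mass_g, c_J_per_gK, T1_K, T2_K):
    """ΔS = ∫ dq_rev/T with dq_rev = m*c*dT, assuming c is constant over [T1, T2]."""
    return mass_g * c_J_per_gK * math.log(T2_K / T1_K)

# Illustrative only: heating 1555 g of copper from 298.15 K to 373.15 K
dS = clausius_delta_S(1555.0, 0.385, 298.15, 373.15)
print(f"dS ≈ {dS:.0f} J/K")  # about 134 J/K for this assumed constant specific heat
```

Again, order and disorder play no role; only heat and temperature do.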

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. That’s all right. I’ve ignored Sal for months now. Just scroll past. Haven’t missed anything, judging from the responses to his drivel.

  2. Joe,

    Granville Sewell’s “X-entropy” equations simply model dispersion of matter of a chemical element or compound. He uses this to argue that concentrations of (say) carbon in a living form violate the predictions of those equations. But the equations have no terms for interactions with other chemicals (nor any for electrostatic attraction or repulsion, or for effects of gravity). So the carbon in his model can’t be oxidized into carbon dioxide or made into sugars by photosynthesis, or have any interesting chemistry happen to it. Light can’t hit plants, photosynthesis can’t make energetic compounds, herbivores can’t eat that, carnivores can’t eat them, and bacteria can’t decompose them. Even plants can’t grow.

    It’s hard to overstate just how poor Granville’s thinking is on entropy and related topics. Here’s a comment I made at UD about the paper he presented at the Biological Information ID conference:

    Timaeus,

    Scientific papers are judged by their contents. The contents of Granville’s paper are awful. Based on those contents, and using Granville’s own words, I have shown that Granville:

    1. Mistakenly asserts that “the increase in order which has occurred on Earth seems to violate the underlying principle behind the second law of thermodynamics, in a spectacular way.”

    2. Titles his paper Entropy, Evolution and Open Systems without realizing that the second law is actually irrelevant to his improbability argument, since it is not violated by evolution.

    3. Misunderstands the compensation argument and incorrectly rejects it.

    4. Fails to understand that the compensation argument is a direct consequence of the second law, and that by rejecting it he is rejecting the second law itself!

    5. Fails to realize that if the compensation argument were invalid, as he claims, then plants would violate the second law whenever their entropy decreased.

    6. Asserts, with no evidence, that physics alone cannot explain the appearance of complex artifacts on Earth.

    7. Offers, as evidence for the above, a thought experiment involving a simulation he can neither run nor analyze.

    8. Declares, despite being unable to run or analyze the simulation, that he is “certain” of the outcome, and that it supports his thesis.

    9. Confuses negentropy with complexity, as Lizzie explained.

    10. Conflates entropy with disorder, as Lizzie explained.

    Granville was unable to defend his paper, so he bailed out of the thread. You are now retreating also — probably a wise move. It remains to be seen what Eric and CS3 will do.

    If Lizzie and I are able to expose egregious faults in Granville’s paper, using his own words, and none of you are capable of defending it, then how can you claim that his paper was good science that deserved to be accepted by the BI organizers?

    By accepting Granville’s paper, the organizers showed that the BI was not a serious scientific conference. Springer did the right thing in refusing to publish.

  3. walto,

    I’ve posed this question to you multiple times, but you’ve refused to answer:

    If there is a single correct value for the entropy, then at least two of the observers must be wrong. You seem to think that all three are wrong: you reject Xavier’s and Yolanda’s values because they are in worse “epistemic positions” than Damon; yet you also reject Damon’s value of zero, despite the fact that he is in a perfect epistemic position.

    What is the single correct value, then, and how do you go about determining it?

    I’ve been pushing you to tackle this question for pedagogical reasons. You won’t be able to answer it, and the reason for that failure will teach you something important about entropy.

    Give it a shot, and then we can discuss it.

  4. Here, from his Entropy Demystified Ben-Naim gives the basic argument for his case (which keiths and mung take up here) that entropy is (and has always been) information:

    Before ending this section on entropy and information, I should mention a nagging problem that has hindered the acceptance of the interpretation of entropy as information. We recall that entropy was defined as a quantity of heat divided by temperature. As such, it has the units of energy divided by K (i.e., Joules over K or J/K, K being the units of the absolute temperature in Kelvin scale). These two are tangible, measurable and well-defined concepts. How is it that “information,” which is a dimensionless quantity, a number that has nothing to do with either energy or temperature, could be associated with entropy, a quantity that has been defined in terms of energy and temperature? I believe that this is a very valid point of concern which deserves some further examination. In fact, even Shannon himself recognized that his measure of information becomes identical with entropy only when it is multiplied by a constant k (now known as the Boltzmann constant), which has the units of energy divided by temperature. This in itself does not help much in proving that the two apparently very different concepts are identical. I believe there is a deeper reason for the difficulty of identifying entropy with information. I will elaborate on this issue on two levels.

    First, note that in the process….the change in entropy does involve some quantity of heat transferred as well as the temperature. But this is only one example of a spontaneous process. Consider the expansion of an ideal gas, or the mixing of two ideal gases. In both cases, the entropy increases. However, in both cases, there is no change in energy, no heat transfer, and no involvement of temperature. If you carry out these two processes for an ideal gas in an isolated condition, then the entropy change will be fixed, independent of the temperature at which the process has been carried out and obviously no heat transfer from one body to another is involved. These examples are only suggestive that entropy change does not necessarily involve units of energy and temperature.

    The second point is perhaps on a deeper level. The units of entropy (J/K) are not only unnecessary for entropy, but they should not be used to express entropy at all. The involvement of energy and temperature in the original definition of entropy is a historical accident, a relic of the pre-atomistic era of thermodynamics.

    Recall that temperature was defined earlier than entropy and earlier than the kinetic theory of heat. Kelvin introduced the absolute scale of temperature in 1854. Maxwell published his paper on the molecular distribution of velocities in 1859. This has led to the identification of temperature with the mean kinetic energy of atoms or molecules in the gas. Once the identification of temperature as a measure of the average kinetic energy of the atoms had been confirmed and accepted, there was no reason to keep the old units of K. One should redefine a new absolute temperature, denoting it tentatively as T̄, defined by T̄ = kT. The new temperature T̄ would have the units of energy and there should be no need for the Boltzmann constant k. The equation for the entropy would simply be S = ln W, and entropy would be rendered dimensionless!

    Had the kinetic theory of gases preceded Carnot, Clausius and Kelvin, the change in entropy would still have been defined as energy divided by temperature. But then this ratio would have been dimensionless. This will not only simplify Boltzmann’s formula for entropy, but will also facilitate the identification of the thermodynamic entropy with Shannon’s information. [I have removed the footnotes–W]

    I encourage those still reading this “amusing” thread to read Ben-Naim’s argument carefully and make their own judgments about its force. Then they can come to their own conclusions about this matter (which, as I’ve said, seems to me of very little scientific importance–other than maybe pedagogical).

    There may be some philosophical interest to this stuff, but the issues are complicated and, to me at least, difficult.

  5. In an attempt to de-mystify entropy, I will provide a Microsoft Excel spreadsheet where the user can enter values in the orange colored boxes for:

    moles gas:
    volume:
    temperature:

    The result of the computation is the absolute entropy of helium (to a reasonable approximation).

    I put some default numbers in, and you can see the absolute entropy in the red highlighted text.

    Here is the spreadsheet:
    Absolute Entropy Helium

    The URL is:
    creationevolutionuniversity.org/public_blogs/skepticalzone/absolute_entropy_helium.xls

    Change the numbers in the orange text and see the change in entropy. The numbers in the orange text define the energy dispersal parameters. All I did was put in the formula for the alternate form of the Sackur-Tetrode equation below.

    The details are tedious, but at root, this is just mapping three numbers (moles of gas, volume, temperature) to one number (entropy). The temperature is used by the spreadsheet to compute the amount of energy in the Sackur-Tetrode equation, along with the other things that are plugged into the Sackur-Tetrode equation below. If anyone wants to suffer through how the numbers are plugged in, they can study the spreadsheet. Ugh!

    For other materials and situations, the equations are different, but I don’t see the need to invoke all this “ignorance” definition of entropy, observer-dependent woo, and information theory.
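
    For anyone who would rather not open the spreadsheet, a minimal Python sketch of the same calculation (the Sackur-Tetrode equation for a monatomic ideal gas, with moles, volume, and temperature as inputs) looks roughly like this; the constants are standard values and the result is only a reasonable approximation at ordinary temperatures and densities:

    ```python
    import math

    # Standard physical constants (SI units)
    k_B = 1.380649e-23        # Boltzmann's constant, J/K
    h = 6.62607015e-34        # Planck's constant, J*s
    N_A = 6.02214076e23       # Avogadro's number, 1/mol
    m_He = 4.002602e-3 / N_A  # mass of one helium-4 atom, kg

    def sackur_tetrode_entropy(moles, volume_m3, T_K):
        """Absolute entropy (J/K) of a monatomic ideal gas:
        S = N*k_B*( ln( (V/N) * (2*pi*m*k_B*T/h**2)**1.5 ) + 5/2 )"""
        N = moles * N_A
        thermal = (2.0 * math.pi * m_He * k_B * T_K / h**2) ** 1.5
        return N * k_B * (math.log((volume_m3 / N) * thermal) + 2.5)

    # Example inputs: 1 mol of helium at 298.15 K in ~24.8 L (about 1 bar)
    S = sackur_tetrode_entropy(1.0, 0.02479, 298.15)
    print(f"S ≈ {S:.1f} J/K")  # close to the tabulated ~126 J/(mol*K) for helium gas
    ```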

  6. walto,

    Here, from his Entropy Demystified Ben-Naim gives the basic argument for his case (which keiths and mung take up here) that entropy is (and has always been) information:

    <snip>

    I encourage those still reading this “amusing” thread to read Ben-Naim’s argument carefully and make their own judgments about its force.

    The two points that Ben-Naim (correctly) makes in that short quote were already covered earlier in the thread, and that quote is by no means a complete argument for the “missing information” view of entropy. Readers should not base their judgments on that tiny snippet.

    I’ve given at least six reasons why the “energy dispersal” view cannot be correct. Here they are again:

    1. Entropy has the wrong units. Dispersalists and informationists agree that the units of thermodynamic entropy are joules per kelvin (in the SI system), and bits or nats in any system of units where temperature is defined in terms of energy per particle. It’s just that the dispersalists fail to notice that those are the wrong units for expressing energy dispersal.

    2. You cannot figure out the change in energy dispersal from the change in entropy alone. If entropy were a measure of energy dispersal, you’d be able to do that.

    3. The exact same ΔS (change in entropy) value can correspond to different ΔD (change in dispersal) values. They aren’t the same thing. Entropy is not a measure of energy dispersal.

    4. Entropy can change when there is no change in energy dispersal at all. We’ve talked about a simple mixing case where this happens. If entropy changes in a case where energy dispersal does not change, they aren’t the same thing.

    5. Entropy change in the gas mixing case depends on the distinguishability of particles — the fact that the observer can tell ideal gas A from ideal gas B. Yet the underlying physics does not “care” about distinguishabilty — the motion of the particles is the same whether or not they are distinguishable. If the motion of the particles is the same, then energy dispersal is the same.

    The entropy change depends on distinguishability, so it cannot be a measure of energy dispersal.

    6. Entropy depends on the choice of macrostate. Energy dispersal does not.

    The way energy disperses in a system is dependent on the sequence of microstates it “visits” in the phase space. That sequence depends only on the physics of the system, not on the choice of macrostate by the observer.

    In the Xavier/Yolanda example, Yolanda possesses equipment that allows her to distinguish between the two isotopes of gas X, which we called X0 and X1.

    If she chooses to use her equipment, she comes up with a different macrostate than Xavier and calculates a different entropy value. If she declines to use the equipment, she and Xavier come up with the same macrostate and get the same entropy value. Her choice has no effect on the actual physics, and thus no effect on the actual energy dispersal. Yet her choice does affect the entropy value she measures, and profoundly so.

    She is not measuring energy dispersal. She is measuring the information gap between her macrostate and the actual microstate.

    Each of those points is sufficient by itself to scuttle the “entropy as energy dispersal” misconception. Good luck to any dispersalist who tries to rebut all of them.
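
    To put a rough number on point 5, here is a small sketch based on the standard ideal-gas mixing-entropy formula, ΔS = -nR Σ x_i ln x_i (illustrative assumptions only: one mole of each isotope, ideal-gas behavior, equal temperature and pressure). If the two species are treated as distinct, mixing produces a positive ΔS; if the observer cannot or does not distinguish them, the same formula gives zero, even though the particles move exactly the same way in both cases:

    ```python
    import math

    R = 8.314462618  # gas constant, J/(mol*K)

    def mixing_entropy(n_total_mol, mole_fractions):
        """dS_mix = -n*R*sum(x_i ln x_i) for ideal gases mixed at the same T and P.
        Only species the observer treats as distinct get their own mole fraction."""
        return -n_total_mol * R * sum(x * math.log(x) for x in mole_fractions if x > 0.0)

    # Yolanda resolves the two isotopes X0 and X1 (one mole of each):
    print(mixing_entropy(2.0, [0.5, 0.5]))  # ~11.5 J/K

    # Xavier treats the contents as a single gas X, so there is one "species":
    print(mixing_entropy(2.0, [1.0]))       # 0.0 J/K; same motion, different macrostate
    ```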

  7. keiths,

    I agree with your comments on Granville Sewell’s paper. Trying to give his argument its best shot, I find that reading his equations, all that is there is dispersal of the matter (say, carbon). If there is something there that allows chemical interactions, electrostatic effects, or even gravity, I’d be pleased to be corrected. So far others have reached the same conclusion — he leaves out all the interesting stuff that enables plants to grow, organisms to reproduce, etc. Then he declares that this equation shows that evolution can’t happen. Unfortunately for him he has also shown that plants can’t grow.

    Whether his “X-entropy” is an actual entropy may connect with the argument here. But as far as I can see whether or not it can be called entropy is irrelevant, in that either way there is no justification for his assertions about what can happen.

  8. walto,

    Then they can come to their own conclusions about this matter (which, as I’ve said, seems to me of very little scientific importance–other than maybe pedagogical).

    It’s of huge scientific importance. Concepts matter in science.

    It’s true that a scientist or engineer who is merely plugging numbers into the equations won’t see a difference. The equations work whether you understand them or not. So for the purposes of practical calculations, it doesn’t matter whether you see entropy as a measure of disorder, energy dispersal, or missing information.

    But if you actually want to understand entropy, and not just plug numbers into equations, then the concepts are crucial.

    Entropy is not disorder, and Lambert was right to argue against that common misconception. His mistake was to replace one misconception with another. Entropy is not energy dispersal, either, and if you think it is, you don’t understand entropy.

    Entropy is a measure of missing information — the gap between the information associated with the macrostate and the information associated with the microstate.

  9. Sal,

    For other materials and situations, the equations are different, but I don’t see the need to invoke all this “ignorance” definition of entropy, observer-dependent woo, and information theory.

    Here are six good reasons. Can you rebut any of them?

  10. keiths: Here are six good reasons. Can you rebut any of them?

    I think you should post them again. You haven’t done so for a couple of minutes and you’re obviously quite fond of them.

  11. Revisiting the question of the shuffled cards…..

    As I showed, we can calculate the entropy of any amount of copper. In theory we could calculate the entropy of paper and ink needed to make cards.

    Will that (thermodynamic) entropy figure change whether we shuffle the cards or not? NO!

    One could imagine “cards” engraved with the suits and rankings (Ace, 2, 3, 4, 5…10, Jack, Queen, King) on copper. We construct a “deck” made of these copper cards. We know from here that if the deck had a total weight of 1555 grams, the entropy is

    812.42 J/K

    Doesn’t matter how we order or “shuffle” the deck, the entropy is going to be 812.42 J/K unless we change the temperature.
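
    As a rough cross-check (my own back-of-envelope sketch, using the tabulated standard molar entropy of copper rather than the spreadsheet's integration), the quoted figure is about what one would expect for 1555 grams of copper at room temperature; small differences come down to the input data:

    ```python
    MOLAR_MASS_CU = 63.546   # g/mol
    S_MOLAR_CU_298 = 33.15   # J/(mol*K), tabulated standard molar entropy of Cu(s) at 298.15 K

    mass_g = 1555.0
    moles = mass_g / MOLAR_MASS_CU
    print(f"{moles:.2f} mol of copper -> S ≈ {moles * S_MOLAR_CU_298:.0f} J/K")
    # prints roughly 811 J/K, in line with the 812.42 J/K quoted above
    ```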

  12. Sal,

    Doesn’t matter how we order or “shuffle” the deck, the entropy is going to be 812.42 J/K unless we change the temperature.

    That’s almost, but not quite, correct. The order of the cards will have a small but nonzero effect on the thermodynamic entropy.

    See this comment.

  13. stcordova: The result of the computation is the absolute entropy of helium (to a reasonable approximation).

    Only if it’s warm enough, Sal.
    😉
    Get below a kelvin and it goes distinctly pear-shaped. Why is that, Sal?

    I also have a couple of questions about your calculation of the entropy of 1555 grams of copper = 812.42 J/K.
    One of your inputs was the specific heat of copper = 0.39 J/gram/K.
    Do tell:
    1) Do you normally quote results to 5 sig figs, when one of your inputs has 2 sig fig precision?
    2) How was that 0.39 J/g/K value derived? Experimentally?
    3) Is it constant?

    I would encourage you to think about the implications of your answers to 2 and 3.

  14. walto,

    With Sal in full retreat, it looks like you’re the only one left defending the “entropy as energy dispersal” view.

    Yet you can’t rebut my six points or answer my question.

    Any readers out there who would like to step in for walto?

  15. Here are Lambert’s answers to keiths’ SIX BELOVED POINTS:

    Information theorists and thermodynamics text authors like Herbert B. Callen and Myron Tribus take Shannon’s valid theory of errors and disorder in communication and by ‘merely’ substituting kB for k in -k∑ pi ln pi claim that “entropy is the quantitative measure of disorder in the relevant distribution of the system over its microstates.” [Callen, italics mine, to indicate a completely erroneous implication of the nature of a system’s occupancy of one microstate at one time in an equally erroneous connection of thermodynamic entropy with communications’ disorder.] Admittedly, both authors develop many pages of standard thermodynamics of systems, but a reader who scans only a page or hears only the Callen statement without an emphasis on how molecules behave spontaneously is seriously misled. This is not a normal substitution of one arbitrary mathematical constant for another. It constitutes insertion of the qualities and the behavior of all physical material systems into a purely mathematical equation without any consideration of the innate behavior of those physical systems. Although this may not be exactly like evaluating one’s obnoxious relatives in Chicago by some symbols in an equation from Einstein’s theory of general relativity, it is far more subtly erroneous because Shannon probability is a vital half of thermodynamic entropy, as described here:

    “There are two requisites for entropy change in chemistry. An increase in thermodynamic entropy is enabled in a process by the motional energy of molecules (that, in chemical reactions, can arise from the energy released from a bond energy change). However, entropy increase is only actualized if the process results in a larger number of arrangements (microstates) for the system’s energy, i.e., a final state that involves the most probable distribution for that energy under the new constraints.” [That probable distribution of Boltzmann is identical to Shannon’s.]

    “The two requisites, energy and probability, are both necessary for thermodynamic entropy change; neither is sufficient alone. In sharp contrast, information theory (often misleadingly called information “entropy”) is alone sufficient for its mathematical purposes, depending only on -k ∑ pi log pi, where k is an arbitrary constant that is not required to involve energy.”

    [Frank L. Lambert, J. Chem. Educ. 2007, 84, 1548-1550, http://entropysite.oxy.edu/ConFigEntPublicat.pdf]

    Arbitrarily replacing k by kB — rather than have it arise from thermodynamic necessity via Boltzmann’s probability of random molecular motion — does violence to thermodynamics. The set of conditions for the use or for the meaning of kB, of R with N, are nowhere present in information theory. Thus, conclusions drawn from the facile substitution of kB for k, without any discussion of the spontaneity of change in a thermodynamic system (compared to the hundreds of information “entropies”) and the source of that spontaneity (the ceaseless motion of atoms and molecules) are doomed to result in confusion and error. There is no justification for this attempt to inject the overt subjectivity of “disorder” from communications into thermodynamic entropy.

    http://entropysite.oxy.edu/boltzmann.html
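
    For what it is worth, the bare arithmetic both sides are arguing over can be shown in a few lines (a neutral sketch, not Lambert's own example): applied to the same probability distribution, the Gibbs and Shannon formulas differ only by the constant out front and the base of the logarithm, which is exactly why one side treats kB as a substantive physical ingredient and the other treats it as a unit conversion:

    ```python
    import math

    k_B = 1.380649e-23  # J/K

    def shannon_entropy_nats(probs):
        """H = -sum(p_i ln p_i), a dimensionless number of nats."""
        return -sum(p * math.log(p) for p in probs if p > 0.0)

    def gibbs_entropy(probs):
        """S = -k_B * sum(p_i ln p_i), in J/K: the same sum scaled by Boltzmann's constant."""
        return k_B * shannon_entropy_nats(probs)

    p = [0.25] * 4  # a toy distribution over four equally probable microstates
    print(shannon_entropy_nats(p))                # ln 4 ≈ 1.386 nats
    print(shannon_entropy_nats(p) / math.log(2))  # = 2 bits
    print(gibbs_entropy(p))                       # ≈ 1.91e-23 J/K
    ```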

  16. walto,

    Here are Lambert’s answers to keiths’ SIX BELOVED POINTS:

    How does any of that refute my six points? Be specific.

    Here’s a suggestion. Quote each of my points, one by one, and underneath each of them, explain how Lambert’s quote refutes it.

  17. Patrick,

    I would prefer a plugin that simply hid the comment until it was clicked on or inserted a link.

    Me too.

    If you keep doing this I have no reason to look for one. 😉

    When I finally succeed in escaping the clutches of my employer, I’ll look into an implementation of the “moderation as a subscription service” idea that Lizzie liked.

  18. keiths:
    walto,

    How does any of that refute my six points? Be specific.

    Here’s a suggestion. Quote each of my points, one by one, and underneath each of them, explain how Lambert’s quote refutes it.

    Thanks, but here’s my suggestion. You respond to each of Lambert’s objections to the information theory you have been plumping here and explain how you can rehabilitate your view in spite of his remarks. Then, if I’m not convinced by your attempt to rehab your view, I’ll chime in.

    Be specific.

  19. Moved a comment to Guano. Address the ideas, not the person and assume everyone is participating in good faith.

    Ok, if I provide the Sackur-Tetrode equation that is for monatomic ideal gases, comments that insinuate that I ever suggested Sackur-Tetrode applies necessarily to the liquid state are retarded and idiotic and trollish.

  20. Heh.

    walto:

    Here’s a quote from Lambert that answers all of your points.

    keiths:

    It does? How so?

    walto:

    Um, I’d rather not answer that question.

  21. walto,

    As for Lambert’s objection, you’ve already quoted the refutation. You just didn’t realize it.

    Lambert’s complaint is that the equation for thermodynamic entropy includes Boltzmann’s constant (kb), while the equation for informational entropy does not. He thinks that the informationists are therefore cheating by bringing kb into the equation:

    Arbitrarily replacing k by kB — rather than have it arise from thermodynamic necessity via Boltzmann’s probability of random molecular motion — does violence to thermodynamics. The set of conditions for the use or for the meaning of kB, of R with N, are nowhere present in information theory. Thus, conclusions drawn from the facile substitution of kB for k, without any discussion of the spontaneity of change in a thermodynamic system (compared to the hundreds of information “entropies”) and the source of that spontaneity (the ceaseless motion of atoms and molecules) are doomed to result in confusion and error. There is no justification for this attempt to inject the overt subjectivity of “disorder” from communications into thermodynamic entropy.

    That’s nonsense. The only reason kb even appears in the thermodynamic entropy equation is because of the choice of units. I explained that earlier in the thread:

    The fact that it’s [thermodynamic entropy is] usually expressed in units of joules per kelvin (J/K) is an accident of history, due to the definition of the kelvin as a base unit. Had the kelvin been defined in terms of energy, joules in the denominator would have cancelled out joules in the numerator and the clunky J/K notation would be unnecessary.

    Ben-Naim makes the same point here, in the very excerpt that you quoted, walto:

    The second point is perhaps on a deeper level. The units of entropy (J/K) are not only unnecessary for entropy, but they should not be used to express entropy at all. The involvement of energy and temperature in the original definition of entropy is a historical accident, a relic of the pre-atomistic era of thermodynamics.

    Recall that temperature was defined earlier than entropy and earlier than the kinetic theory of heat. Kelvin introduced the absolute scale of temperature in 1854. Maxwell published his paper on the molecular distribution of velocities in 1859. This has led to the identification of temperature with the mean kinetic energy of atoms or molecules in the gas. Once the identification of temperature as a measure of the average kinetic energy of the atoms had been confirmed and accepted, there was no reason to keep the old units of K. One should redefine a new absolute temperature, denoting it tentatively as T̄, defined by T̄ = kT. The new temperature T̄ would have the units of energy and there should be no need for the Boltzmann constant k. The equation for the entropy would simply be S = ln W, and entropy would be rendered dimensionless!

    Had the kinetic theory of gases preceded Carnot, Clausius and Kelvin, the change in entropy would still have been defined as energy divided by temperature. But then this ratio would have been dimensionless. This will not only simplify Boltzmann’s formula for entropy, but will also facilitate the identification of the thermodynamic entropy with Shannon’s information.

    Conclusion: Lambert is wrong. The use of kb is just an accident of history, caused by the fact that the kelvin is (unnecessarily) a base unit in the SI system.

    Physics obviously matters to thermodynamic entropy, but it does not get into the equations via kb. It gets in via the W in Boltzmann’s equation and via the probabilities in the Gibbs equation.

  22. stcordova: Ok, if I provide the Sackur-Tetrode equation that is for monatomic ideal gases, comments that insinuate that I ever suggested Sackur-Tetrode applies necessarily to the liquid state are retarded and idiotic and trollish.

    Sal, you seem very quick to take offense. You used Sackur-Tetrode to obtain “the absolute entropy of helium (to a reasonable approximation).”, without any qualification whatsoever. I was merely pointing out that Sackur-Tetrode goes off the rails at low temperatures; maybe I was a little unfair in referring to (very low) temperatures below one Kelvin — I was enjoying myself too much getting your spreadsheet to return negative values. You are correct that, at one Kelvin, 4He is no longer a gas, but the 3He can remain in gas form under these conditions, so, according to your specification, Sackur-Tetrode should still apply. Does it?
    Gosh there’s an interesting thought: what happens to the specific heat capacity of 4He and 3He under these circumstances?
    You never did answer my questions about specific heat capacity. Is it a constant? How do you know?

    P.S. Have you read Ben-Naim’s derivation of Sackur-Tetrode? He breaks it down into four components.

  23. keiths: walto,

    As for Lambert’s objection, you’ve already quoted the refutation. You just didn’t realize it.

    Lambert’s complaint is that the equation for thermodynamic entropy includes Boltzmann’s constant (kb), while the equation for informational entropy does not. He thinks that the informationists are therefore cheating by bringing kb into the equation:

    Arbitrarily replacing k by kB — rather than have it arise from thermodynamic necessity via Boltzmann’s probability of random molecular motion — does violence to thermodynamics. The set of conditions for the use or for the meaning of kB, of R with N, are nowhere present in information theory. Thus, conclusions drawn from the facile substitution of kB for k, without any discussion of the spontaneity of change in a thermodynamic system (compared to the hundreds of information “entropies”) and the source of that spontaneity (the ceaseless motion of atoms and molecules) are doomed to result in confusion and error. There is no justification for this attempt to inject the overt subjectivity of “disorder” from communications into thermodynamic entropy.

    That’s nonsense. The only reason kb even appears in the thermodynamic entropy equation is because of the choice of units. I explained that earlier in the thread:

    The fact that it’s [thermodynamic entropy is] usually expressed in units of joules per kelvin (J/K) is an accident of history, due to the definition of the kelvin as a base unit. Had the kelvin been defined in terms of energy, joules in the denominator would have cancelled out joules in the numerator and the clunky J/K notation would be unnecessary.

    Ben-Naim makes the same point here, in the very excerpt that you quoted, walto:

    The second point is perhaps on a deeper level. The units of entropy (J/K) are not only unnecessary for entropy, but they should not be used to express entropy at all. The involvement of energy and temperature in the original definition of entropy is a historical accident, a relic of the pre-atomistic era of thermodynamics.

    Recall that temperature was defined earlier than entropy and earlier than the kinetic theory of heat. Kelvin introduced the absolute scale of temperature in 1854. Maxwell published his paper on the molecular distribution of velocities in 1859. This has led to the identification of temperature with the mean kinetic energy of atoms or molecules in the gas. Once the identification of temperature as a measure of the average kinetic energy of the atoms had been confirmed and accepted, there was no reason to keep the old units of K. One should redefine a new absolute temperature, denoting it tentatively as T̄, defined by T̄ = kT. The new temperature T̄ would have the units of energy and there should be no need for the Boltzmann constant k. The equation for the entropy would simply be S = ln W, and entropy would be rendered dimensionless!

    Had the kinetic theory of gases preceded Carnot, Clausius and Kelvin, the change in entropy would still have been defined as energy divided by temperature. But then this ratio would have been dimensionless. This will not only simplify Boltzmann’s formula for entropy, but will also facilitate the identification of the thermodynamic entropy with Shannon’s information.

    Conclusion: Lambert is wrong. The use of kb is just an accident of history, caused by the fact that the kelvin is (unnecessarily) a base unit in the SI system.

    Physics obviously matters to thermodynamic entropy, but it does not get into the equations via kb. It gets in via the W in Boltzmann’s equation and via the probabilities in the Gibbs equation.

    I’m glad you put the Ben-Naim quote I excerpted up again. I guess you concur after all that it is key to this issue. As I said when I posted it, I encourage everyone to read it carefully and determine for themselves how much force it has.

    FWIW, this is the passage that I think is central (but again, WTHDIK):

    Once the identification of temperature as a measure of the average kinetic energy of the atoms had been confirmed and accepted, there was no reason to keep the old units of K. One should redefine a new absolute temperature, denoting it tentatively as T̄, defined by T̄ = kT. The new temperature T̄ would have the units of energy and there should be no need for the Boltzmann constant k. The equation for the entropy would simply be S = ln W….

  24. walto,

    I’m glad you put the Ben-Naim quote I excerpted up again.

    No, you aren’t glad. It shows that you had already quoted a refutation of Lambert’s objection without even realizing that it amounted to a refutation.

    I guess you concur after all that it is key to this issue. As I said when I posted it, I encourage everyone to read it carefully and determine for themselves how much force it has.

    You claimed it was Ben-Naim’s “basic argument for his case.” It isn’t, as I explained:

    The two points that Ben-Naim (correctly) makes in that short quote were already covered earlier in the thread, and that quote is by no means a complete argument for the “missing information” view of entropy. Readers should not base their judgments on that tiny snippet.

    walto:

    FWIW, this is the passage that I think is central (but again, WTHDIK):

    Once the identification of temperature as a measure of the average kinetic energy of the atoms had been confirmed and accepted, there was no reason to keep the old units of K. One should redefine a new absolute temperature, denoting it tentatively as T̄, defined by T̄ = kT. The new temperature T̄ would have the units of energy and there should be no need for the Boltzmann constant k. The equation for the entropy would simply be S = ln W….

    Both you and I quoted it. Problem is, you didn’t realize that it was a refutation of Lambert, who in your later quote falls into exactly the trap I described: thinking that the physics comes into the equation via kb. It doesn’t, and in fact kb is altogether unnecessary, as Ben-Naim and I point out.

  25. walto,

    I’ve shown you that Lambert’s objection is fallacious. Yet you claimed that it was somehow a response to all six of my points against dispersalism:

    Here are Lambert’s answers to keiths’ SIX BELOVED POINTS:

    How does that single fallacious objection refute each of my six points? Be specific.

  26. Thermodynamics
    an introduction to the physical theories of equilibrium thermostatistics and irreversible thermodynamics
    Herbert B. Callen
    John Wiley and Sons, Inc.
    1960

    Postulate I. There exist particular states (called equilibrium states) of simple systems that, macroscopically, are characterized completely by the internal energy U, the volume V, and the mole numbers N1, N2, … Nr of the chemical components.

    Postulate II. There exists a function (called the entropy S) of the extensive parameters of any composite system, defined for all equilibrium states and having the following property. The values assumed by the extensive parameters in the absence of an internal constraint are those that maximize the entropy over the manifold of constrained equilibrium states.

    Postulate III. The entropy of a composite system is additive over the constituent subsystems. The entropy is continuous and differentiable and is a monotonically increasing function of the energy.

    Postulate IV. The entropy of any system vanishes in the state for which … (that is, at the zero of temperature)

    This postulate is an extension, due to Planck, of the so-called Nernst postulate or third law of thermodynamics.

    The foregoing postulates are the logical bases of our development of thermodynamics.

  27. Thermodynamics is the study of the macroscopic consequences of myriads of atomic coordinates, which, by virtue of the statistical averaging, do not appear explicitly in a macroscopic description of a system.

    Now, what are the chief consequences of the existence of the “hidden” atomic modes of motion with which thermodynamics is concerned?

    – Callen (1960). p. 7

    Missing information, anyone?

  28. I’m repeating myself, yes. But Salvador claimed in his OP that there is no relationship between entropy and information. He was wrong.

  29. The particular simple states to which thermodynamics applies are called equilibrium states.

    – Callen (1960). p. 11

    If Salvador, or keiths, are not talking about equilibrium states, I question whether they are talking about entropy.

  30. Lambert:

    A fundamental problem engendered by general chemistry texts employing positional entropy (and in others not emphasizing that expression) is that gas expansion or fluid mixing is due to “the driving force of probability”, as one textbook states. Certainly probability, in the sense of a spatially broader and thus of a probably greater distribution for the motional energy of each constituent’s molecules, is an essential consideration in mixing or expansion, but this is not the interpretation of probability given in texts employing positional entropy.

    Probability is an essential consideration in mixing or expansion, in the sense of a spatially broader and thus of a probably greater distribution for the motional energy of each constituent’s molecules.

    Perhaps Sal can explain this.

  31. If Salvador, or keiths, are not talking about equilibrium states, I question whether they are talking about entropy.

    I can’t speak for Sal, but I’m talking about equilibrium states when I speak of entropy.

    However, Callen’s statement is incorrect:

    The particular simple states to which thermodynamics applies are called equilibrium states.

    Google ‘non-equilibrium thermodynamics’.

  32. Mung: Does your argument depend on non-equilibrium thermodynamics?

    You mean keiths’ argument? Maybe it would be helpful if somebody posted it again so we could all see it.

    keiths: 1. Entropy has the wrong units. Dispersalists and informationists agree that the units of thermodynamic entropy are joules per kelvin (in the SI system), and bits or nats in any system of units where temperature is defined in terms of energy per particle. It’s just that the dispersalists fail to notice that those are the wrong units for expressing energy dispersal.

    2. You cannot figure out the change in energy dispersal from the change in entropy alone. If entropy were a measure of energy dispersal, you’d be able to do that.

    3. The exact same ΔS (change in entropy) value can correspond to different ΔD (change in dispersal) values. They aren’t the same thing. Entropy is not a measure of energy dispersal.

    4. Entropy can change when there is no change in energy dispersal at all. We’ve talked about a simple mixing case where this happens. If entropy changes in a case where energy dispersal does not change, they aren’t the same thing.

    5. Entropy change in the gas mixing case depends on the distinguishability of particles — the fact that the observer can tell ideal gas A from ideal gas B. Yet the underlying physics does not “care” about distinguishabilty — the motion of the particles is the same whether or not they are distinguishable. If the motion of the particles is the same, then energy dispersal is the same.

    The entropy change depends on distinguishability, so it cannot be a measure of energy dispersal.

    6. Entropy depends on the choice of macrostate. Energy dispersal does not.

    The way energy disperses in a system is dependent on the sequence of microstates it “visits” in the phase space. That sequence depends only on the physics of the system, not on the choice of macrostate by the observer.

    In the Xavier/Yolanda example, Yolanda possesses equipment that allows her to distinguish between the two isotopes of gas X, which we called X0 and X1.

    If she chooses to use her equipment, she comes up with a different macrostate than Xavier and calculates a different entropy value. If she declines to use the equipment, she and Xavier come up with the same macrostate and get the same entropy value. Her choice has no effect on the actual physics, and thus no effect on the actual energy dispersal. Yet her choice does affect the entropy value she measures, and profoundly so.

    She is not measuring energy dispersal. She is measuring the information gap between her macrostate and the actual microstate.

  33. It seems to me that the point of thermodynamics is that no one knows the actual microstate. If that’s not an argument for “missing information,” I don’t know what would be.

  34. Lambert:

    There are two requisites for thermodynamic entropy change. An increase in thermodynamic entropy is enabled in chemistry by the motional energy of molecules (that, in chemical reactions, can arise from the energy released from a bond energy change).

    However, entropy increase is only actualized if the process makes available a larger number of arrangements for the system’s energy, that is, a final state that involves the most probable distribution of that energy under the new constraints.

  35. Lambert:

    An increase in thermodynamic entropy is enabled in chemistry by the motional energy of molecules (that, in chemical reactions, can arise from the energy released from a bond energy change). However, entropy increase is only actualized if the process makes available a larger number of arrangements for the system’s energy, a final state that involves the most probable distribution for that energy under the new final conditions. This can be seen in terms of removal of a constraint.

  36. Mung:
    It seems to me that the point of thermodynamics is that no one knows the actual microstate. If that’s not an argument for “missing information,” I don’t know what would be.

    As seen above, Lambert agrees:

    Mung: “The two requisites, energy and probability, are both necessary for thermodynamic entropy change; neither is sufficient alone. In sharp contrast, information theory (often misleadingly called information “entropy”) is alone sufficient for its mathematical purposes, depending only on -k ∑ pi log pi, where k is an arbitrary constant that is not required to involve energy.”

  37. walto,

    As seen above, Lambert agrees:

    The two requisites, energy and probability, are both necessary for thermodynamic entropy change; neither is sufficient alone. In sharp contrast, information theory (often misleadingly called information “entropy”) is alone sufficient for its mathematical purposes, depending only on -k ∑ pi log pi, where k is an arbitrary constant that is not required to involve energy.

    Again, Lambert’s mistake is to think that if k doesn’t involve energy, then the entropy in question can’t be thermodynamic entropy.

    That’s simply wrong. The J/K units are an accident of history, as explained above. Had the kelvin been defined in terms of energy, an entropy in J/K would have been energy/energy, and energy would cancel, leaving a dimensionless quantity.

    Physics comes into the entropy equations not via the kb constant, but via the number of microstates (the W in the Boltzmann/Planck equation) and the probabilities (the pi in the Gibbs equation).

    You can’t determine W and the pi’s without considering physics. That’s what puts the “thermodynamic” into “thermodynamic entropy”.

  38. keiths: You can’t determine W and the pi’s without considering physics. That’s what puts the “thermodynamic” into “thermodynamic entropy”.

    Yeah, I found Lambert’s argument just weird.

  39. Is that just obstinacy, or do you genuinely think that Lambert’s position is defensible?

Leave a Reply