In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue with his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (small shoe size suggests we’re dealing with a toddler, and therefore a non-reader), but shoe size is not reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or one of the founders of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Frank Lambert, “Boltzmann”

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger and Moran’s books aren’t on that list. Their books, however, did make the list of biochem books judged by some Australians as decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates

And there is the Clausius definition:

ΔS = ∫ dQ/T

where
ΔS = change in entropy
dQ = inexact differential of Q (heat)
T = absolute temperature
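
To make the connection between the two definitions concrete, here is a minimal numerical sketch (mine, not from any of the textbooks cited above), in Python, for melting one mole of ice at 273.15 K. The Clausius form gives the entropy change directly from the latent heat; the Boltzmann form then says how much the microstate count W must grow to account for it:

k = 1.380649e-23      # Boltzmann's constant, J/K
q_fusion = 6010.0     # approximate latent heat of fusion of ice, J/mol
T_melt = 273.15       # melting point of ice, K

# Clausius: delta-S = q_rev / T for a reversible isothermal process
delta_S = q_fusion / T_melt
print(delta_S)        # ~22 J/(mol K)

# Boltzmann: S = k ln W, so delta-S = k ln(W_liquid / W_solid)
ln_W_ratio = delta_S / k
print(ln_W_ratio)     # ~1.6e24 -- ln of the ratio of microstate counts, per mole

Nobody counts W for an ice cube directly; the Clausius ratio is what gets measured, and the Boltzmann relation says what that measurement implies about the growth in the microstate count.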

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. colewd: [ Sal says:] Professionals who work with entropy ( engineers and chemists) use the Claussius [sic] version. The mistake I see in this thread is the insistence there is only one way to define entropy correctly. There is not.

    I am interested to see if others agree with this. The multiple definition issue seems to be the cause of this lively discussion.

    This seems to be yet another one of Sal’s mindless assertions, based on his apparent ignorance of anything beyond High School physics and chemistry concepts. Clausius / Carnot is used to introduce high school students to entropy using dQ/T to discuss the theoretical efficiency of steam engines (well, ideal gases in pistons), and to explain to teenage chemistry students why endothermic reactions may be favored.

    Professional chemical engineers and mechanical engineers are more likely to be concerned with the economic efficiency of a process rather than the thermodynamic efficiency (e.g. ammonia is synthesized at high temperature: the equilibrium position is worse, but you get there much quicker) so activation energy is what drives the choice here. Engineers who design heat exchangers care about rates of heat transfer, not the entropy.
    And once we start talking about scientists, then dQ/T is rarely informative: chemists talk (sometimes somewhat loosely) about ‘freezing out’ degrees of freedom (chelate effect) to make binding less entropically disfavored, or the benefits of a conformationally constrained ligand. That’s counting available microstates all the way; there’s no talk of dQ/T…
    I could ask my physicist and materials scientist buddies what they use, but I somehow doubt that would change anyone’s opinions…

    There is only one way to define entropy correctly: minus rho log(rho).
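
    A minimal sketch of what “minus rho log(rho)” cashes out to numerically, assuming rho here is a discrete probability distribution over microstates: the Gibbs/Shannon form reduces to the familiar k ln W when all W accessible microstates are equally probable.

import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def gibbs_entropy(probs):
    """S = -k * sum(p_i * ln p_i) over a discrete probability distribution."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

W = 12
uniform = [1.0 / W] * W
print(gibbs_entropy(uniform))    # equals k * ln(12)
print(k * math.log(W))           # same number

# A non-uniform distribution over the same 12 states has lower entropy:
skewed = [0.5] + [0.5 / 11] * 11
print(gibbs_entropy(skewed) < gibbs_entropy(uniform))    # True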

  2. stcordova: Professionals who work with entropy (engineers and chemists) use the Claussius version.

    If only they could put the entropy in a bottle and sell it.

  3. DNA_Jock: This seems to be yet another one of Sal’s mindless assertions, based on his apparent ignorance of anything beyond High School physics and chemistry concepts. Clausius / Carnot is used to introduce high school students to entropy using dQ/T to discuss the theoretical efficiency of steam engines (well, ideal gases in pistons), and to explain to teenage chemistry students why endothermic reactions may be favored.

    Professional chemical engineers and mechanical engineers are more likely to be concerned with the economic efficiency of a process rather than the thermodynamic efficiency (e.g. ammonia is synthesized at high temperature: the equilibrium position is worse, but you get there much quicker) so activation energy is what drives the choice here. Engineers who design heat exchangers care about rates of heat transfer, not the entropy.
    And once we start talking about scientists, then dQ/T is rarely informative: chemists talk (sometimes somewhat loosely) about ‘freezing out’ degrees of freedom (chelate effect) to make binding less entropically disfavored, or the benefits of a conformationally constrained ligand. That’s counting available microstates all the way; there’s no talk of dQ/T…
    I could ask my physicist and materials scientist buddies what they use, but I somehow doubt that would change anyone’s opinions…

    There is only one way to define entropy correctly: minus rho log(rho).

    For Sal.

  4. walto,

    Sadly, though, he’s making much the same error here. It’s not that the missing information theory is wrong–it’s quite right. The thermodynamic entropy is CORRECTLY expressible as an amount of missing information. But the information is missing because particular physical characteristics are or are not exemplified in the world.

    To say that the information is missing means that an observer lacks it. (The information isn’t missing from the world, after all.) Another observer may lack a different amount of information. A third observer may lack none at all.

    Once again, it’s not the knowing or failure to know that makes it so.

    Entropy is a measure of missing information: the gap between the information contained in the macrostate and the information required to pin down the exact microstate. The choice of experiments/measurements determines the macrostate, and the entropy is a function of the macrostate.

    Yolanda measures isotopic concentrations, while Xavier does not. They end up with different macrostates and therefore different entropies. The difference is not in the system — they’re observing the same one.

    The difference is in the macrostates, which are observer-dependent. Entropy is observer-dependent.

  5. keiths,
    I mostly agree with that post. I’d put two things a bit differently, though. And, as said, this is a tempest in a teapot.

    keiths: They end up with different macrostates….

    The difference is in the macrostates,

    I suppose it’s OK to put it that the macrostates are different as between the assessments of the two observers, but only if one remembers that if one uses that sort of locution, “macrostates” are maps, not territories. I prefer to think of macrostates as states of the world, supervening on microstates, and the world doesn’t change from Xavier’s observation to Yolanda’s. The entropies do, however, because the entropies they calculate are a function of the information each observer has and lacks. One has more, and so calculates a lower entropy. So I think it’s less misleading to think of the entropies as maps than as territories, because the world is as it is.

    Recognize both sides of this coin–the necessity of the physical properties and the delimitation of the relevant ones on one side, and, on the other, that the calculation is a matter of measuring what is not known about the microstates from this or that perspective, and this whole matter is little more than a quibble. Exciting for those who like quibbles (or like to fight), but, IMO, likely to be of little interest either to philosophers or to working scientists.

  6. Quoted by walto. Glad to respond because Walto brought it to my attention.

    DNA_jock:

    This seems to be yet another one of Sal’s mindless assertions, based on his apparent ignorance of anything beyond High School physics and chemistry concepts.

    I took graduate level statistical mechanics and thermodynamics from Johns Hopkins University Whiting School of Engineering in addition to undergraduate thermodynamics for mechanical engineers which I took as an elective as a student of electrical engineering.

    You’re welcome to remain in your world of self delusions about my educational attainment in these topics.

    I provided three different ways to calculate entropy from numerous examples.

    I could ask my physicist and materials scientist buddies

    I studied under physicists and material scientists at the Applied Physics Lab of JHU.

    Bloviate away, DNA_jock. How about you provide for the readers examples of your skill in computing the entropy change of a melting ice cube using your knowledge of Boltzmann microstates or Shannon entropy, to the exclusion of the Clausius definition that involves heat and temperature?

    Here are entropy measurements at the molecular level being done on protein-ligand interactions. Do you know why we do things empirically with Clausius vs. Shannon? We have to confirm our theoretical models, and complex folded molecules like proteins are pretty hard to model and compute the number of Boltzmann microstates involved.

    A recent paper by Tamura and Privalov (TP) (1997) purports
    to make a direct measurement of the translational/rotational
    entropy contribution to the formation of a dimer of the
    Streptomyces subtilisin inhibitor (SSI). TP studied the temperature-
    induced unfolding of the wild-type dimer and of a mutant
    (D83C), with a disulfide cross-link between the subunits, by
    differential scanning calorimetry; the measurements were made
    as a function of concentration. When normalized to the same
    temperature, the entropy of unfolding of the native SSI dimer
    adjusted to a 1 M standard state was found to be approximately
    the same as that of the cross-linked mutant dimer; the measured
    difference was –5 6 4 cal/mol·K.

    http://peds.oxfordjournals.org/content/12/3/185.full.pdf

    Show for the readers how you do this measurement with your non-existent
    “missing information” meters or your Boltzmann microstate counters that you and Keiths swear by.

    DNA_jock:

    This seems to be yet another one of Sal’s mindless assertions, based on his apparent ignorance of anything beyond High School physics and chemistry concepts.

    This is a good example of why you’re on my ignore list. You’ll insist I don’t know something; you’ll delude yourself that I only know high-school-level statistical mechanics and thermodynamics because you refuse to take my word for it that I actually have more academic training than that. Wallow in your self-delusions, but I’m not eager to spend time responding unless Walto or colewd request a response from me regarding your charge.

    The paper referenced above talked about Differential Scanning Calorimetry, which was used to measure (infer) entropy change.

    https://en.wikipedia.org/wiki/Differential_scanning_calorimetry

    Differential scanning calorimetry or DSC is a thermoanalytical technique in which the difference in the amount of heat required to increase the temperature of a sample and reference is measured as a function of temperature. Both the sample and reference are maintained at nearly the same temperature throughout the experiment. Generally, the temperature program for a DSC analysis is designed such that the sample holder temperature increases linearly as a function of time. The reference sample should have a well-defined heat capacity over the range of temperatures to be scanned.

    In contrast, you’re welcome to apprise the readers of your non-existent “missing information” meters or microstate counters, since you present yourself as having expert knowledge in these matters. Go ahead, DNA_jock, or how about you just buzz off this discussion since you’re not adding much to it.

  7. keiths: Entropy is a measure of missing information: the gap between the information contained in the macrostate and the information required to pin down the exact microstate.

    How do we measure the information contained in the macrostate?

  8. DNA_jock as quoted by Walto:

    dQ/T is rarely informative:

    Tell that to Dr. Mike Elzinga who is a physicist and professional in thermodynamics!

    That’s counting available microstates all the way; there’s no talk of dQ/T…

    So how do you count the microstates DNA_jock? LOL!

    How do you confirm you actually counted them right? Like, ahem, using calorimeters and thermometers.

    DNA_jock as quoted by Walto:

    chemists talk (sometimes somewhat loosely) about ‘freezing out’ degrees of freedom (chelate effect) to make binding less entropically disfavored, or the benefits of a conformationally constrained ligand. That’s counting available microstates all the way; there’s no talk of dQ/T…

    Baloney! I just gave an example of analysis of ligand and protein binding and measuring entropy; it was done with calorimeters and thermometers, the old-fashioned Clausius way, not using your non-existent microstate counters.

    How do you know what the physical degrees of freedom are for proteins? Have you solved all protein folding problems? LOL!

    https://www.ncbi.nlm.nih.gov/pubmed/9541869

    the protein folding problem belongs to a large set of problems that are believed to be computationally intractable

    You better bust out those non-existent “missing information” meters and those non-existent microstate counters, DNA_jock, because it may not be so easy to calculate the microstates of proteins from first principles.

    dQ/T is rarely informative:

    Tell that to Dr. Mike.

  9. stcordova: Have you solved all protein folding problems? LOL!

    Yeah DNA_Jock. Have you solved all protein folding problems? Have you? So there!

  10. Oh, Sal,
    You appear to have shot yourself in the foot again. Another indication that you took all those fancy classes, but still don’t seem to have learnt anything.
    In your attempt to refute my statement that

    And once we start talking about scientists, then dQ/T is rarely informative: chemists talk (sometimes somewhat loosely) about ‘freezing out’ degrees of freedom (chelate effect) to make binding less entropically disfavored, or the benefits of a conformationally constrained ligand. That’s counting available microstates all the way; there’s no talk of dQ/T…

    you cite Karplus and Janin, 1999, quoting them thus:

    A recent paper by Tamura and Privalov (TP) (1997) purports
    to make a direct measurement of the translational/rotational
    entropy contribution to the formation of a dimer of the
    Streptomyces subtilisin inhibitor (SSI). TP studied the temperature-
    induced unfolding of the wild-type dimer and of a mutant
    (D83C), with a disulfide cross-link between the subunits, by
    differential scanning calorimetry; the measurements were made
    as a function of concentration. When normalized to the same
    temperature, the entropy of unfolding of the native SSI dimer
    adjusted to a 1 M standard state was found to be approximately
    the same as that of the cross-linked mutant dimer; the measured
    difference was –5 6 4 cal/mol·K. [actually, it should be 5 +/- 4 cal/mol.K, which is close to zero… -564 cal/mol·K. would be YUUUGE]

    Reading your quote, the word “purports” jumps out at me. What are the authors actually saying here? Well it’s only a one page commentary, so I encourage readers here to look for themselves…
    Here’s the paragraph preceding the one that Sal quote-mined:

    An understanding of the origin of the entropy loss involved in association processes (protein–protein or protein–ligand) is important for an interpretation of the many equilibria that play a role in biology. Although the total entropy change on association can be determined by measuring the transition temperature and associated enthalpy change (Privalov et al ., 1995), interpretation of the results is hindered by the fact that there are several partially cancelling contributions [Tidor and Karplus, 1993 (TK)]. The loss of three translations and three (two for a linear molecule) rotations is entropically unfavorable. This can be offset by favorable entropic contributions, including those that arise from the hydrophobic effect, changes in the protonation states, solvent and counter ion release, as well as the presence of six (five for a linear molecule) new vibrational degrees of freedom. Because the translational and rotational contributions to the entropy of binding appear to be very simple to calculate, they have often been discussed. However, there are no direct measurements of their contribution to binding and theoretical estimates vary by more than an order of magnitude.

    It’s all about degrees of freedom, and nary a mention of dQ/T…
    Karplus and Janin go on to explain that Tamura and Privalov did the wrong experiment.
    Sal has, once again, conflated an experimental technique (Differential Scanning Calorimetry) that is used in an attempt to measure entropy changes (in this case, unsuccessfully) with the underlying understanding about what is actually going on: the freezing out of degrees of freedom, including the hydrophobic effect. You may have missed my mentioning the latter, what with the fingers in the ears an’ all.
    You are making my point for me. Yet again.

  11. stcordova: I took graduate level statistical mechanics and thermodynamics from Johns Hopkins University Whiting School of Engineering in addition to undergraduate thermodynamics for mechanical engineers which I took as an elective as a student of electrical engineering.

    Oh, thermo courses for engineers.
    My father and brother got their engineering degrees from a university that, almost uniquely, requires their students to understand the equations they plug values into.
    Either JHU falls in the “just plug in the numbers” category, or Sal is not representative of their engineering graduates. Or both.

    From Lambert’s website.

    What is entropy good for?

    Of course, we have a start on a good answer to that question: it must have something to do with the motional energy (plus any phase change potential energy) in the substance at that temperature. However, we need one more fact about energy and entropy for a complete understanding. At absolute zero, 0 K, all substances are assumed to have zero entropy. Then, when a substance is warmed from 0 K, more and more motional energy is added to it as its temperature increases to the ‘standard entropy temp’ of 298.15 K. (From what we have already discussed, a substance such as hydrogen gas would require additional energy to change phases from a solid at 0 K to a liquid and then more to change from liquid to gas before the temperature reaches 298 K.) So, it is probable that a solid like carbon at 298 K (as a diamond) has a low standard state entropy (“So”) of 2 J/K mol whereas liquid water has a medium So of 70 J/K mol and gaseous hydrogen gas has an So of 131 J/K mol. Liquids and gases need more energy because they have had to be changed from a solid to a liquid and then to be a gas at 298.15 K and stay that way at that temperature.

    There. You have a pretty good idea of why the standard entropies are larger or smaller for various substances and for various states — solid or otherwise. The larger the number for entropy in the tables of standard entropy values, the more was the quantity of energy that had to be dispersed from the surroundings to that substance for it to exist and be stable at 298.15 K rather than 0 K!

    Then, it’s obvious why “liquid forms of substances have higher standard entropies than their solid form” — the solid had to be given additional energy (enthalpy) for fusion so the molecules could move more freely. The amount of energy in their ΔH/T of fusion/melting has to be added to the entropy of the solid substance. No problem. But why does pure carbon in the form of solid graphite have a higher So (6 J/K mol) than pure carbon in the form of solid diamond (2 J/K mol)? That means that graphite must need energy to move in some way that diamond can’t. Aha — diamond is totally rigid, each carbon atom tightly held in a tetrahedral framework. Graphite is totally different — its atoms are even more tightly held in layers, but those layers are like sheets of paper in a pile. The individual atoms can’t move readily, but the sheets of atoms can slide over one another a tiny bit without great difficulty. That requires some energy and therefore it means that graphite has to have more energy in it than diamond at 298.15 K.

    http://entropysite.oxy.edu/wiki_entropy.html

    Lambert could have gone into the complexities of microstates and degrees of freedom, but he gave a common sense explanation.

    And I highlight again this simple echoing of Clausius by Lambert:

    The larger the number for entropy in the tables of standard entropy values, the more was the quantity of energy that had to be dispersed from the surroundings to that substance for it to exist and be stable at 298.15 K rather than 0 K!

    Let DNA_jock, Keiths, and Mung explain entropy like this with “missing information”; let them explain the differences in entropy of graphite and diamond with “missing information” in as clear and lucid a way as Lambert did.

  13. Sal,

    I took graduate level statistical mechanics and thermodynamics from Johns Hopkins University Whiting School of Engineering in addition to undergraduate thermodynamics for mechanical engineers which I took as an elective as a student of electrical engineering.

    You’ve also described yourself as “a mediocre student of science and engineering at best”. Based on your performance in this thread, I think we can all agree on that.

    The question is, are you capable of learning from those who are not mediocre students?

  14. DNA_jock as quoted by Walto:

    Engineers who design heat exchangers care about rates of heat transfer, not the entropy.

    From wiki entry on heat exchangers:

    There are three goals that are normally considered in the optimal design of heat exchangers: (1) Minimizing the pressure drop (pumping power), (2) Maximizing the thermal performance and (3) Minimizing the entropy generation (thermodynamic). See for example:[17]

    https://en.wikipedia.org/wiki/Heat_exchanger#cite_note-17

    http://www.sciencedirect.com/science/article/pii/S0360544213008074

    Curved pipes constitute essential components in engineering systems such as heat exchangers, and process pipelines. In curved pipes with 180° bends, the excess pressure drop, and the excess entropy generation (through heat and fluid flow) bring forth serious penalties. To overcome these difficulties a new partially curved pipe will be examined in this paper. It consists of three straight pipe segments connected with two 90° bends. The pressure drop and entropy generation are determined numerically for several configurations of ‘partially curved’ pipes when the fluid flow is laminar, viscous and incompressible. It is shown that the new curved pipe is advantageous because the pressure drop and entropy generation are considerably reduced when implementing the optimum layout, compared to the standard case of a fully curved section with 180° bend. As an added value, it is shown that the new optimum partially curved pipes are almost independent of Reynolds and Prandtl numbers.

    Maybe I should undo my ignore button so I can see the latest ways DNA_jock embarrasses himself. Hahaha!

  15. walto,

    From that comment, it appears that you now concede that entropy is observer-dependent.

    Just to be clear, do you now accept that both Xavier and Yolanda correctly calculate the entropy, and that your earlier claim — that Yolanda’s entropy value is more accurate than Xavier’s — is incorrect?

  16. Mung,

    How do we measure the information contained in the macrostate?

    You don’t have to. You can go straight to the difference — the entropy.

    Take the “odds before evens” macrostate for the deck of five cards:

    Now suppose that instead of being randomly shuffled, the deck has been prepared this way:

    1) the odd cards have been separated from the evens;
    2) the two “subdecks” have been separately shuffled; and
    3) the “odd” subdeck has been placed atop the “even” subdeck.

    Let’s call this the “odds before evens” macrostate.

    For this macrostate, the number of possible microstates is no longer 120. Some of the sequences have been ruled out, such as this one: 1 4 5 3 2. It doesn’t satisfy the “odds before evens” criterion. There are only 12 sequences that do.

    In other words, the macrostate “odds before evens” corresponds to an ensemble of 12 microstates.

    There are only 12 microstates that are compatible with the macrostate. Each of them is equally probable. The entropy is therefore log(12).

    That is, the entropy is the difference between knowing the exact microstate and knowing that it is one of 12 equally probable possibilities.
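
    The count is easy to check by brute force (a minimal Python sketch; natural log is used for the entropy, since the choice of log base only sets the units):

from itertools import permutations
from math import log

cards = (1, 2, 3, 4, 5)
odds, evens = {1, 3, 5}, {2, 4}

def odds_before_evens(seq):
    # every odd card must appear earlier in the sequence than every even card
    last_odd = max(i for i, c in enumerate(seq) if c in odds)
    first_even = min(i for i, c in enumerate(seq) if c in evens)
    return last_odd < first_even

all_orders = list(permutations(cards))
compatible = [p for p in all_orders if odds_before_evens(p)]

print(len(all_orders))       # 120 microstates for a fully shuffled deck
print(len(compatible))       # 12 microstates compatible with "odds before evens"
print(log(len(compatible)))  # entropy = ln(12), about 2.48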

  17. stcordova,

    Wow, just wow!
    The statement you highlight

    The larger the number for entropy in the tables of standard entropy values, the more was the quantity of energy that had to be dispersed from the surroundings to that substance for it to exist and be stable at 298.15 K rather than 0 K!

    is wrong. Whether it is stable or not is not the point; a (reasonably stable) state is chosen by convention and must be specified. His use of “dispersed” is rather gratuitous, too. Energy “absorbed” would do just fine. Is he trying to argue in favor of a particular view, ya think?
    Then the howler:

    Graphite is totally different — its atoms are even more tightly held in layers, but those layers are like sheets of paper in a pile. The individual atoms can’t move readily, but the sheets of atoms can slide over one another a tiny bit without great difficulty. That requires some energy and therefore it means that graphite has to have more energy in it than diamond at 298.15 K.

    No it doesn’t “require some energy”. Just think man! The mere fact that the atoms CAN slide over each other cannot require more energy. Only if they DO so move is any energy involved. It’s an additional degree of freedom, and therefore, there are more microstates, and there’s higher entropy in graphite than diamond.
    Yikes.

    Regarding heat exchangers, I’m guessing that Hajmohammadi et al. added that bloviation about entropy in order to get their goofy paper published in the Journal “Energy”. Color me underwhelmed by the result that a 180° bend causes a greater pressure drop than separated 90° bends. It’s the pressure drop that matters.
    I mean the paper begins:

    Curved pipes constitute essential components in engineering systems such as heat exchangers, and process pipelines.

    Say what? Not since 1923, they aren’t.

  18. DNA_Jock: My father and brother got their engineering degrees from a university that, almost uniquely, requires their students to understand the equations they plug values into.

    Yeah. Someone could make an Excel spreadsheet that pumps out the results of calculations. Oh. Wait.

  19. stcordova: Maybe I should undo my ignore button so I can see the latest ways DNA_jock embarrasses himself. Hahaha!

    My suggestion is that you undo your Ignore button and just try to have a normal discussion.

  20. keiths: Entropy is a measure of missing information: the gap between the information contained in the macrostate and the information required to pin down the exact microstate.

    Mung: How do we measure the information contained in the macrostate?

    keiths: You don’t have to. You can go straight to the difference — the entropy.

    Entropy is a measure of the information gap. That is what you wrote. If I don’t know where I started, and where I ended up, then how do I know what the difference was?

    Information contained in the macrostate.
    Information contained in the microstate.
    Measure the difference
    Call it Entropy.

    How do you decide the amount of information contained in the macrostate? If you can’t say, I’ll understand completely. All along in this thread I think you speak too loosely and fail to make important distinctions.

    Let me ask a little differently. How do you quantify “the amount of information” present in the macrostate? That quantity would seem to me to be very important to anyone who thinks entropy is qualitative.

  21. stcordova: The larger the number for entropy in the tables of standard entropy values, the more was the quantity of energy that had to be dispersed from the surroundings to that substance for it to exist and be stable at 298.15 K rather than 0 K!

    So all substances are energy dispersal sinks, else they would not exist in any stable form? I readily confess my immeasurable ignorance here.

    So in the “energy dispersal” view energy is dispersed from the environment (whatever that means) and into substances? How does that work for an isolated system?

  22. What Does a Partition Function Tell You?

    The partition function is the connection between macroscopic thermodynamic properties and microscopic models. It is a sum of Boltzmann factors that specify how particles are partitioned throughout accessible states.

    – Molecular Driving Forces, p. 176

  23. Entropy is of great use, and in fact is essential, in engineering and chemical problems. However, what entropy “is,” and what are the limits of its application, are not explained at all by the classical theory. English-speaking physicists, in particular, have had a very difficult time understanding it, though they have been able to apply it with great power and ingenuity. Engineers have been very foggy on entropy, and to all appearances remain so, while engineering students have learned to just follow the rules and think of more pleasant things.

    https://mysite.du.edu/~jcalvert/phys/boltz.htm

    LoL.

  24. The Boltzmann distribution says that more particles will have low energies and fewer particles will have high energies. Why? Particles don’t have an intrinsic preference for lower energy levels. Fundamentally, all energy levels are equivalent. Rather, there are more arrangements of the system that way.

    Molecular Driving Forces, p. 173
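
    A toy two-level system makes the arithmetic behind that explicit (a sketch with made-up energy levels, not an example from the book): the Boltzmann factors weight each level, the partition function normalizes them, and more particles end up in the lower level simply because that weighting is larger.

import math

k = 1.380649e-23      # Boltzmann's constant, J/K
T = 298.15            # temperature, K
kT = k * T

energies = [0.0, 2.0 * kT]     # hypothetical two-level system

boltzmann_factors = [math.exp(-E / kT) for E in energies]
q = sum(boltzmann_factors)                       # the partition function
populations = [f / q for f in boltzmann_factors]

print(populations)    # ~[0.88, 0.12]: most particles occupy the low-energy level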

    Thermodynamic logic often seems complex. This apparent complexity arises because the fundamental quantities that predict equilibria are not directly measurable. Equilibria are governed by energy and entropy through the First and Second Laws, but unfortunately there are no ‘meters’ that measure energy or entropy. Instead, inferences about equilibria are indirect and drawn from observations of quantities that can be measured, such as temperature, pressure, work, heat capacities, concentrations, or electrical potentials. Thermodynamics is a business of making clever inferences about unmeasurable quantities from observable ones, by various means.

    – Molecular Driving Forces, p. 109

    Sorry Sal.

  26. Moved a comment to Guano. Please address the ideas and not perceived proclivities of particular participants.

    ETA: More alliteration.

    It is very difficult to get an exact calculation of microstates, because most mathematical models are idealizations to make them tractable. Sometimes, if the system is simple, we can get a good approximation and calculate microstates, but the empirical measurement has the final say about the accuracy of the model. That’s why this is a silly statement:

    DNA_jock as quoted by Walto:

    And once we start talking about scientists, then dQ/T is rarely informative

    That’s counting available microstates all the way; there’s no talk of dQ/T…

    If we knew specific heats exactly via computation, we could compute microstates to the Nth decimal, but we can’t; our math models are idealizations. Where are DNA_jock’s microstate-counting machines?

    Hence one cannot count microstates “all the way” as DNA_jock asserted. The only way I counted microstates for the case of a melting ice cube was to first compute dQ/T, so I really didn’t count them, did I?

    DNA_jock has yet to show he can actually count the microstates of a simple melting ice cube accurately without dQ/T!

    When two-time Nobel Prize winner Linus Pauling computed the residual entropy of ice (i.e., counted microstates), it wasn’t counting all the way. Pauling was pretty accurate, but not exact. Even Pauling had to reference the experimentally measured values to confirm his count of microstates was reasonably accurate!

    http://www.uni-leipzig.de/~biophy09/Biophysik-Vorlesung_2009-2010_DATA/QUELLEN/LIT/A/B/3/Pauling_1935_structure_entropy_ice.pdf

    It is suggested that ice consists of water molecules arranged so that each is surrounded by four others, each molecule being oriented in such a way as to direct its two hydrogen atoms toward two of the four neighbors, forming hydrogen bonds. The orientations are further restricted by the requirement that only one hydrogen atom lie near each O-O axis. There are (3/2)^N such configurations for N molecules, leading to a residual entropy of R ln (3/2) = 0.805 E.U., in good agreement with the experimental value of 0.87 E.U.

    Linus Pauling
    September 24, 1935
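
    Pauling’s number is easy to verify (a quick sketch; “E.U.” in the quote is the old entropy unit, cal/(mol·K)):

import math

R = 8.314462      # gas constant, J/(mol K)
CAL = 4.184       # joules per calorie

residual_S = R * math.log(3 / 2)     # R ln(3/2), from Pauling's (3/2)^N count
print(residual_S)                    # ~3.37 J/(mol K)
print(residual_S / CAL)              # ~0.806 cal/(mol K), Pauling's 0.805 E.U. to rounding
# The experimentally inferred residual entropy quoted above is ~0.87 E.U.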

    Of course we know more now than back in 1935, but it is not so easy to actually count microstates! We use the empirical data to see how accurate our mathematical idealizations or computational numerical models are. That’s why this is a ridiculous statement:

    DNA_jock as quoted by Walto:

    And once we start talking about scientists, then dQ/T is rarely informative

    Tell that to Dr. Mike, tell that to Linus Pauling.

    The empirical measurements of entropy supported the hypothesized molecular models of ice structure, not the other way around. We sometimes don’t have an accurate model of the molecular structure first that allows us to count microstates, we have to make measurements of entropy and then that informs us if our math model is correct or not. We compute the count of microstates from the math model, and then we compare it to the empirical measurements of entropy to see if we counted correctly and hence had the right model. DNA_jock’s comment is a mess.

    DNA_jock as quoted by Walto:

    And once we start talking about scientists, then dQ/T is rarely informative

    ROTFL! How do you think Linus Pauling concluded the 4-hydrogen-bond model of ice was likely correct? Er, he referenced the actual measurements of internal energy (from which we can compute entropy).

    If we could actually count microstates exactly, or compute specific heats exactly, what would be the need for the Differential Scanning Calorimeters used by chemists and physicists and material scientists, etc.?

    In addition to differential scanning calorimetry we have other techniques to measure dQ/T.

    Isothermal microcalorimetry
    Isothermal titration calorimetry
    Dynamic mechanical analysis
    Thermomechanical analysis
    Thermogravimetric analysis
    Differential thermal analysis
    Dielectric thermal analysis

    Take a relatively simple phase state, gases. We have idealizations of mono-atomic, di-atomic, poly-atomic. From specific heats we can compute entropy, but note, the specific heats derived by idealized math models of the ideal gases do not agree exactly (albeit pretty closely) with empirical measurements.

    See for yourself that there is a slight difference between the theoretical specific heat (and hence entropy) and the actual specific heat. The Sackur-Tetrode equation is only an approximation and will definitely fail when the gases cool and liquefy or solidify.

    You can see the estimates of the specific heat (and thus the resulting entropy figures) aren’t exact: not too bad for mono- and diatomic gases, but starting to get awful for polyatomic gases. Things will generally get worse for the solid state, and nightmarish for the liquid state of substances.

    Bottom line, DNA_jock’s comment is stupid. There is a reason we have Differential Scanning Calorimeters and other fancy ways to measure heat and temperature: we don’t count microstates all the way.
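
    For what it’s worth, the monatomic case above is easy to check directly (a sketch assuming argon behaves as an ideal monatomic gas at 298.15 K and 1 bar; the tabulated standard molar entropy of argon is about 154.8 J/(mol·K)):

import math

k  = 1.380649e-23      # Boltzmann's constant, J/K
h  = 6.62607015e-34    # Planck's constant, J s
NA = 6.02214076e23     # Avogadro's number, 1/mol
R  = k * NA            # gas constant, J/(mol K)

def sackur_tetrode(molar_mass_kg, T, P):
    """Molar entropy of an ideal monatomic gas:
    S = R * [ ln( (V/N) * (2*pi*m*k*T/h**2)**1.5 ) + 5/2 ]"""
    m = molar_mass_kg / NA             # mass of one atom, kg
    v_per_particle = k * T / P         # V/N from the ideal gas law, m^3
    thermal = (2 * math.pi * m * k * T / h**2) ** 1.5
    return R * (math.log(v_per_particle * thermal) + 2.5)

print(sackur_tetrode(39.948e-3, 298.15, 1.0e5))   # ~154.8 J/(mol K) for argon

    For polyatomic gases, liquids, and solids there is no comparably simple closed form, which is the gap the calorimetric measurements fill.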

  28. The following paper shows no mention of microstates, despite this, DNA_jock says entropy is deduced by professionals by “counting available microstates all the way; there’s no talk of dQ/T” .

    I’ve pointed out thermometers and calorimeters are the ways entropy is measured by chemists. Entropy is not measured by non-existent microstate counters or inaccurate microstate counting via idealized (and therefore inaccurate) math/physics models that DNA_jock swears by. Neither is it measured by “missing information” meters that Keiths needs to swear by.

    Chemists often measure entropy the old fashioned way (energy and temperature) and use

    dS = dQ/T

    or some variant thereof like:

    TΔS=ΔH−ΔG

    which is rooted in the Clausius definition of entropy

    dS = dQ/T

    as shown in the derivation here eq. 1:

    https://en.wikipedia.org/wiki/Gibbs_free_energy
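
    A worked illustration with made-up but typical numbers (not taken from the paper cited below): once calorimetry supplies ΔH and the equilibrium data supply ΔG at a known temperature, the entropy change drops straight out of TΔS=ΔH−ΔG.

T = 298.15           # K
delta_H = -40.0e3    # J/mol, hypothetical measured enthalpy change
delta_G = -25.0e3    # J/mol, hypothetical measured free-energy change

delta_S = (delta_H - delta_G) / T    # from T*delta_S = delta_H - delta_G
print(delta_S)                       # ~ -50 J/(mol K): entropically unfavorable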

    In any case, here is a paper by professional chemists that implicitly uses the definition of entropy in terms of the Clausius dQ/T.

    Thermodynamics of the hydrophobic effect. III. Condensation and aggregation of alkanes, alcohols, and alkylamines

    ….
    Experimental data are available for three thermodynamic parameters (ΔG, ΔH, and ΔCp) for all three processes, and the entropy can be calculated from TΔS=ΔH−ΔG
    ….
    Calorimetric enthalpies of aggregation reactions are usually determined by measuring the heat evolved upon dissolving liquid or solid compound and then extrapolating to zero concentration. Before the early calorimetric determinations [15] and [16] nearly all heats of solution were determined by van’t Hoff analysis of solubility dependence on temperature. However, there are no measurements of the reverse reaction, i.e. starting with the compound that is fully dissolved in water and ending with the aggregate. As explained above, we carried out such reaction calorimetrically beginning with soluble alkylammonium cations and ending with insoluble alkylamines, and directly determined the enthalpy of aggregation of long alkylamines. There are several advantages of using titration calorimetry over van’t Hoff analysis.
    http://www.sciencedirect.com/science/article/pii/S0301462201002095

    despite these facts

    DNA_jock said (as quoted by Walto):

    And once we start talking about scientists, then dQ/T is rarely informative

    To embarrass DNA_jock, there are more examples like the above paper from the biophysical, biochemical, and chemical literature.

    there’s no talk of dQ/T

    Uh, DNA_jock, maybe I need to connect the dots for you. The above paper mentions T for temperature as well as heat changes (dQ is implicitly inferred from the change in Gibbs free energy and change in enthalpy using calorimeters, etc.). That means the paper has data for dQ and T. It doesn’t measure entropy by “counting microstates all the way”.

    These dQ and T data can be plugged into the Gibbs free energy equation to derive the entropy.

    dS = dQ/T is embedded in the formulas (like TΔS=ΔH−ΔG) of the paper cited, but apparently you need to have this spoon fed to you since you make idiotic comments like this:

    dQ/T is rarely informative

    — DNA_jock

    ROTFL!

    On the contrary, dQ/T gives us insights into how the molecules may move and connect, since many times we can’t see them directly. We build models of how we think they connect and move, and these models give us counts of microstates. If the entropy computed from the microstates of the model matches up with the measured entropy using dQ/T, then our confidence in the structural details of the molecular system is supported.

    In any case, this has to be a classic dumb statement by DNA_jock:

    dQ/T is rarely informative

    — DNA_jock

    Hahaha! I’m going to put that one in my trophy case.

  29. stcordova: The following paper shows no mention of microstates, despite this, DNA_jock says entropy is deduced by professionals by “counting available microstates all the way; there’s no talk of dQ/T” .

    Oh goody. A quote mine.

    I’ll see yours and raise you one of my own:

    DNA_Jock:

    It’s all about degrees of freedom, and nary a mention of dQ/T…

  30. Sal’s strategy seems to be…

    1) to avoid addressing the reasons why entropy cannot be a measure of energy dispersal by…

    2) putting people (or pretending to put them) on “ignore”…

    3) while trying to pass off the Clausius quantity dQ/T as “energy dispersal”, when it clearly isn’t, and…

    4) overlooking the fact that the Clausius equation simply doesn’t work in many cases, such as the gas-mixing cases we’ve been discussing throughout the thread.

    Not very impressive. Where did he get the idea that he was competent to explain this stuff to his fellow creationists and IDers?

  31. stcordova: I’ve pointed out thermometers and calorimeters are the ways entropy is measured by chemists.

    And yet:

    Mung: …but unfortunately there are no ‘meters’ that measure energy or entropy.

    Molecular Driving Forces p. 109

  32. keiths: Where did he get the idea that he was competent to explain this stuff to his fellow creationists and IDers?

    Better yet, to correct them. To insist that they are wrong and he is right.

    But, he does have a spreadsheet. Do you have a spreadsheet?

  33. stcordova: On the contrary, dQ/T gives us insights into how the molecules may move and connect, since many times we can’t see them directly. We build models of how we think they connect and move, and these models give us counts of microstates. If the entropy computed from the microstates of the model matches up with the measured entropy using dQ/T, then our confidence in the structural details of the molecular system is supported.

    Yup.
    You keep making my point for me. Your continued failure to recognize the difference between a measurement technique and our understanding of the underlying system is noted.
    If only you took your fingers out of your ears long enough to read my posts, you might, just might, stop shooting yourself in the foot.

    A point of interest for colewd: Notice how Lambert defines the entropy of a “substance” to be zero at 0K,

    At absolute zero, 0 K, all substances are assumed to have zero entropy.

    which carefully avoids the problem (for his energy dispersal approach) that the entropy of a mixture at 0K is not zero.
    Hope this helps.

  34. It’s also worth reiterating that Sal’s own example — of a single helium molecule in a container — shows that entropy is not a measure of energy dispersal.

    The energy of the system is concentrated in a single molecule. Change the size of the container and the entropy has changed. What about energy dispersal? Exactly the same as before; all the energy is concentrated in one molecule.

    Entropy changed, but energy dispersal did not. Entropy is not a measure of energy dispersal.
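
    In the classical ideal-gas picture, the volume dependence described above is just ΔS = N k ln(V2/V1) at fixed temperature (a sketch, treating the single helium molecule as one classical particle whose positional microstates scale with the container volume):

import math

k = 1.380649e-23    # Boltzmann's constant, J/K

def delta_S_volume(V1, V2, N=1):
    """Entropy change when the accessible volume changes from V1 to V2
    for N ideal-gas particles at fixed temperature: N * k * ln(V2/V1)."""
    return N * k * math.log(V2 / V1)

print(delta_S_volume(1.0, 2.0))   # doubling the box adds k*ln(2), ~9.6e-24 J/K
# The particle's kinetic energy is unchanged; only the number of accessible
# position microstates has grown.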

  35. Unlike keiths, I don’t think it makes any sense to speak of the change in entropy of a thermodynamic system composed of a single molecule. But that is what happens when you conflate the Shannon measure of information (SMI) with thermodynamic entropy.

  36. Mung,

    Unlike keiths, I don’t think it makes any sense to speak of the change in entropy of a thermodynamic system composed of a single molecule.

    Why not?

    If one molecule isn’t enough, how many are required, and why?

  37. From the DNA_Jock school of thermodynamics:

    dQ/T is rarely informative

    — DNA_Jock

    🙂 🙂 🙂 🙂 🙂

  38. keiths:
    Mung,

    Why not?

    If one molecule isn’t enough, how many are required, and why?

    Statistical Thermodynamics, by Juraj Fedor (2013) says that one molecule IS enough:

    One of the keywords in statistical thermodynamics is energy that a molecule possesses. Imagine a single molecule. It can ‘carry’ energy via its translational motion, vibrations of individual atoms and the rotation of [the] molecule. Plus, the electrons within individual atoms also have certain energy. The energy of the molecule is sum of these four contributions. One would expect, that this sum can be any number. Surprisingly, it turns out that this is not true – quantum mechanics says that there are only certain energies possible.

    But there is more than one that is possible. Since, according to Fedor, entropy is a function of the number of microstates having the same specified amount of energy, it makes sense to talk about the entropy of a single molecule.

    I note that Fedor also says that

    There are [a] few exotic ways…one can interpret the entropy:

    * [A] larger entropy means that there are more possible microstates in which the system can be. That means with the larger entropy we have less detailed information about the system – we know only the macrostate (the total internal energy). Often, entropy is interpreted as a measure of our detailed information about the system (larger entropy = less detailed information).

    * Also, a more organized system has always lower entropy than a less organized system. Imagine the same amount of atoms in crystal and in liquid phase. There are certainly more possible configurations (and microstates) that will result in the same energy in the liquid phase. Thus, entropy can be viewed as a measure of the system’s disorder.

    These interpretations are useful in information theory where entropy is widely used. For us the most useful ‘interpretation’ is – entropy is k times logarithm of number of microstates that will result in a given internal energy.

  39. Walto:

    it makes sense to talk about the entropy of a single molecule.

    Yes, absolutely. You are a genius, Walto!

    2nd Law of Thermodynamics — an argument Creationists and ID Proponents should NOT use

    Sal, you are tilting at windmills.

    Forget about gases, take a single particle, in the classical approximation. “Counting its microstates” means counting the volume of phase space available to it. The phase space of a single particle is six-dimensional: it includes three coordinates and three momenta. So counting microstates is done with consideration of the particle’s physical location. By definition. There is no point in pinning Mike on this. This is one of the basic points people learn at the beginning of a stat mech course.

    One can argue whether one should include microstates corresponding to a fixed energy (microcanonical ensemble) or in contact with a thermal reservoir (canonical), but this would be beside the point.

  40. stcordova: I took graduate level statistical mechanics and thermodynamics from Johns Hopkins University Whiting School of Engineering in addition to undergraduate thermodynamics for mechanical engineers which I took as an elective as a student of electrical engineering.

    What were your textbooks?

  41. walto: …it makes sense to talk about the entropy of a single molecule.

    But keiths isn’t talking about the entropy of the molecule.

    keiths: Change the size of the container and the entropy has changed.

    The entropy of what?

  42. FWIW,

    I discussed single particle entropy here:

    In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

    As an addendum, it is harder to establish dQ/T for single-particle entropy unless one appeals to AVERAGE behavior over many trials; one thought experiment is too small a sample size, and one needs many hypothetical thought experiments to get the law of large numbers to work for the Clausius definition.

    Single-particle entropy works nicely, however, for the Boltzmann definition.

    You can also plug single particles into the Sackur-Tetrode quantum mechanical version of the Boltzmann equation for idealized monatomic gases.

  43. Mike Elzinga: The creationist tosses out a load of bullshit. The scientist refutes it and tries to explain the real science. The creationist ups the ante by jacking up to advanced topics. He throws out phony “disputes” among scientists. He tosses out equations and concepts that he doesn’t understand and then babbles gibberish about it.

    LoL.

  44. Mung: But keiths isn’t talking about the entropy of the molecule.

    keiths: Change the size of the container and the entropy has changed.

    The entropy of what?

    Oh OK. Should have stayed out of it. Sorry.

  45. walto: Oh OK. Should have stayed out of it. Sorry.

    No, you gave an interesting quote, and Sal chimed in which led to even more interesting quotes. It’s all good. 🙂
