In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue in his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (a small shoe size suggests we’re dealing with a toddler, and therefore a non-reader), but one is not the other. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger and Moran’s books aren’t on that list. Their books however did make the list of biochem books judged by some Australians as decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = boltzman’s constant
W = number of microstates

Also there is Clausius:

delta-S = Integral (dq_rev/T)

where
delta-S = change in entropy
dq_rev = inexact differential of q (heat), transferred reversibly
T = absolute temperature

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.
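As a concrete illustration, here is a minimal Python sketch (my own, with made-up numbers, not taken from any of the sources quoted above) that evaluates both definitions. Note that neither formula contains any term for “order” or “disorder”:

```python
import math

# Boltzmann entropy (as written by Planck): S = k log W
k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """Entropy of a system with W equally probable microstates."""
    return k_B * math.log(W)

# Clausius entropy change: delta-S = Integral(dq_rev/T).  For heat Q
# absorbed reversibly at a single fixed temperature T, the integral
# reduces to Q/T.
def clausius_entropy_change(Q, T):
    """Entropy change for reversible heat Q (joules) at constant T (kelvin)."""
    return Q / T

# Purely illustrative numbers:
print(boltzmann_entropy(1e23))             # ~ 7.3e-22 J/K
print(clausius_entropy_change(1000, 300))  # ~ 3.33 J/K
```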

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. keiths,

    I infer from this post that you don’t understand what the word ‘rationalization’ means. I guess it goes in the pile with ‘macrostate’, ‘know’, ‘possible’, ‘truth’, and ‘honesty.’

  2. Hi keiths,

    Why can’t 1-1 and 6-6 be ‘microstates’, along with 3-4 and 4-3?

    What if we had a single die with 36 faces, one of which had the number two on it and one of which had the number 12 on it, with 6 faces having the number 7 on them. Why can’t each face of such a die be a ‘microstate’?

  3. Mung: If Cp is a constant what is delta Cp?

    Ah, I love the Socratic method. I have been known to use it myself.
    Kirchhoff’s Law (which is a specific case of Hess’s Law, which is a restatement of the First Law of Thermodyanmics) states
    ∂ΔH/∂T = ΔCp

    where ΔH is the difference in enthalpy between the reactants and products (or, equivalently, the two phases for a phase transition)
    ΔCp is the difference in specific heat capacities between the reactants and the products.
    and T is the Temperature (look, it even starts with “T”)

    Now, Cp varies with temperature. (Look it up, Sal.)
    ΔCp may or may not vary significantly with temperature (I gave walto a hypothetical example where Cp for the reactants and products both vary with temperature, but their difference does not).
    Idiots seeking to apply Clausius might assume that ΔCp does not vary with temperature and, under this Clausian assumption, ΔH will vary linearly with T.
    Alternatively, ISTAC might get a mite more sophisticated, and do the Kirchhoff integral ∫ΔCp dT. This calculation acknowledges that ΔS may vary with temperature.
    Or ISTAC’s could concede that the theory is too tough for them to fathom, and be reduced to measuring dQ/T.
    Three options; three different results. The question is, WHY?

    Poor Sal still thinks that “Cp is a constant” implies ΔCp = zero.
    Both the premise and the logic are wrong. It’s a twofer!

    P.S. For the sardony-impaired, I am laughing with Mung…
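    For readers following along, here is a minimal sketch of the Kirchhoff relation described above, with hypothetical ΔH and ΔCp values chosen purely for illustration (it is not anyone’s actual calculation from this thread, and the function names are mine):

```python
# Kirchhoff's Law: d(ΔH)/dT = ΔCp.
# If ΔCp is constant, ΔH varies linearly with T; otherwise integrate ΔCp dT.

def delta_H_constant_dCp(dH_ref, dCp, T_ref, T):
    """ΔH(T) assuming a temperature-independent ΔCp (hypothetical values)."""
    return dH_ref + dCp * (T - T_ref)

def delta_H_variable_dCp(dH_ref, dCp_of_T, T_ref, T, steps=1000):
    """ΔH(T) by numerically integrating a temperature-dependent ΔCp."""
    dT = (T - T_ref) / steps
    total = dH_ref
    for i in range(steps):
        total += dCp_of_T(T_ref + (i + 0.5) * dT) * dT  # midpoint rule
    return total

# Hypothetical reaction: ΔH = -50 kJ/mol at 298 K, ΔCp = -20 J/(mol·K)
print(delta_H_constant_dCp(-50_000, -20, 298, 350))                       # -51040.0 J/mol
print(delta_H_variable_dCp(-50_000, lambda T: -20 + 0.05 * T, 298, 350))  # ~ -50198 J/mol
```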

  4. Mung,

    Why can’t 1-1 and 6-6 be ‘microstates’, along with 3-4 and 4-3?

    (1,1) and (6,6) are microstates, just as (3,4) and (4,3) are. Each has probability 1/36 when the dice are thrown fairly and nothing else is known about the result.

    What if we had a single die with 36 faces, one of which had the number two on it and one of which had the number 12 on it, with 6 faces having the number 7 on them. Why can’t each face of such a die be a ‘microstate’?

    They could be microstates.

    What’s tripping you up is that once you specify your microstates, you have to stick with them. If you adopt one set of microstates — (r,g) in our case, where r and g can both range from 1 to 6 — but then use an epistemic probability distribution that isn’t a distribution over those microstates, you’ll get the wrong answer.

    We agreed to designate the microstates in (r,g) style. Given that choice, there are 36 distinct equiprobable microstates when the two fair dice are thrown randomly and nothing else is known about the result. The entropy is therefore

    S = log2(36) ≈ 5.17 bits
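    A small sketch (my own), assuming fair dice and the (r, g) microstate convention above, that reproduces this number and also shows, for contrast, the Shannon entropy of the non-uniform distribution over sums (the 3.27-bit figure that comes up later in the thread):

```python
import math
from collections import Counter

# Microstates as (r, g) pairs: 36 equiprobable outcomes for two fair dice.
microstates = [(r, g) for r in range(1, 7) for g in range(1, 7)]
S_uniform = math.log2(len(microstates))   # log2(36) ≈ 5.17 bits

# For contrast: the Shannon entropy of the *sum* r + g, which is a
# different, non-uniform distribution over a different set of outcomes.
sum_counts = Counter(r + g for r, g in microstates)
S_sums = -sum((n / 36) * math.log2(n / 36) for n in sum_counts.values())

print(round(S_uniform, 2))  # 5.17
print(round(S_sums, 2))     # 3.27
```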

  5. Note:

    ΔH = m_constant T + b

    which, if we adjust b, becomes

    H = m_constant T + b

    DNA_Jock used sloppy prose by saying:

    ΔH varies linearly with T

    Strictly speaking, that makes little sense either; this is better:

    ΔH varies linearly with ΔT

    And that is more in line with the literature, like that which I cited.

    In any case, even without that improvement, option A is gobbledy gook.

    That’s an example why it’s next to worthless to try to follow DNA_Jock’s leading questions and confused statements.

    But if ΔH varies linearly with ΔT

    ΔH = m_constant ΔT + b_delta

    then

    H = m_constant T + b

    Cp = ∂H/∂T = m_constant

    which refutes DNA_jock’s point that Cp isn’t constant for option A, and also leads to the conclusion where I said

    dS/dT = Cp/T

    means entropy change in option A is DEPENDENT on temperature contrary to DNA_Jock saying entropy change was “temperature-independent.”

    Again, Option A and Option B weren’t in general the same thermodynamic system. Option A has the characteristics of a system with no chemical reaction, whereas Option B described a chemical reaction, since it invoked Kirchhoff’s thermochemistry.

    This wasn’t about getting different answers for entropy for the same system, it’s getting different answers for entropy for different systems. As in, “that’s what we’d expect Captain Obvious.”

  6. Sal keeps digging:

    DNA_Jock used sloppy prose by saying:

    ΔH varies linearly with T

    strictly speaking, that makes little sense either, this is better

    ΔH varies linearly with ΔT

    And that is more in line with the literature, like that which I cited.

    Whaaaat?
    What would ΔT be? The difference in temperature between the reactants and products?
    Sal, you aren’t making any sense whatsoever.
    This is basic math:
    ∂y/∂x = m
    IFF m is a constant, then we can say “y varies linearly with x”
    So, Kirchhoff’s Law states
    ∂ΔH/∂T = ΔCp
    hence
    IFF ΔCp (not Cp) is a constant, then we can say “ΔH (not H) varies linearly with T (not ΔT)”
    You still have not wrapped your brain around what the symbol Δ means: Δ denotes the difference (in free energy ΔG, enthalpy ΔH, or entropy ΔS) between the reactants and products. As in ΔG = ΔH – TΔS.
    High School chemistry.

  7. I noted earlier this relation:

    dS/dT = Cp/T

    Assume Cp is non-constant; if that is the case, in general

    Cp/T will not be a constant either!

    Thus in general

    dS/dT = not constant, contrary to this absurd Option A:

    A) assume the entropy change is temperature-independent

    Pathetic. DNA_Jock’s first phrase of his otherwise incoherent attempt at an inquisition, makes little sense.

    dS/dT = Cp/T

    Tell the reader, DNA_Jock: if Cp is not a constant, then how likely do you think it is that Cp/T will be a constant, and thus keep entropy change independent of temperature?

    DNA_Jock is wrong in general whether or not Cp is constant: in general Cp/T is not a constant, ergo dS/dT is not a constant, which means entropy change is not temperature-independent in general, contrary to DNA_Jock’s absurd hypothetical scenario:

    A) assume the entropy change is temperature-independent

    dS = entropy change

    dT = temperature change

    dS/dT = change in entropy with respect to change in temperature = not constant in general
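    Incidentally, here is a minimal sketch of the relation dS/dT = Cp/T in use, with a hypothetical constant heat capacity chosen only for illustration, showing that even a constant Cp gives an entropy change that depends on temperature:

```python
import math

def entropy_change_constant_cp(Cp, T1, T2):
    """ΔS = ∫(Cp/T) dT = Cp * ln(T2/T1) when Cp is constant over [T1, T2]."""
    return Cp * math.log(T2 / T1)

# Hypothetical system with Cp = 75 J/K heated from 280 K to 300 K:
print(round(entropy_change_constant_cp(75.0, 280.0, 300.0), 2))  # ≈ 5.17 J/K
```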

    Not a surprise coming from the DNA_Jock school of thermodynamics that teaches:

    dQ/T is rarely informative

  8. What would ΔT be? The difference in temperature between the reactants and products?

    Not if there are no reactants and products, and those were the examples I was talking about. Reactants and products had nothing to do with the examples I discussed.

    Also dS/dT is in general not constant.

    A) assume the entropy change is temperature-independent

    Entropy change is not temperature-independent in general. So I’m not going to assume it, neither should you.

  9. What would ΔT be? The difference in temperature between the reactants and products?

    I wasn’t talking about heat solely from chemical reaction. It won’t be so nice if substantial heat is crossing the system boundary that wasn’t part of the chemical reaction after the reaction takes place. The entropy change in that case won’t be independent of temperature in general.

    You’re equivocating my usage of ΔT.

    But, with all the discussion of heat and temperature, you’re implicitly defining entropy change in terms of dQ/T which is ironic given you said:

    dQ/T is rarely informative

  10. DNA_jock:

    You still have not wrapped your brain around what the symbol Δ means: Δ denotes the difference (in free energy ΔG, enthalpy ΔH, or entropy ΔS) between the reactants and products. As in ΔG = ΔH – TΔS.
    High School chemistry.

    False, I mentioned the Gibbs free energy several pages earlier, and the changes in Gibbs free energy as well as changes in Entropy and Enthalpy. I even posted

    TΔS = ΔH – ΔG

    In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

    as I was demonstrating the idiocy of your claim:

    dQ/T is rarely informative

    — DNA_jock

  11. colewd, walto,
    Let me know if there is anything you would like clarification on.
    Anything, that is, except for what Sal means by ΔT — you’ll have to ask him that.
    😉
    Cheers.

  12. except for what Sal means by ΔT

    I was talking about a melting ice cube and the enthalpy of fusion. At melting temperature ΔT = 0.

    DNA_Jock says “cool calculation” and then interjects the enthalpies associated with a chemical reaction, which don’t apply to ice, since ice doesn’t undergo a chemical change at 273.15K, nor does it undergo a chemical change at 250K. Ice doesn’t have an entropy change that is independent of temperature, and it is pointless to go into Kirchhoff’s laws when I was posing an entropy-change question to keiths about melting ice.

    Not surprising that DNA_Jock wants to treat the melting of ice like a chemical reaction, nor the heating of ice at 250K like a chemical reaction since he’s the same guy who says:

    dQ/T is rarely informative

    And then, when I demand that DNA_Jock solve the simple question of the entropy change of melting ice without resorting to dQ/T, but instead with keiths’ “missing information” approach, he tries to describe the melting of ice in terms of “products” and “reactants”. Pointless, and it still didn’t answer the question posed: compute the entropy change of a melting ice cube by the missing-information method, without first resorting to equations like dQ/T that have no regard for the missing-information concept.

  13. stcordova: Entropy change is not temperature-independent in general. So I’m not going to assume it, neither should you.

    Boy oh boy. You can lead a horse to water … Or so they say.

    What if there are exceptions to what is “generally” the case?

  14. The present set of comments dealt with the simple question of how someone will calculate the entropy change of a 20 gram melting ice cube. After I went to some length to do the calculation in terms of the enthalpy of fusion, I got the answer to be:

    24.42 J/K
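    For reference, here is that Clausius-style calculation spelled out as a short sketch, using the enthalpy-of-fusion and melting-point figures already quoted in this thread:

```python
# Entropy change of 20 g of ice melting reversibly at its melting point,
# using the Clausius form ΔS = Q/T and the figures quoted in the thread.
mass_g = 20.0
enthalpy_fusion = 333.55   # J/g  (enthalpy of fusion of ice)
T_melt = 273.15            # K    (melting point at atmospheric pressure)

Q = mass_g * enthalpy_fusion   # ≈ 6671 J absorbed from the surroundings
delta_S = Q / T_melt           # ≈ 24.42 J/K

print(round(Q), round(delta_S, 2))
```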

    DNA_Jock then weighed in and tried to frame the melting of ice in terms of a chemical reaction. Ugh!

    Ice when it melts to water is H2O before, H2O during, and H2O after the process of melting. It’s absurd to model this process at 250K (as DNA_Jock tried to do) when the melting point of ice is 273.15K!

    I gave him too much credit when I humored his derailments thinking it would actually lead somewhere. The result is he made absurd statements regarding the change in enthalpy of ice as it was heated:

    A) assume the entropy change is temperature-independent, thus the ΔH will vary linearly with T.

    Which is absurd to apply to melting ice that is H2O before, during, and after the melting process, since DNA_Jock was talking about chemical reactions that involved reactants and products and didn’t even mention that this was what he was talking about for option A. No wonder it seemed ridiculous; it was ridiculous.

    Bottom line: neither he, Mung, nor keiths has shown they can calculate the change in entropy of a melting 20 gram ice cube using their methods that dismiss the dS = dQ/T of Clausius.

    DNA_Jock certainly won’t because ice melting to liquid water is H20 before, during and after the process of melting and shouldn’t be characterized as a chemical reaction that involves change in enthalpies ΔH associated with a chemical reaction.

    The ΔH in question is dispersal of energy from the surroundings to melt the ice, and that is different from the ΔH of a non-existent chemical reaction involving reactants and products and that are used as inputs to Kirchhoff’s laws. I gave DNA_Jock too much credit; I didn’t expect he would do something so pointless and confused. But that should not have surprised me, because he’s the same guy who said:

    [the Clausius version of entropy in terms of] dQ/T is rarely informative

    — DNA_Jock

    Now, just to be thorough, I googled to see if any scientists of any reputation ever said regarding thermodynamics, “dQ/T is rarely informative”.

    Guess what, the only real hit was at TSZ by DNA_Jock! Hahaha. He’ll have to cite his own quotes from now on, because he’s about the only person on the planet who says stuff like that.

    stcordova: The ΔH in question is dispersal of energy from the surroundings to melt the ice, and that is different from the ΔH of a non-existent chemical reaction involving reactants and products and that are used as inputs to Kirchhoff’s laws.

    Oh dear, Sal doesn’t understand the First Law of Thermondynamics.

  16. keiths: What’s tripping you up is that once you specify your microstates, you have to stick with them.

    I can’t change my mind?

    [2,3,4,5,6,7,8,9,10,11,12] where they are NOT all equally probable. Those symbols represent the available sums for a pair of dice. If those are the symbols emitted by the message source, why can’t I calculate the Shannon entropy of that source? What is it that is illegitimate about my use of a non-uniform distribution and my way of calculating the entropy of the source?

  17. Mung,

    I explained it already:

    If you adopt one set of microstates — (r,g) in our case, where r and g can both range from 1 to 6 — but then use an epistemic probability distribution that isn’t a distribution over those microstates, you’ll get the wrong answer.

    We agreed to designate the microstates in (r,g) style. Given that choice, there are 36 distinct equiprobable microstates when the two fair dice are thrown randomly and nothing else is known about the result. The entropy is therefore

    S = log2(36) ≈ 5.17 bits

    You used the wrong probability distribution because you were cribbing from a source that you didn’t understand. You saw the words “entropy” and “pair of dice” and assumed that they were trying to solve the same problem as you. They weren’t.

  18. keiths: You used the wrong probability distribution because you were cribbing from a source that you didn’t understand. You saw the words “entropy” and “pair of dice” and assumed that they were trying to solve the same problem as you. They weren’t.

    I changed the problem I was trying to solve. As a result, I chose a different probability distribution. I calculated the entropy defined for that distribution.

    I think I’ve told you repeatedly that I changed my mind about my initial answer.

    So given the new distribution I chose for the new problem I chose, why is my calculation of the entropy wrong?

    Let me put this another way. You’re trying to get a point across about missing information. Right? Why can’t you make your point using the distribution I chose? Does your reasoning only work for a discrete uniform distribution? Because it also ought to work for a non-uniform distribution. Don’t you think?

  19. DNA_Jock

    Oh dear, Sal doesn’t understand the First Law of Thermondynamics.

    False.

    Your graph deals with supercooled liquid water FREEZING to ice, not ice MELTING to water. See:
    http://www.phy.mtu.edu/~cantrell/agu2009-PosterSzedlakV5.pdf

    Supercooled liquid water is common in Earth’s atmosphere and if/when it
    freezes

    You need to learn some basics; this is like elementary school, bud. Melting isn’t the same thing as freezing. Hope you’ve figured that out by now.

    🙂 🙂 🙂 🙂

    The subject in question was MELTING ice, not freezing water (supercooled or otherwise).

    In the context of the discussion, I (and the chem exam and lab questions) was talking about normal (not superheated) ice, since the enthalpy of fusion was stated at around 333-334 J/g, and the most precise figure I found was:

    333.55 J/g

    Hey, DNA_jock, normal ice doesn’t melt at 250K if you haven’t figured that out yet. No point trying to answer the question of melting of ice at 250K since the melting point is 273.15K.

    Even though there might be freezing of supercooled water at 250K, there isn’t a corresponding melting situation at 250K.

    See:

    http://amrita.olabs.edu.in/?sub=73&brch=2&sim=30&cnt=1

    HAHAHA!

    Do us a favor DNA_Jock, learn the difference between melting and freezing before you post another comment.

    I knew something was idiotic about your comment; now you’ve confirmed it on many levels. But that shouldn’t surprise me.

    You said:

    dQ/T is rarely informative

  20. stcordova: Do us a favor DNA_Jock, learn the difference between melting and freezing before you post another comment.

    Show of hands. Who here besides Salvador thinks DNA_Jock doesn’t know the difference between melting and freezing?

  21. Well, I’ll admit, Sal is right about one thing. Just the one. Ice won’t melt at 250K; it will, however, melt at 251.165K. Hey, I rounded a bit. As for the rest, he really is the gift that keeps on giving — let’s review the tape:

    I was trying to walk Sal through the implications (on how ΔH will vary with temperature) of Kirchhoff’s Law (viz: ∂ΔH/∂T = ΔCp). Sal seemed to be having difficulty with the meaning of the symbol Δ. He finally looked it up, and discovered that it’s called “Kirchhoff’s Law of Thermochemistry“, decided that it didn’t apply to a phase change, and crowed

    The ΔH in question is dispersal of energy from the surroundings to melt the ice, and that is different from the ΔH of a non-existent chemical reaction involving reactants and products and that are used as inputs to Kirchhoff’s laws.

    I found Sal’s claim that there were two different types of ΔH and that therefore Kirchhoff’s Law did not apply to phase changes a particularly delightful demonstration that he didn’t understand Hess’s Law / the First Law of Thermodynamics (1LoT), so I posted an image from Szedlak where they apply Kirchhoff to the freezing of ice, and I made the comment:

    Oh dear, Sal doesn’t understand the First Law of Thermondynamics.

    So what does Sal do in response? He demonstrates that he doesn’t understand the 1LoT by attempting to draw a distinction between the ΔH of melting and the ΔH of freezing!!
    Sal, the ΔH of freezing is identical to the ΔH of melting, just with the opposite sign. If you are studying one, you are studying the other. This is really, really basic 1LoT stuff.

  22. DNA_Jock: Well, I’ll admit, Sal is right about one thing. Just the one. Ice won’t melt at 250K; it will, however, melt at 251.165K.

    Stupid question. Does it depend on the altitude?

  23. DNA_jock:
    Just the one. Ice won’t melt at 250K; it will, however, melt at 251.165K.

    You’re equivocating what I was talking about, and even equivocating what you were talking about (supercooled water becoming things like snow).

    I wasn’t talking about ice III (that melts at 251.165K) but ordinary ice that melts at 273.15K or 0 degrees celcius(what most people mean when they talk about ice), which was pretty obvious from the context of the discussion.

    From wiki on ice:
    Virtually all the ice on Earth’s surface and in its atmosphere is of a hexagonal crystalline structure denoted as ice Ih (spoken as “ice one h”) with minute traces of cubic ice denoted as ice Ic. The most common phase transition to ice Ih occurs when liquid water is cooled below 0°C (273.15K, 32°F) at standard atmospheric pressure.

    And snow that forms from supercooled water won’t melt at ice III melting temperatures at atmospheric pressure, because it is ice Ih. So you’re not going to be able to use your enthalpies of fusing supercooled water into ice Ih as enthalpies of melting ice Ih, just as I said.

    You’re just equivocating ice III with the ice that was being talked about, and now your equivocating the ice h formation (formed at 101 kilo Pascals or less) in your graph with ice iii (formed at 350 mega pascals) . Not very straight up, imho.

    Like I said, you criticize arguments I didn’t make, conflate melting with freezing, and equivocate all sorts of concepts.

    But I will criticize an argument you did make:

    dQ/T is rarely informative

    –DNA_jock

    You’ve failed to defend that except by starting arguments about things I didn’t say nor intended to say.

    If you want to back up your claim, do so directly, like citing literature that says what you say:

    dQ/T is rarely informative

    –DNA_jock

    Equivocating and misrepresenting what I say, criticizing arguments I didn’t make (like conflating freezing of supercooled water with melting), conflating the ice iii with ice h, isn’t justification of what you said. It’s evidence you have no straight up arguments to back your claim:

    dQ/T is rarely informative

    –DNA_jock

    You’ve also yet to show computing entropy now a days “counting microstates all the way”. You’ve not done so for the 20grams of typical ice discussed. Instead you’ve wandered off into sidetracks.

    dQ/T is rarely informative

    –DNA_jock

    Why don’t you spend more time backing that claim up rather than criticizing arguments I didn’t make nor intended to make.

  24. Yikes, Sal, even when I agree with you, you still manage to shoot yourself in the foot.
    251.165K, 209.9 MPa is the triple point for liquid water, ice III and ice Ih — that’s “ordinary ice” in Sal’s quaint lexicon. So “ordinary ice” can melt at 251.165K. And below 1 atm, the melting point of “ordinary ice” increases slightly — getting all the way up to 273.16K.
    In your 20g of ice challenge, you never specified the pressure, so my question was entirely on point. Your problem is that you are perennially vague in your use of language.
    In any case, your failure to address the points where we disagree is noted, in particular
    ΔHfusion = – ΔHsolidification
    LOL

    “Equivocating”, you keep using that word. I do not think that it means what you think it means.

  25. P.S. Sal,
    As others have noted, your “dQ/T is rarely informative” quote is a quotemine.
    (Unintentional, I am sure! 🙂 )
    What I actually wrote was:

    DNA_Jock: November 1, 2016 at 6:34 pm

    colewd: [ Sal says:] Professionals who work with entropy ( engineers and chemists) use the Claussius [sic] version. The mistake I see in this thread is the insistence there is only one way to define entropy correctly. There is not.

    I am interested to see if others agree with this. The multiple definition issue seems to be the cause of this lively discussion.

    This seems to be yet another one of Sal’s mindless assertions, based on his apparent ignorance of anything beyond High School physics and chemistry concepts. Clausius / Carnot is used to introduce high school students to entropy using dQ/T to discuss the theoretical efficiency of steam engines (well, ideal gases in pistons), and to explain to teenage chemistry students why endothermic reactions may be favored.

    Professional chemical engineers and mechanical engineers are more likely to be concerned with the economic efficiency of a process rather than the thermodynamic efficiency (e.g. ammonia is synthesized at high temperature: the equilibrium position is worse, but you get there much quicker) so activation energy is what drives the choice here. Engineers who design heat exchangers care about rates of heat transfer, not the entropy.
    And once we start talking about scientists, then dQ/T is rarely informative: chemists talk (sometimes somewhat loosely) about ‘freezing out’ degrees of freedom (chelate effect) to make binding less entropically disfavored, or the benefits of a conformationally constrained ligand. That’s counting available microstates all the way; there’s no talk of dQ/T…
    I could ask my physicist and materials scientist buddies what they use, but I somehow doubt that would change anyone’s opinions…

    There is only one way to define entropy correctly: minus rho log(rho).

    We were discussing how scientists think and talk about entropy, not the equipment they use to try and measure it.
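    For readers wondering how “minus rho log(rho)” connects to the formulas quoted in the original post, here is a small, purely illustrative sketch: when all W microstates are equally probable, the Gibbs form S = -k Σ p_i ln p_i reduces to the Boltzmann form S = k ln W.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def gibbs_entropy(probabilities):
    """S = -k * sum(p_i * ln p_i) over a probability distribution of microstates."""
    return -k_B * sum(p * math.log(p) for p in probabilities if p > 0)

W = 1000
uniform = [1.0 / W] * W
print(gibbs_entropy(uniform))   # equals k_B * ln(W) ...
print(k_B * math.log(W))        # ... to floating-point precision
```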

  26. keiths:

    You used the wrong probability distribution because you were cribbing from a source that you didn’t understand. You saw the words “entropy” and “pair of dice” and assumed that they were trying to solve the same problem as you. They weren’t.

    Mung:

    I changed the problem I was trying to solve.

    No, you didn’t. You were trying to calculate the entropy of a random dice throw, in which there were 36 equiprobable microstates. You initially got the correct answer:

    S = log2(36) ≈ 5.17 bits

    Then you got confused by the source you were cribbing from and applied the wrong, non-uniform probability distribution to the problem. You then got an incorrect answer of 3.27 bits.

    So given the new distribution I chose for the new problem I chose, why is my calculation of the entropy wrong?

    It wasn’t a new problem. It was the same problem as before, and you foolishly retracted your correct answer and replaced it with a wrong answer.

    Let me put this another way. You’re trying to get a point across about missing information. Right? Why can’t you make your point using the distribution I chose? Does your reasoning only work for a discrete uniform distribution? Because it also ought to work for a non-uniform distribution. Don’t you think?

    The missing information view works just fine with non-uniform distributions. Your mistake was to apply a non-uniform distribution over sums to a problem in which a uniform distribution over microstates was required.

    If you want to understand entropy, you need to understand macrostates, microstates, epistemic probability distributions, and the roles that all of these play in entropy calculations.

    Don’t just crib, Mung. Comprehend.

  27. keiths: You were trying to calculate the entropy of a random dice throw, in which there were 36 equiprobable microstates. You initially got the correct answer:

    S = log2(36) ≈ 5.17 bits

    For a die with 36 sides which can land on any of the 36 faces with equal probability that would still be my answer.

    Now you can proceed with your argument, because surely it does not depend on there being two dice.

    This way we can argue over my change of mind in the case of two dice without it getting in the way of the point you are trying to make. Right?

  28. In your 20g of ice challenge, you never specified the pressure, so my question was entirely on point. Your problem is that you are perennially vague in your use of language.

    I gave the melting temperature and the enthalpy of fusion, so that was hardly vague, and those numbers were pulled from a typical college exam so it’s pretty clear we’re talking typical conditions (aka like where Wikipedia lists the melting temperature at 273.15K and the enthalpy of fusion at 333.55 J/g)!

    And once we start talking about scientists, then dQ/T is rarely informative:

    Lambert is a scientist, and so are many others like Dr. Mike, who described entropy in terms of dS = dQ/T as well as S = kB ln W, not to the exclusion of one or the other. So even your spin on your stupid statement is wrong. If that is your explanation of your intended meaning, then your stupid statement is still a stupid statement.

    And once we start talking about scientists, then dQ/T is rarely informative:

    dQ/T is more than measuring; it is a macro thermodynamic description of entropy. For example, one can solve the mixing-entropy problem of ideal gases without direct measurement, using dQ/T as a macro thermodynamic theory.

    We were discussing how scientists think and talk about entropy, not the equipment they use to try and measure it.

    dS = dQ/T is more than just a measurement of entropy; it is a macro thermodynamic way of conceiving of entropy. The Boltzmann description is microscopic.

    We can connect the macro theory (Clausius) with the micro theory (Boltzmann) this way:

    ΔS = Integral (dQ/T) = kB (ln W_final – ln W_initial)

    Therefore dQ/T is informative, and not rarely either as demonstrated by the papers I cited.
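    A short, purely illustrative sketch of that macro-to-micro connection, using the 20 gram ice-cube figures discussed earlier in the thread:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

# Macro (Clausius) side: ΔS = Q/T for 20 g of ice melting at 273.15 K.
delta_S = (20.0 * 333.55) / 273.15   # ≈ 24.42 J/K

# Micro (Boltzmann) side: ΔS = k_B * (ln W_final - ln W_initial),
# so ΔS / k_B gives ln(W_final / W_initial).
ln_W_ratio = delta_S / k_B

print(round(delta_S, 2))     # ≈ 24.42 J/K
print(f"{ln_W_ratio:.3e}")   # ≈ 1.769e+24, an enormous increase in microstate count
```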

    Then you went on and said entropy wasn’t important to engineers designing heat exchangers. You got called on that. No retraction.

    Then I posed simple examples involving melting ice (typical ice) and the enthalpy of melting of this ice at 273.15K, and you conflated that enthalpy with the enthalpy of FREEZING supercooled water at 250K. You got called on the equivocation when I said ice (Ih) doesn’t melt at 250K, and you then equivocated with the ice III melting point when we (even you) were talking about Ih ice. Not straight up at all.

    Then you talk about entropy being “counting microstates all the way” among scientists. So I invited you to count the microstates of the 20 grams of melting ice; instead you equivocated. No wonder all your Options were absurd: you started talking about freezing and dealing with supercooled water, which give different enthalpies, because it is a totally different system than the one I said you should solve.

    So you didn’t solve the simple system of a 20 gram melting ice cube. I gave as much detail as was in the college exam and even provided a link to the college exam. It should be expected you’d answer the question according to a instructors reasonable expectation about how students would interpret we were talking about ice melting at 273K (rounded) as well as having the enthalpy value provided in the exam and elsewhere, not the values you just pull that describe a different scenario.

    You described another system and equivocated the enthalpy numbers of your scenario as somehow appropriate to the scenario I described; you then blatantly equivocated ice III with ice Ih.

    https://en.wikipedia.org/wiki/Equivocation

    For the reader’s benefit this is equivocation:

    Equivocation (“to call by the same name”) is an informal logical fallacy. It is the misleading use of a term with more than one meaning or sense (by glossing over which meaning is intended at a particular time). It generally occurs with polysemic words (words with multiple meanings).

    Example:
    A feather is light. What is light cannot be dark. Therefore, a feather cannot be dark.

    How DNA_Jock equivocated:

    “Enthalpy of fusion” for melting ice: I was referring to typical normal Ih ice at 273.15K.

    DNA_Jock equivocated with “Enthalpy of fusion” of freezing supercooled water well below 273.15K at 250K. No wonder his reasoning was absurd because it was a logical fallacy. You shrugged it off with no apology to the readers or me nor an acknowledgement after being called on it.

    Instead, I said the melting point of ice is 273.15K, clearly referring to Ih ice, which would be the case for the ice cube and even for the snowflakes made from supercooled water.

    In the case of snow flakes this is a special case where enthalpy of freezing is different than enthalpy of melting. Then DNA_Jock equivocates ice III, which has a melting temperature of 251.165K, as if it is the same ice I was talking about (and, for that matter, even the snowflake ice that comes from supercooled water).

    You’ve not defended your point, you’re only spinning your stupid statement after getting called on it, and now you’re adding to that equivocations of a special case of freezing enthalpy of supercooled water with melting enthalpy of typical ice and then ice at highly atypical conditions even after it was evident I (and the exam question which I took the example from) was talking typical conditions (like melting of ice at 273.15K).

    Here’s your stupid statement again. Spin all you want to, but until you retract it, it’s yours.

    dQ/T is rarely informative

    –DNA_jock

  29. stcordova: In the case of snow flakes this is a special case where enthalpy of freezing is different than enthalpy of melting.

    Holy crap! If you’re right about this Sal, you should start packing your bags for the trip to Stockholm. You’ve discovered the basis for a perpetual motion machine!
    Alternatively, you don’t understand Hess’s Law.

    🙂

  30. Mung,

    The mistake you made is getting in the way of your understanding. Don’t be a Sal. Don’t try to brush it aside.

    You chose to recognize each ordered pair (r,g), with r and g independently ranging from 1 to 6, as a distinct microstate. Given that choice, a pair of fair dice, a random throw, and no feedback regarding the result, the appropriate probability distribution for calculating the entropy is a uniform distribution over the 36 microstates. Period. You applied the wrong distribution, so you got the wrong answer.

    If you don’t get this, you’re never going to understand entropy. To calculate the entropy correctly, the distribution must be a distribution over the microstates.

    If instead you try to solve a different problem, in which you define the microstate as the sum of r and g, then you have a different set of microstates and a different, non-uniform probability distribution is appropriate.

    If you pick one set of microstates but then apply the wrong probability distribution, you’ll get the wrong answer.

    Let that sink in. It’s important.

  31. keiths, you are simply not listening. If you aren’t going to listen there’s not much point in continuing, even though I’ve been trying to find a way to grant you your 36 microstates.

    What do you find offensive about a die with 36 sides? Why do you need two dice rather than one?

    Plus, I don’t think you’ve made the case that you’re talking about something that’s analogous to microstates, which is why I had started putting ‘microstate’ in quotes. It would seem to me that you are counting on every microstate being equally probable.

    I’m not being a Sal, I’m not mocking you with inane comments and challenges. I just disagree with you, or perhaps just have not come to agree with you, and yet I am still trying to find a way to move the discussion forward in spite of that disagreement.

    If you won’t give me my 36-sided die, perhaps come up with a different example? You can always color the different faces if you like. Or toss 18 coins.

    Or come up with different terms to use. See the thread Tom started.

    Here’s what I see happening. You have two dice and all the various ways they can be considered to be combined, how they taste, what color they are, how many dots are on a given face, and all these are simply some form of subset of the set of all the “microstates.” But it still depends on your choice of how to order or arrange them, and if that’s the case then my way of ordering them (by the sum) is just as good as your way of ordering them (whatever it is).

    Yet you keep claiming my way is wrong and your way is right. In spite of your claims about Yolanda and Xavier. A contradiction you have yet to resolve.

  32. keiths: Do you understand why the concept of entropy is valuable?

    It’s rare? The stock price is sure to go up? It’s better than the God concept?

  33. You know keiths, throughout most of this thread we’ve managed to get along quite well. I don’t know if that’s because at some fundamental level we agree, or if it’s because we have a common enemy, or what. But I’d like to see us continue this way of getting along. I find it’s far more companionable and productive. It sets a good example, imo.

    So let’s try to find a way to work things out. A way to continue dialogue even if we don’t agree on everything. How about it?

  34. Mung,

    I agree with people when I think they’re right, and I disagree with them when I think they’re wrong. You’re no exception.

    You made an obvious mistake, which I have explained to you. I am not going to pretend that it wasn’t a mistake. You picked the wrong probability distribution to go with your microstates, so you calculated an incorrect entropy value.

    If you are willing to learn from your mistake, I am willing to help you. If not, you are free to remain frustrated and wrong. It’s your choice.

    Let me repeat:

    If you pick one set of microstates but then apply the wrong probability distribution, you’ll get the wrong answer.

    Let that sink in. It’s important.

  35. Mung:

    Here’s what I see happening. You have two dice and all the various ways they can be considered to be combined, how they taste, what color they are, how many dots are on a given face, and all these are simply some form of subset of the set of all the “microstates.”

    No, you have to define how a microstate is specified. Once you’ve done so, then the space of possible microstates is determined. We agreed on an (r, g) specification for the microstate of a pair of thrown dice, with r being the number on the red die and g the number on the green.

    Given that specification, the space of possible microstates includes all 36 combinations of r and g, where each variable ranges independently from 1 through 6. For a pair of fair dice thrown randomly, with no feedback about the result, those 36 microstates are epistemically equiprobable. To calculate the entropy correctly under those circumstances, you must use a uniform distribution in which the microstates are equiprobable. Instead, you chose the wrong distribution — a non-uniform distribution over the possible sums, instead of a uniform distribution over the possible microstates — and got the wrong answer.

    To make progress, you’ll need to accept your mistake and learn from it.

    But it still depends on your choice of how to order or arrange them, and if that’s the case then my way of ordering them (by the sum) is just as good as your way of ordering them (whatever it is).

    You can choose different ways of specifying the microstate. Once you’ve made your choice, however, you must use an epistemic probability distribution over those microstates and not over a different set.

    Yet you keep claiming my way is wrong and your way is right.

    Correct. You screwed up by choosing the wrong probability distribution to go with your (r, g) microstates.

    In spite of your claims about Yolanda and Xavier. A contradiction you have yet to resolve.

    As I keep explaining, the fact that entropy is observer-dependent does not mean that it’s impossible to miscalculate it.

    Seriously, Mung, isn’t that obvious?

  36. keiths: No, you have to define how a microstate is specified.

    Ok, so if I decide to define a ‘microstate’ in terms of the sum of the two faces I can do that? Because the message that I’ve been hearing is that I am not allowed to define how a microstate is specified.

  37. Mung,

    Ok, so if I decide to define a ‘microstate’ in terms of the sum of the two faces I can do that?

    Sure. Just make sure the probability distribution you choose is appropriate for that microstate specification.

    Because the message that I’ve been hearing is that I am not allowed to define how a microstate is specified.

    You haven’t been hearing that from me.

  38. keiths: We agreed on an (r, g) specification for the microstate of a pair of thrown dice, with r being the number on the red die and g the number on the green.

    I don’t recall agreeing to anything involving two different colors of dice, because in my experience with the game of craps in casinos the dice all have the same color.

    Is this distinction in color necessary for your argument?

    If the sum is 12 does it really matter which die is green and which die is red?

    Green is not a subset of the set of possible microstates, and neither is red. Or if they are, would you kindly explain how so?

  39. keiths: Sure. Just make sure the probability distribution you choose is appropriate for that microstate specification.

    ok, thank you.

    How many probability distributions are appropriate for that microstate specification? In what way was the distribution I chose not appropriate for that microstate specification?

  40. keiths: You can choose different ways of specifying the microstate. Once you’ve made your choice, however, you must use an epistemic probability distribution over those microstates and not over a different set.

    ok, thank you.

    I chose a different way of specifying the microstate (as the sum of the dots on the two faces). The probability distribution I used was over those microstates.

    If not, why not?

  41. keiths: I’ve explained your mistake again and again. Please reread my comments.

    It might help if you actually responded to the specific questions asked.

    Start Here:

    Mung: Ok, so if I decide to define a ‘microstate’ in terms of the sum of the two faces I can do that?

    keiths: Sure. Just make sure the probability distribution you choose is appropriate for that microstate specification.

    Q1: How many probability distributions are appropriate for that microstate specification?

    Q2: What are the probability distributions that are appropriate for that microstate specification?

  42. In a thermodynamics based on the first and second laws the only purpose of Clausius’s state variable entropy is to provide comparisons between two states.

    – Lemons, Don S. A Student’s Guide to Entropy

    Sorry Sal.

    The absolute entropy of a single state is much like the absolute energy of a single state. The first and second laws of thermodynamics sanction neither absolute energy nor absolute entropy but assign meaning only to energy and entropy differences.

    – Lemons, Don S. A Student’s Guide to Entropy

    Sorry Sal.

  43. DNA_jock:

    Holy crap! If you’re right about this Sal, you should start packing your bags for the trip to Stockholm.

    The diagram you provided shows different latent heats of fusion, depending on temperature, when the supercooled liquid water turns to ice, ΔH = Lf. You botched understanding the paper you yourself provided, where it says:

    For a melting/freezing transition, Δh = Lf.

    and from wiki on latent heat of fusion:

    The enthalpy of fusion of a substance, also known as (latent) heat of fusion,

    So depending on how one defines enthalpy of fusion in the case of supercooled water, I was right, but it won’t earn a Nobel prize since it is posted in the paper you reference. Do you even bother trying to understand the papers you cite? 🙄

    But that freezing process won’t be reversed to a melting process if you try to now melt the ice by adding the same amount of latent heat removed during the freezing process at 250K (or something below 273.15K) since you have to take the ice up to 273.15K to melt it at atmospheric pressure.

    So liquid water could supercool to 250K and then turn into a snowflake (or whatever piece of ice), but the snowflake, if it melts, will have to go to 273.15K at atmospheric pressure in order to melt, and at that temperature the enthalpy of fusion when the ice melts is in general not the same as the latent heat of fusion of the supercooled liquid water as it freezes into ice (like a snowflake or whatever). It’s right there in the diagram! Freaking look at the diagram: Lf at about 273.15K, where all the lines converge, is about 6004 J/mol, which is 333.55 J/g. It certainly isn’t 333.55 J/g (6004 J/mol) at other temperatures, is it? Sheesh!

    But in any case you have an extremely lame excuse for saying I was vague about the 20 gram ice cube example.

    When I did the entropy-change calculation for the 20 gram ice cube, Q was in the numerator and T in the denominator, and Q was 333.55 J/g x 20 grams = 6671 J (or some rounded value thereof) and T was 273.15K (or some rounded value thereof). Not to mention, this was essentially in the Frasier University chemistry exam I got the example from, and also the numbers you’ll get when you google “melting point of ice” and when you visit the enthalpy of fusion wiki which I linked.

    And if I’m talking melting ice into non supercooled water, you should not be citing the enthalpy of fusion numbers for freezing supercooled water into ice. You equivocated. Maybe you made another goofy mistake, but the net result was an equivocation on your part.

    If I said the 20 gram ice cube melted at 273.15K with an enthalpy of fusion of 6671 J, then I was talking about a 20 gram ice cube that melted at 273.15K with an enthalpy of fusion of 6671 J :roll:, not freezing of supercooled water to ice at 250K at some other enthalpy of fusion, like that in the paper you cited, of about 4000 to 5000 J/mol :roll:.

    So you really have no excuse as to why you’re not able to calculate the entropy change in that 20 gram ice cube by either:

    1. computing the missing information
    2. counting the microstates

    unless of course you can’t actually do the calculations without first resorting to the macro thermodynamic definition of entropy defined through

    dS = dQ/T

    If you and Keiths and Mung can’t do the calculation or the counting, just say so instead of wasting everyone’s time with the equivocations you just got called on.

  44. keiths:

    I’ve explained your mistake again and again. Please reread my comments.

    Mung:

    It might help if you actually responded to the specific questions asked.

    It would help a lot more if you would admit your mistake and try to learn from it, rather than trying to wriggle out of it.

    The probability distribution you use for entropy calculations should accurately reflect whatever relevant information you possess about the microstate. It should correspond to the macrostate, in other words.

    See if you can figure out the correct epistemic probability distribution for each of the following macrostates, assuming fair dice and a random, blindfolded throw:

    I. For two dice, one red and one green, where the microstates are in (r, g) form:

    a) Your friend, an accurate and honest observer, tells you that r + g, the sum of the numbers on the two dice, is greater than eight.

    b) She tells you that the number on the green die, g, is one.

    c) She tells you that the sum is twelve.

    d) She tells you that r and g are the last two digits, in order, of the year in which the Japanese surrendered to the US at the end of WWII.

    e) She tells you that g is prime.

    f) She tells you that r raised to the g is exactly 32.

    g) She tells you that g minus r is exactly one.

    After you’ve answered those, we can move on to scenarios in which the microstates are defined as sums in r + g form.
