In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem with the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M. Lehninger Principles of Biochemistry (p. 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.

entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue with his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (a small shoe size suggests we’re dealing with a toddler, and therefore a non-reader), yet shoe size is not reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain how Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder:

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So, in slight defense of Granville Sewell and numerous other creationists, from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus, who equate entropy with disorder: when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors at reputable universities, and, unfortunately, one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger and Moran’s books aren’t on that list. Their books, however, did make a list of biochem books judged by some Australians as decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy, as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant (1.380649 × 10⁻²³ J/K)
W = number of microstates
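
In Planck’s formula, log is the natural logarithm. A minimal Python sketch (the microstate counts below are my own illustrative numbers, not anything from the post) shows the formula’s key property: multiplying W by some factor adds k ln(factor) to S, no matter how large W already is:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in J/K (exact under the 2019 SI definition)

def boltzmann_entropy(W: int) -> float:
    """S = k ln W, where W is the number of accessible microstates."""
    return K_B * math.log(W)

# Doubling the number of microstates adds k*ln(2) to S,
# regardless of the starting value of W.
delta = boltzmann_entropy(2_000_000) - boltzmann_entropy(1_000_000)
assert abs(delta - K_B * math.log(2)) < 1e-30
```

Note that nothing in the formula mentions order, disorder, or information; S depends only on a count of microstates.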

Also there is Clausius:

delta-S = Integral(dq_rev/T)

where
delta-S = change in entropy
dq_rev = inexact differential of q (heat), transferred along a reversible path
T = absolute temperature
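
For a concrete (hypothetical) example of the Clausius integral: if a substance is heated reversibly at constant heat capacity C, then dq = C dT, and the integral evaluates in closed form to delta-S = C ln(T2/T1). A short Python sketch checks a numerical integration against that closed form; the heat capacity and temperatures are illustrative assumptions (roughly one mole of liquid water), not values from the post:

```python
import math

def clausius_delta_S(C: float, T1: float, T2: float, steps: int = 100_000) -> float:
    """Numerically integrate delta-S = Integral(dq/T) with dq = C dT
    (reversible heating at constant heat capacity C), via the midpoint rule."""
    dT = (T2 - T1) / steps
    total = 0.0
    for i in range(steps):
        T_mid = T1 + (i + 0.5) * dT  # midpoint of each temperature slice
        total += C * dT / T_mid
    return total

# Illustrative numbers: C = 75.3 J/(mol·K), heated from 298.15 K to 348.15 K.
numeric = clausius_delta_S(75.3, 298.15, 348.15)
exact = 75.3 * math.log(348.15 / 298.15)  # closed form: C * ln(T2/T1)
assert abs(numeric - exact) < 1e-6
```

Again, the integral involves only heat and temperature; no term for order, disorder, or information appears.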

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.


1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. Err, Sal, you are still confusing ΔS with dS.

    According to Salvador Cordova, ∂ΔH/∂T is a second derivative!

    ∂ΔH/∂T = ΔCp = 0 (another way of saying the second derivative is 0)

    I reply

    No, Sal. You seem to have confused the symbols ∂ and Δ.
    …Second derivatives don’t enter into it: ΔH is NOT a derivative, it’s the difference in enthalpies between reactants and products…

    Sal doubles down:

    in the limit of small change, Δ is d (as in differential), and the partials (∂) derivative of a derivative is a second derivative, just as I said.

    Safe to say that none of those professors who admire Sal so much have ever reviewed his blatherings here.
    I’m still waiting for Sal to use strictly dQ/T to explain why a mixture has entropy at absolute zero. Heck, I’d give partial credit if he managed to calculate it (using dQ/T…).

    Strange that Sal never provides any context or a link to where I wrote “dQ/T is rarely informative “. See this page for a nice example of his behavior. 🙂
    It’s almost as if he has something to hide.

  2. DNA_jock:

    Err, Sal, you are still confusing ΔS with dS.

    Nope.

    ΔS = Integral(dS)

    Anyway, nice try. But I don’t say stupid stuff like this and insist it’s right:

    dQ/T is rarely informative

  3. ROFLMAO
    Now I understand how you got your famous nickname!
    Unfortunately, when you went back and edited your comment on this thread to remove your error, deleting the “=dQ” from this sentence

    then delta-H = delta-U = delta-Q = dQ (for simplicity of isothermal processes)

    so that you could then claim that you never made such an error, you could not go back and delete the same error from the original comment you made on the original thread.
    Do you remember when I wrote:

    You also have an unfortunate habit of transparently attempting to re-write history to paint yourself in a favorable light.

    Thank you for the prompt, and delightfully petulant, vindication.

  4. you could then claim that you never made such an error…

    I make lots of errors, but I admit them and clean them up.

    Unlike someone who says:

    dQ/T is rarely informative

    and gets called on it, and then tries to save face and pretend there is no error.

  5. keiths:
    Damn, Sal. What is wrong with you?

    So Keiths, do you agree with DNA_Jock:

    dQ/T is rarely informative

  6. DNA_Jock:

    Now I understand how you got your famous nickname!

    Would you and the other mods object if I changed that name (Slimy_Sal) to my handle here at TSZ instead of stcordova?

  7. Since that Salvador was talking about howlers, I looked and looked and found DNA_Jock’s comment. The one Salvador keeps “quoting.”

    As I suspected, Salvador failed to understand DNA_Jock’s point. Why did he fail? Because Salvador doesn’t understand that context is important. Why doesn’t Salvador understand that context is important? Because Salvador’s unable to follow context. By the time Salvador reads a sentence, he has already forgotten the previous one.

    Howler indeed. Only committed by Salvador who keeps shooting himself in the foot, across many comments for good measure, because Salvador is too eager to try and ridicule those who correct him, and too stupid to realize that he’s only ridiculing himself.

  8. Sal,

    I make lots of errors, but I admit them and clean them up.

    Did you happen to forget the “admit them” part this time?

    Sal, to Jock:

    Would you and the other mods object if I changed that name (Slimy_Sal) to my handle here at TSZ instead of stcordova?

    That would be redundant.

  9. Oh dear, it is worse than I thought. Far worse.
    After Sal copied this post from the “Evolution affirms the consequent” thread, he did a goodly dose of copy editing to the version on this thread, in order to hide his repeated schoolboy error confusing Δ (the finite difference between reactants and products) with d or ∂ (the infinitesimal change in any property)
    Things that Sal realized he needed to fix:
    “I mean, look at this paper that measure dQ of DNA ”
    “Btw, does DNA_jock know what a calorimeter does. Uh, it measures dQ”
    “Hcal = 6.7 kcal/mol (bp) = dQ”
    “then delta-H = delta-U = delta-Q = dQ (for simplicity of isothermal processes)”
    and my personal favorite, the painfully wrong:

    …348.65 deg K
    dS = dQ/T = 6.7 kcal/mol (bp) / 348.65 K = 19.23 cal/deg.mol (bp)

    which, in a flurry of ass-covering, got converted to the erudite

    …348.65 deg K
    For isothermal isobaric isovolumetric processes
    delta-S = Integral(dQ/T) =
    delta-Q/T = 6.7 kcal/mol (bp) / 348.65 K = 19.23 cal/deg.mol (bp)

    That was the only mention of integration in that post-hoc edited comment…
    The most interesting thing about Sal’s behavior is that this editing demonstrates that he recognized he had made an error.

  10. DNA_Jock

    that he recognized he had made an error.

    Of course I recognized it, which is more than I can say for what you do when you say:

    dQ/T is rarely informative

    And then given a chance to repent, you don’t.

    Would Keiths teach that crap to others if given the chance? Keiths, keiths, where are you?

  11. Keiths where are you?

    Would you feel comfortable saying this kind of crap:

    dQ/T is rarely informative

  12. Sal:

    Keiths where are you?

    I’m not going to rescue you, Sal. Enjoy your time in the hot seat.

  13. keiths:
    Sal:

    I’m not going to rescue you, Sal. Enjoy your time in the hot seat.

    Rescue me?

    I think you need to come to the defense of DNA_Jock, because he’s now spewing scientific abominations like:

    dQ/T is rarely informative

    You need to set him straight. You’re not going to let him get away with a falsehood just because he’s on your side, are you? That would make you look kind of biased…

    I mean, look at this from the NASA website on entropy. Do you see the equation I circled in green?

    https://www.grc.nasa.gov/www/k-12/airplane/entropy.html

    Where did DNA_Jock learn thermodynamics? Evergreen state. 🙂


  14. Sal,

    If you want to critique Jock’s comment, do so in context. Have the integrity to quote the whole thing.

    If you can come up with a valid criticism under those conditions, then I’ll consider it.

  15. keiths:
    Sal,

    If you want to critique Jock’s comment, do so in context. Have the integrity to quote the whole thing.

    If you can come up with a valid criticism under those conditions, then I’ll consider it.

    Well here is the context:

    http://theskepticalzone.com/wp/in-slight-defense-of-granville-sewell-a-lehninger-larry-moran-l-boltzmann/comment-page-26/#comment-148190

    …Engineers who design heat exchangers care about rates of heat transfer, not the entropy.
    And once we start talking about scientists, then dQ/T is rarely informative: chemists talk (sometimes somewhat loosely) about ‘freezing out’ degrees of freedom (chelate effect) to make binding less entropically disfavored, or the benefits of a conformationally constrained ligand. That’s counting available microstates all the way; there’s no talk of dQ/T…

    Ok, let’s start with this howler:

    …Engineers who design heat exchangers care about rates of heat transfer, not the entropy.

    Contrast with some things in peer review:
    https://www.sciencedirect.com/science/article/abs/pii/S0360544218310697

    Abstract
    Entropy generation analyses on the transient processes of heat exchangers can guide their designs and operations. A dynamic model of a typical recuperative heater is developed based on mass, energy, and momentum conservation equations. Dynamic behaviors of the heater during transient processes are analyzed based on the second law of thermodynamics. The real-time entropy generation rate due to the heat transfer between the work medium and metal surfaces, and the heat conduction in metals are presented and discussed. A cold fluid flow rate with 20% step increase is adopted as the boundary disturbance, and dynamic performances are obtained. Additional entropy is generated in the heater during the transient processes compared with stationary work conditions. Several design and operation factors of the heater are discussed. Calculation results show that the additional total entropy generation diminishes during the transient processes with the increase in the thermal diffusivity of metal. This rule is also suitable for the influence of thermal transfer resistance between the fluid and metal. By contrast, the metal thickness and specific heat capacity of hot work fluid have opposite influences on the total additional entropy generation of the heater during the transient processes.

    This is where DNA_“jock” just got taken:


  16. Sal,

    You did it again. You left out the rest of the paragraph containing the sentence in question. Here’s the full paragraph:

    Professional chemical engineers and mechanical engineers are more likely to be concerned with the economic efficiency of a process rather than the thermodynamic efficiency (e.g. ammonia is synthesized at high temperature: the equilibrium position is worse, but you get there much quicker) so activation energy is what drives the choice here. Engineers who design heat exchangers care about rates of heat transfer, not the entropy.

    In context, it’s clear that Jock means something like the following:

    Just as engineers are likely to care more about the economic efficiency of a process than its thermodynamic efficiency, they are also likely to care more about rates of heat transfer than entropy when designing heat exchangers.

    So no, I don’t think you’ve identified a “howler”.

