In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue as he criticizes ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder, just as shoe size is correlated with reading ability (a small shoe size suggests we’re dealing with a toddler, and therefore a non-reader) without being the same thing as reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So, in slight defense of Granville Sewell and numerous other creationists, from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus, who equate entropy with disorder: when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger and Moran’s books aren’t on that list. Their books, however, did make the list of biochem books judged by some Australians as decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates
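
(A minimal numerical sketch, assuming a toy “system” of two six-sided dice in which every (red, green) outcome is an equally likely microstate, just to show how the formula is applied once W has been counted:)

```python
# Boltzmann/Planck entropy S = k * ln(W), applied to a toy "system" of two
# six-sided dice, where every (red, green) outcome counts as a distinct,
# equally likely microstate. Illustrative only.
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """Entropy in J/K for W equally probable microstates."""
    return k_B * math.log(W)  # natural log, per the Planck form S = k log W

W = 6 * 6  # 36 distinguishable microstates for a red die plus a green die
print(boltzmann_entropy(W))  # ~4.95e-23 J/K -- tiny, because k_B is tiny
print(math.log2(W))          # the same count as missing information: ~5.17 bits
```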

Also there is Clausius:

delta-S = Integral (dq/T)

where
delta-S = change in entropy
dq = inexact differential of q (heat), transferred reversibly
T = absolute temperature
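
(And a quick numerical sketch of the Clausius form, assuming a simple reversible process at constant temperature so the integral collapses to Q/T:)

```python
# Clausius entropy change, delta-S = Integral(dq_rev / T).
# For a reversible process at constant temperature T, the integral is just Q/T.
def clausius_entropy_change(q_rev, T):
    """Entropy change in J/K for heat q_rev (J) absorbed reversibly at constant T (K)."""
    return q_rev / T

# Example: 1000 J of heat absorbed reversibly at 300 K
print(clausius_entropy_change(1000.0, 300.0))  # ~3.33 J/K
```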

As Dr. Mike asked rhetorically: “Where are the order/disorder and ‘information’ in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. keiths: To understand entropy, you need to understand macrostates.

    Ms. Haw left out the color of the felt and whether or not it was raining at the time.

    By what principle does she determine the allowable macrostates? Earlier you denied that your argument relied on the sums of the dice, and yet here you are relying on the sums of the dice.

  2. Mung:

    Ms. Haw left out the color of the felt and whether or not it was raining at the time.

    That’s right. She’s smarter than you.

    By what principle does she determine the allowable macrostates?

    Unlike you and walto, she understands the concept of macrostates and is able to distinguish relevant information from irrelevant information. Information relevant to a macrostate is information that affects the epistemic probability distribution over the microstates.

    Earlier you denied that your argument relied on the sums of the dice, and yet here you are relying on the sums of the dice.

    It went right over your head, evidently:

    Mung:

    In fact, all of your examples rely on summing the values shown on the dice. Why?

    keiths:

    No they don’t, and they don’t need to.

    Let r be the number on the red die and g the number on the green. The following are all legitimate macrostates:

    1. r and g are the last two digits, in order, of the year in which the Japanese surrendered to the US at the end of WWII.

    2. g is prime.

    3. r raised to the g is exactly 32.

    4. g minus r is exactly one.
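
    (A minimal sketch, assuming two fair six-sided dice with every (r, g) pair equally likely, counting how many of the 36 microstates fall under each of those example macrostates:)

    ```python
    # Enumerate the 36 equally likely (red, green) microstates and count how many
    # fall under each of the four example macrostates listed above.
    from itertools import product

    microstates = list(product(range(1, 7), repeat=2))  # (r, g) pairs

    macrostates = {
        "last two digits of 1945, in order": lambda r, g: (r, g) == (4, 5),
        "g is prime":                        lambda r, g: g in (2, 3, 5),
        "r raised to the g is exactly 32":   lambda r, g: r ** g == 32,
        "g minus r is exactly one":          lambda r, g: g - r == 1,
    }

    for label, test in macrostates.items():
        members = [(r, g) for r, g in microstates if test(r, g)]
        print(f"{label}: {len(members)} microstate(s)")
    # last two digits of 1945, in order: 1; g is prime: 18;
    # r raised to the g is exactly 32: 1; g minus r is exactly one: 5
    ```

    Note that these macrostates overlap: (4, 5), for instance, satisfies the first, second, and fourth at once.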

  3. walto,

    In the course of this thread:

    a) You disagreed with me regarding the energy dispersal view of entropy, and then had to reverse yourself.

    b) You disagreed with me regarding the missing information view of entropy, and then had to reverse yourself.

    c) You disagreed with me regarding the observer-dependence of entropy, and then had to reverse yourself.

    Now you are disagreeing with me over

    d) the observer-dependence of macrostates, and

    e) the misconception that there is “one correct macrostate”.

    You’re going to have to reverse yourself again — at least if you’re honest.

    Try to cultivate some self-awareness.

    You’re terrible at this stuff.

  4. “Greater than ten” is not a macrostate.
    Ten is a macrostate.
    Eleven is a macrostate.
    Twelve is a macrostate.
    Greater than ten includes “eleven” and “twelve”. And thirteen, and fourteen …

    So “greater than ten” is sort of a “superstate.” Another problem with “greater than ten” is that “eleven” doesn’t have the same probability as “twelve”, which rules them out as microstates of “greater than ten”, which in turn rules out “greater than ten” as a macrostate.
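
    (The underlying counts are easy to check; a minimal sketch, again assuming two fair six-sided dice:)

    ```python
    # Count how many of the 36 equally likely (red, green) microstates give each sum,
    # to check that "eleven" and "twelve" do indeed have different probabilities.
    from itertools import product
    from collections import Counter

    counts = Counter(r + g for r, g in product(range(1, 7), repeat=2))

    print(counts[10], counts[11], counts[12])   # 3, 2, 1 microstates respectively
    print(counts[11] + counts[12])              # 3 microstates with a sum greater than ten
    ```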

  5. Mung,

    You are one confused dude.

    “Greater than ten” is a macrostate. It is associated with an epistemic probability distribution and an ensemble of microstates, from which the entropy can be calculated.

  6. keiths, you are one confused dude!

    What happens in your system when you add up all the entropies of your alleged macrostates?

    macrostate n1 – “greater than one”
    macrostate n2 – “not three”
    macrostate n3 – “greater than one and not three”
    macrostate n4 – “more dots on the green die than on the red die”
    etc. etc.

  7. Mung,

    What happens in your system when you add up all the entropies of your alleged macrostates?

    Who cares?

    Are you cribbing from another source that you don’t understand?

  8. I don’t think I’ve ever seen a better argument for the utter arbitrariness and subjectivity of macrostates. They aren’t merely “observer-dependent”; they are entirely subjective.

  9. Nah. You’re just confused about them.

    Interestingly, you and Walto are at opposite extremes. He thinks there’s “one correct macrostate” for any particular system, while you think that macrostates are entirely subjective.

    You’re both wrong.

  10. keiths, by what theory of yours can two different macrostates share the same microstates? Put another way, by what theory of yours does a single microstate belong to two different macrostates?

  11. Mung,

    It’s possible to designate a set of non-overlapping macrostates for a system, but that does not mean that macrostates must be non-overlapping.

    In the dice entropy scenario we’ve been discussing, for instance, the macrostates of the three observers overlap. Each includes the actual microstate, after all.

  12. My good friend walto stands blindfolded at a craps table. He tosses the dice onto the table. I, an honest, accurate witness, tell him, “not an elephant.” Certainly, this is true.

    But how accurate is it?

    Ms. Haw may count a total of eleven dots and whisper “greater than ten,” but that’s being ambiguous, not accurate. Not only is “greater than ten” not accurate, it’s also not an observation.

  13. Mung,

    This is not that difficult.

    Because Ms. Haw is an accurate observer, she correctly perceives the microstate. Because she is honest, she makes true statements to me and to walto.

    And crucially, because she is intelligent, she understands the kind of information that is relevant to us and our epistemic probability distributions over the microstates.

    You, of course, lack her intelligence. Because of your limitations, you pass an irrelevant piece of information to walto. We fire you and engage Ms. Haw for all of our future witnessing needs.

  14. keiths: Because Ms. Haw is an accurate observer, she correctly perceives the microstate. Because she is honest, she makes true statements to me and to walto.

    You failed to explain why an honest accurate witness would be intentionally ambiguous, even to the point of telling you one thing and telling walto a different thing.

    My good friend walto stands blindfolded at a craps table. He tosses the dice onto the table. keiths, an honest, accurate blindfolded witness tells him, “less than thirteen!” Certainly this is true. But how accurate is it?

    Ms. Haw seems to be totally unnecessary.

  15. Mung,

    You failed to explain why an honest accurate witness would be intentionally ambiguous, even to the point of telling you one thing and telling walto a different thing.

    It doesn’t matter why she does it. The point is that she can do it while remaining accurate and honest.

    She accurately observes the microstate. She honestly tells walto that the sum is greater than eight. She honestly tells me that the sum is greater than ten.

    Why is this so difficult for you to grasp?

  16. …Boltzmann reformulated the concept of entropy and declared that a system’s entropy is directly proportional to the number of possible distinguishable ways its components can be arranged (or, more precisely, to the logarithm of this number).

    In a system of counting dots, four dots plus three dots is indistinguishable from three dots plus four dots, which is indistinguishable from five dots plus two dots; in all these cases the number of dots is seven. There was nothing wrong with the way I calculated the entropy. I simply chose to ignore which die had which number of dots.

    keiths simply failed to notice this important distinction.

  17. keiths: Why is this so difficult for you to grasp?

    I blame it on entropy. Missing information. You think entropy is calculated at the receiver and I think it’s calculated at the source.

  18. Mung: “…Boltzmann reformulated the concept of entropy and declared that a system’s entropy is directly proportional to the number of possible distinguishable ways its components can be arranged (or, more precisely, to the logarithm of this number).”

    In a system of counting dots, four dots plus three dots is indistinguishable from three dots plus four dots, which is indistinguishable from five dots plus two dots; in all these cases the number of dots is seven. There was nothing wrong with the way I calculated the entropy. I simply chose to ignore which die had which number of dots.

    To me, the words “possible distinguishable ways” look important in Boltzmann’s formulation. For example, when a big house is blocking the view of the tree behind it, the tree (and everything else behind the house) is not part of the calculations for an observer who has never seen behind the house, whereas things look totally different to someone who has peeked behind the house. Same system, different entropy to different observers.

    It all comes down to whether the system is static or dynamic, whether the observer should be omniscient, and what segments we should count as distinct entities and why. That’s why this discussion will never end, and next year some genius might formulate entropy yet another way.

  19. Mung,

    There was nothing wrong with the way I calculated the entropy. I simply chose to ignore which die had which number of dots.

    keiths simply failed to notice this important distinction.

    Fercrissakes, Mung. I’ve explained your mistake again and again. Learn from it and move on.

  20. keiths:

    She accurately observes the microstate. She honestly tells walto that the sum is greater than eight. She honestly tells me that the sum is greater than ten.

    Why is this so difficult for you to grasp?

    Mung:

    I blame it on entropy. Missing information. You think entropy is calculated at the receiver and I think it’s calculated at the source.

    There is no missing information at the source. The system is in exactly one microstate and no others.

    If entropy were calculated at the source, it would always be zero. That would make it a useless concept.

  21. Mung: You think entropy is calculated at the receiver and I think it’s calculated at the source.

    This can’t be right. One context where the concept of entropy comes in is deciphering. Entropy is calculated when deciphering, at the receiving end: https://xkcd.com/936/

  22. Erik: Entropy is calculated at deciphering, at the receiving end…

    🙂

    It depends on who is observing the message at the receiver, and whether they are honest and accurate when they fail to communicate their actual observation and instead communicate something they think is deducible from it, which makes them a source and the person to whom they communicate their “observation” the receiver.

  23. Mung,

    Yes, I say the same thing: it depends. And in his own pedantic way, keiths is also saying the same thing. The disagreements probably come from the fact that you all haven’t narrowed down the definitions closely enough.

  24. Mung,

    How do you explain the following?

    find the entropy of a source

    They’re talking about the entropy of a message source from the receiver’s viewpoint.

    The source already knows the message. The receiver doesn’t. It’s the receiver’s uncertainty that needs to be reduced, which is presumably why the message is being sent in the first place.

    Messages are analogous to microstates. The receiver computes the source entropy based on the receiver’s epistemic probability distribution over the possible messages, just as the observer computes the entropy based on the observer’s epistemic probability distribution over the possible microstates.
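
    (To make that concrete, a minimal sketch of the standard Shannon formula for the entropy of a source, computed from the receiver’s probability distribution over the possible messages; the distributions below are made-up examples:)

    ```python
    # Shannon entropy H = -sum(p * log2(p)) of a message source, computed from the
    # receiver's probability distribution over the possible messages.
    import math

    def source_entropy(probabilities):
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    # A receiver expecting one of four equally likely messages is missing 2 bits:
    print(source_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0

    # A receiver who already regards one message as near-certain is missing far less:
    print(source_entropy([0.97, 0.01, 0.01, 0.01]))   # ~0.24 bits
    ```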

  25. The computer gives signs of becoming the contemporary counterpart of the steam engine that brought on the industrial revolution.

    – Information: A Scientific American Book (1966)

    heh

  26. keiths: They’re talking about the entropy of a message source from the receiver’s viewpoint.

    The entropy of a message source. Finally, something we can agree upon. So I said “at” the source when I should have said “of” the source. That’s the pedantic point you wanted to make?

  27. Erik,

    The disagreements probably come from the fact that you all haven’t narrowed down the definitions closely enough.

    No, the problem is that walto, Mung, and Sal are all clinging to misconceptions about entropy. Walto thinks that there can only be one correct macrostate. Mung and Sal think that entropy is not observer-dependent.

  28. Erik: The disagreements probably come from the fact that you all haven’t narrowed down the definitions closely enough.

    You must be bored. 🙂

    Or have you been following this thread all along?

    Do you have any specific suggestions as to which definitions need to be narrowed down? Because I actually agree with you. I think keiths needs to define what he means by microstate and macrostate. And since neither keiths nor walto is an actual observer, he probably needs to nail down what he means by “observer-dependent.”

  29. keiths: Mung and Sal think that entropy is not observer-dependent.

    Actually, I am on the fence and looking for a reason to believe that “observer-dependent” isn’t synonymous with subjective.

    More specifically, I don’t think that the probability distributions associated with thermodynamics are observer-dependent. They are what they are.

    I’m unclear as to how “dice entropy” is supposed to clear that up for me.

    It absolutely appears to me that when Ms. Haw whispers one thing to walto and a different thing to keiths, it is most difficult to attribute her choices to something other than a subjective choice.

  30. keiths:

    They’re talking about the entropy of a message source from the receiver’s viewpoint.

    Mung:

    The entropy of a message source. Finally, something we can agree upon.

    The entropy of a message source from the receiver’s viewpoint.

    So I said “at” the source when I should have said “of” the source. That’s the pedantic point you wanted to make?

    No. The point is that when you claim that entropy is calculated at the source, you are completely wrong.

    Entropy is calculated from the observer’s (or receiver’s) viewpoint, and it’s a measure of missing information. The amount of missing information is observer-dependent, and therefore so is entropy.

  31. keiths: The entropy of a message source from the receiver’s viewpoint.

    There is a source. The source has an entropy. I don’t understand why this poses such a problem for you, given that it’s taken directly from your own statements.

    Now you’re supposed to say that the source has many entropies, and that the entropy of the source is determined by the receiver. There can be many receivers, and every receiver calculates the entropy of the source based on a different idea of what the source may produce. Right?

  32. Mung,

    I’ve answered you before, but my explanations went right over your head. I gave up on you once, and I’m close to giving up again. The concepts appear to be too difficult for you.

    I’ll give it another shot, but I need to see some evidence of progress on your part. Try harder.

    It absolutely appears to me that when Ms. Haw whispers one thing to walto and a different thing to keiths, it is most difficult to attribute her choices to something other than a subjective choice.

    The reason for her choice does not matter. What she knows about the microstate is objectively true. What she tells walto is objectively true. What she tells me is objectively true.

    Walto is missing the most information. I am missing less. Ms. Haw is missing none.

    Walto is able to narrow the microstate down to one of ten possibilities. I am able to narrow it down to one of three. Ms. Haw knows the exact microstate.

    For walto, the entropy is log2 (10) ≈ 3.32 bits. For me, it is log2 (3) ≈ 1.58 bits. For Ms. Haw, it is log2 (1) = 0 bits.

    You long ago accepted that entropy is a measure of missing information. Why are you unable to see that people can possess different amounts of information, so that the amount of missing information — entropy — can differ from person to person?

    Is it really news to you that not everyone possesses the same information about everything?
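
    (The three figures are easy to reproduce; a minimal sketch, assuming two fair six-sided dice and that the actual roll sums to eleven, as in the scenario above:)

    ```python
    # Count the (red, green) microstates consistent with what each person has been told,
    # then take log2 of that count (the microstates in each set being equally likely).
    import math
    from itertools import product

    microstates = list(product(range(1, 7), repeat=2))

    walto_set  = [m for m in microstates if sum(m) > 8]    # told "greater than eight"
    keiths_set = [m for m in microstates if sum(m) > 10]   # told "greater than ten"
    haw_set    = [(5, 6)]                                  # she sees the exact microstate

    for name, states in [("walto", walto_set), ("keiths", keiths_set), ("Ms. Haw", haw_set)]:
        print(name, len(states), "microstates,", round(math.log2(len(states)), 2), "bits")
    # walto: 10 microstates, 3.32 bits; keiths: 3 microstates, 1.58 bits; Ms. Haw: 1, 0.0 bits
    ```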

  33. It is fascinating to see how, as the theory of thermodynamics progressed, the focus of interest shifted from what it is possible for a system to do, to what it is possible for an observer to know about the system.

    – Campbell, Jeremy. Grammatical Man: Information, Entropy, Language, and Life.

  34. keiths: Is it really news to you that not everyone possesses the same information about everything?

    I’ve never met anyone who believes that there are elephants on the moon. Is this due to a lack of information?

    This just tells me that you don’t understand the calculation of entropy. Perhaps you don’t even understand the meaning of entropy.

    Shannon Information is not Colloquial Information. Shannon Information is not a measure of Colloquial Information. Do you want me to explain this to you, or do you agree that this is so?

    Wallace and Gromit have information about Cheese.

    It isn’t Shannon Information.

  35. keiths: The reason for her choice does not matter.

    Sure it matters. She had a reason for her choices or she did not have a reason for her choices.

    Assume she had a reason for her choices. Her reason was objective or her reason was subjective.

    Explain why she stated one thing to walto and another (different) thing to keiths if her choices were objective. One would expect, given an objective reporter, that both keiths and walto would be told the same thing.

    Assume she had no reason for her choices. Her choices are arbitrary. Yet you deny that her choices were arbitrary. Right?

  36. keiths: What she knows about the microstate is objectively true.

    Microstates change. So what she reports at one point in time may actually be false after she reports it.

    In your dice analogy, the shooter did not “seven out” and so rolls the dice again. So now your Ms. Haw needs to report something else to keiths and walto.

    How does she decide what to report?

  37. Any assignment of probabilities must reflect the uncertainty of the observer, and be “maximally vague” about that uncertainty.

    – Campbell, Jeremy. Grammatical Man: Information, Entropy, Language, and Life.

    Is “greater than ten” more vague than “greater than eight”? Why?

  38. keiths: Any normally-sighted person can discern the microstate in this case. All you have to do is read the numbers off the dice and distinguish red from green. Any person capable of doing so can be a Damon in the world of dice entropy, in other words. If I removed my blindfold I could do so as well, and so could walto, assuming that he is capable of reading dice numbers and distinguishing red from green.

    Exactly. Your argument is based upon an arbitrary distinction.

    I’ve never played a game of craps in which the color of one die was green and the color of the second die was red.

  39. Mung: Is “greater than ten” more vague than “greater than eight”? Why?

    You know how bigger numbers feel more vague because we cannot grasp them properly and numbers ending in zeroes attract the suspicion of being rounded. For some people, this vagueness probably starts with 10. There’s nothing vague from 1 to 9.

  40. It has become clear to me that first-year graduate students respond best to a molecular-level “explanation” of the classic laws, at least upon initial discussion. For them this imparts a more intuitive understanding of thermodynamic potentials and, in particular, entropy and the second law. Moreover, students’ most frequent hurdles are conceptual in nature, not mathematical …

    M. Scott Shell

  41. ΔH will vary linearly with T

    DNA_Jock

    That means ΔH would be a straight-line plot vs. T. It turns out that isn’t the case, according to the diagram DNA_Jock supplied.

    DNA_Jock needs to be able to judge a straight line from a non-straight line. He can print out the graph of ΔH which he himself supplied to this discussion, try to draw a straight line between the points for ΔH at 250K and ΔH at 273K, and see if there is a straight line, as would be the case if his claim were correct. There isn’t, as I demonstrated, but he doesn’t seem to comprehend how confused he is.

    But since he’s having some elementary-school issues in drawing a simple straight line, here is a YouTube video to help him out. 🙂

    https://www.youtube.com/watch?v=o6OB2EOEd-U&feature=youtu.be

  42. So I thought it might be interesting to buy a copy of the Statistical Mechanics textbook Salvador used to see what it has to say, and boy is it interesting.

    All those weird ideas Sal has didn’t come from this textbook!

  43. And if only to add injury to insult, DNA_Jock has the handle “DNA_Jock” as if he’s some sort of expert on DNA. Well, maybe when he was in his prime, but he’s certainly not justified to claim stupid stuff like:

    dQ/T is rarely informative

    I mean, look at this paper that measures delta-Q of DNA at certain temperatures and gives estimates of the entropy change:

    https://www.ncbi.nlm.nih.gov/pubmed/8968604

    Btw, does DNA_jock know what a calorimeter does? Uh, it measures delta-Q, from which we can deduce dQ/T. Does DNA_jock know what a thermometer does? Uh, it measures T. With the two measurements, one can compute the entropy change. Have you learned that yet, DNA “jock”?

    There, hope you actually learned something for a change.
    From the abstract:

    Thermal denaturation of the B form of double-stranded DNA has been probed by differential scanning calorimetry (DSC) and Raman spectroscopy of 160 base pair (bp) fragments of calf thymus DNA. The DSC results indicate a median melting temperature Tm = 75.5 degrees C with calorimetric enthalpy change delta Hcal = 6.7 kcal/mol (bp), van’t Hoff enthalpy change delta HVH = 50.4 kcal/mol (cooperative unit), and calorimetric entropy change delta Scal = 19.3 cal/deg.mol (bp),

    Hmm, let’s look at this.

    H = Enthalpy = U + pV

    U = internal energy (“heat” energy, to be colloquial; U includes activation energies, ionization energies, mixing energies, vaporization energies, chemical bond energies, etc.)

    p= pressure
    V = volume

    If no change in pV

    then delta-H = delta-U = delta-Q (for simplicity of isothermal processes)

    The change delta-Q in the abstract is:

    delta-Hcal = 6.7 kcal/mol (bp) = delta-Q

    Temperature = T = 75.5 C = (273.15) + 75.5 = 348.65 deg K

    For isothermal isobaric isovolumetric processes
    delta-S = Integral(dQ/T) = delta-Q/T = 6.7 kcal/mol (bp) / 348.65 K = 6700 cal/mol (bp) / 348.65 K ≈ 19.2 cal/deg.mol (bp)

    which looks a lot like what was reported in the abstract as

    delta-Scal = 19.3 cal/deg.mol (bp)
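
    (A minimal sketch checking that arithmetic:)

    ```python
    # Back-of-the-envelope check: delta-S ~ delta-H / Tm for the melting transition,
    # using the numbers quoted from the abstract above.
    delta_H = 6.7e3            # 6.7 kcal/mol (bp), expressed in cal/mol (bp)
    T_m = 273.15 + 75.5        # melting temperature in kelvin, ~348.65 K

    delta_S = delta_H / T_m
    print(round(delta_S, 2))   # ~19.22 cal/(deg * mol bp), vs. 19.3 reported
    ```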

    Despite this the JOCK (or so he claims) insists:

    dQ/T is rarely informative

    Howler.

  44. And DNA_ “jock” needs some remedial instruction.

    From:
    Thermal stability of DNA, by
    R. D. Blake and Scott G. Delcourt,
    Nucleic Acids Research, 1998, Vol. 26, No. 14, 3323–3332

    ∆Sij in Table 2a were calculated from ∆Hij/Tij, since
    ∆Gij = 0 = ∆Hij – Tij∆Sij at Tij.

    This implies
    ∆Hij = Tij∆Sij
    which implies

    ∆Sij = ∆Hij / Tij

    for each i,j

    delta-S = delta-H / T

    Hmm,

    delta-S = Delta-H / T

    For isothermal isobaric isovolumetric (isochoric) processes:

    delta-S = integral (dQ/T) = Delta-H /T

    Nevertheless, DNA_ “jock”, even when confronted with examples in his own field of DNA, says:

    dQ/T is rarely informative

    Glad I could give some remedial training in sophomore-level thermodynamics.
