In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem with the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.

entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully Dan Styer almost comes to the rescue as he criticized ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder, just as shoe size is correlated with reading ability (a small shoe size suggests we’re dealing with a toddler, and therefore a non-reader) without shoe size being reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain how Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Frank Lambert on Boltzmann

So, in slight defense of Granville Sewell and numerous other creationists, from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus, who equate entropy with disorder: when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger’s and Moran’s books aren’t on that list. Their books, however, did make a list of biochemistry texts judged by some Australian researchers to be decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates
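As a minimal illustration in Python (the value of W below is made up purely for demonstration, and “log” here is the natural logarithm):

import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    # S = k log W, where W is the number of microstates
    return k * math.log(W)

# Hypothetical system with 10^23 microstates, for illustration only:
print(boltzmann_entropy(1e23))  # ~7.3e-22 J/K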

Also there is Clausius:

ΔS = Integral (dq/T)

where
ΔS = change in entropy
dq = inexact differential of q (heat)
T = absolute temperature
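For a case where T is not constant, the integral matters. A minimal Python sketch (the mass and heat capacity below are assumptions for illustration, and Cp is treated as constant):

import math

# Entropy change for heating liquid water from 273.15 K to 298.15 K.
# With dq = m*Cp*dT and constant Cp, Integral(dq/T) evaluates to:
#   delta-S = m * Cp * ln(T_final / T_initial)
m = 20.0   # grams (assumed)
Cp = 4.18  # J/(g*K), rough heat capacity of liquid water (assumed)
T1, T2 = 273.15, 298.15

dS = m * Cp * math.log(T2 / T1)
print(dS)  # ~7.3 J/K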

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. Let Mung and Keiths show their procedure for computing increase in “missing information” entropy when a 20 gram cube of ice melts.

    Go on guys. You claim it’s a practical definition so show how you compute this missing information.

    In contrast, if you do an experiment to get the enthalpy of fusion like this high school experiment:

    https://www.greenwichschools.org/uploaded/faculty/andrew_bramante/GHS_Honors_Chem_Heat_of_Fusion_Experiment.doc

    you should get a value of 333.55 J/gram

    https://en.wikipedia.org/wiki/Enthalpy_of_fusion

If you have a 20 gram ice cube, simply divide the enthalpy of fusion energy by the melting temperature (273.15 K) to get the entropy change according to Clausius:

    dS = dQ/T

thus the entropy change ΔS is:

ΔS = Integral (dQ/T) = Q/T for a constant temperature of 273.15 K

    Q = (20 grams x 333.55 J/gram) = 6671 J

    thus,

ΔS = Q/T = 6671 J / 273.15 K = 24.42 J/K

    which is (minus some rounding errors) the figure I’ve shown repeatedly. (Frazier University used 333 J/g as the Enthalpy of fusion, hence a slightly smaller number of 24.39 J/K).
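    Here is the whole calculation in a few lines of Python, for anyone who wants to check the arithmetic (using the Wikipedia value for the enthalpy of fusion):

    # Clausius entropy change for melting a 20 gram ice cube at 273.15 K
    m = 20.0        # grams of ice
    h_fus = 333.55  # J/g, enthalpy of fusion (Wikipedia value above)
    T = 273.15      # K, melting point; the process is isothermal

    Q = m * h_fus  # heat absorbed: 6671 J
    dS = Q / T     # dS = Q/T at constant T: ~24.42 J/K
    print(Q, dS)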

    Mung and Keiths have yet to show how they compute “missing information”.

I mean, how do you guys measure your own ignorance about the details of the melting of a 20 gram ice cube? Not so easy, eh? That’s because your “missing information” approach offers few practical methods; it’s all handwaving.

Go on, show the readers how you figure out how ignorant you are about melting ice cubes. Once you put a number on your amount of ignorance (aka “missing information”), translate that into thermodynamic entropy. At least get in the ballpark of:

    24.39 J/K to 24.42 J/K

    But you guys are nowhere to be seen when it comes to simple questions like this high school example. Pathetic!

  2. keiths:

    You yourself have admitted that the “missing information” interpretation is correct.

    Sal:

    But not to the exclusion of other definitions like you’re doing…

    The “entropy as disorder” and “entropy as energy dispersal” definitions are incorrect, Sal. I reject them because they’re wrong.

    It’s that simple.

  3. Here’s the elephant in the room, Sal. If you want to defend the energy dispersal interpretation of entropy, you need to address this comment:

    Poor Sal. This discussion has been going on for almost a month, and he still doesn’t get that the dispute is over the interpretation of entropy, not its calculation.

    colewd,

    Since Sal has put himself in the awkward position of ignoring (or pretending to ignore) my comments, how about quoting this one to him?

    If entropy is a measure of energy dispersal, he should be able to

    1) respond to each of my six points against dispersalism, including the rather obvious problem that entropy has the wrong units for energy dispersal;

    2) explain why Xavier and Yolanda see different entropy values despite looking at the same physical system with the same physical energy distribution;

    3) explain why entropy increases in Denker’s grinding wheel example, though energy dispersal does not; and

    4) explain why entropy “cares” about the distinguishability of particles, when energy dispersal does not.

    Let’s see your refutation, Sal.

  4. Also, I’m astonished that you still don’t understand the meaning of ‘dispersal’.

    You keep referring to Clausius as if dQ/T were a measure of energy dispersal:

    Many of these I worked out through energy dispersal, which is formally:

    dS = dQ/T

    It isn’t. For energy to disperse, it needs to spread out over a larger volume. dQ/T is energy divided by temperature. Are you going to argue that the energy is spread out over a large number of kelvins, and that counts as dispersal? It’s inane.

    If entropy were a measure of energy dispersal, it would have the units of energy dispersal: energy per volume.

  5. Keiths,

    Too funny you’re quoting Dr. Mike since you said his “energy spreading” definition of the 2nd law of thermodynamics was wrong.

    Keiths, Mung, and DNA_jock:

Show the readers how you figure out how your ignorance about the details of a 20 gram ice cube increases after it melts, quantify that amount of ignorance (aka “missing information”) in bits, and then show how that relates to the standard answer, which should be in the ballpark of:

    24.42 J/K

    Still no answers after 1500 some comments? I showed the energy dispersal answer, now your turn guys to show the answer with your “missing information” procedure.

    🙂 🙂 🙂 🙂 🙂

    And don’t forget what you learned in the DNA_Jock school of thermodynamics:

[the Clausius version of entropy in terms of] dQ/T is rarely informative

    — DNA_Jock

    So show all those students of science how it really should be done instead of calculating entropy change the Old School way with thermometers and calorimeters.

  6. keiths: For energy to disperse, it needs to spread out over a larger volume

    I take it that “energy dispersal” is considered by the prevailing theory to be equivalent to dQ/T though, isn’t it? Based on the prevailing theory of heat, you find one, you get the other.

    Now, you disagree with this equivalency claim, I know, because you have mentioned it about 730 times, but so fucking what? It’s a quibble. Find something useful to do, like maybe answer my questions, do Sal’s calculations, or Mung’s programming request, or patrick’s challenge.

    Everyone knows your quibbling fucking view on this matter. It’s uninteresting, and not only because completely trivial and useless. There’s also the fact that you’ve repeated it over 700 times now. It’s an old, useless, repetitive quibble which you obviously cannot defend, but will just repeat until everyone goes away.

  7. colewd:
    keiths,

    Can you challenge Sal’s argument that the missing information definition is not of much practical use?

    Firstly, he doesn’t have an argument, just an assertion.
    People use the Clausius equations because they are a reasonable approximation to reality under most circumstances and the math is easier.
    There are two problems with Sal’s “practical” application of Clausius entropy concepts. 1) Clausius does NOTHING to explain WHY “heat cannot, of itself, pass from…” 2) Under certain circumstances, Clausius gives the wrong answer: you need to understand the Law of Large Numbers reasons that underpin entropy in order to be able to spot when the simple math could be leading you astray — the analogy to Newton’s Laws is apt — good enough for Apollo, but not for GPS, and you can bet your bottom dollar someone at NASA calculated γ for the moonshot, just to make sure.
    3) It’s a fallacy to judge a theory based on…
    Ugh!
    I’ll come in again.

  8. stcordova,
    Cool calculation, bro.

But what if the temperature is, say, 250 K?
    (This is an important question in meteorology.)

    You could
    A) assume the entropy change is temperature-independent, thus the ΔH will vary linearly with T.
    Or
    B) you could apply Kirchhoff’s Law of Thermochemistry. (that is: ∂ΔH/∂T = ΔCp, the difference in heat capacities, FYI)
    Or
    C) you could measure it with one of those fancy DSC machines.

    How many different answers will you get and, most importantly, WHY?
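    (For concreteness, a rough Python sketch of option B, treating ΔCp as constant; the heat capacities below are ballpark values I am assuming for illustration, not measurements:)

    # Option B sketch: Kirchhoff, with dDeltaH/dT = DeltaCp assumed constant
    cp_water = 4.18          # J/(g*K), liquid water (rough value)
    cp_ice = 2.09            # J/(g*K), ice (rough value)
    dCp = cp_water - cp_ice  # ~2.09 J/(g*K)

    h_fus_273 = 333.55  # J/g at 273.15 K
    T0, T1 = 273.15, 250.0
    h_fus_250 = h_fus_273 + dCp * (T1 - T0)
    print(h_fus_250)    # ~285 J/g: the transition "costs" less heat at 250 K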

  9. walto: I take it that “energy dispersal” is considered by the prevailing theory to be equivalent to dQ/T though, isn’t it? Based on the prevailing theory of heat, you find one, you get the other.

    I don’t wish to get in the middle of your flame war conversation, but this is an over-simplification.
    The ‘dispersalists’ need to come up with fancy-schmancy definitions of “dispersal” to explain entropy of mixing. Clausians have to back into the fact that there must be an entropy of mixing, because you have to expend “useful work” to unmix stuff.

  10. DNA_Jock,

    I can see that it might be hard to define ‘dispersion’ in such a way that it does everything the traditional Clausian might want. Also that there’s no real point to doing so or for staking much on defending that view. But I must say that this fight seems silly to me.

    The pointlessness (and this fucking election) is my excuse for being so flamey. Sal is not going to respond to your corrections and Mung will continue to take pot shots at Sal and keiths is not going to respond to anything, and the entire issue seems so trivial that it could be sent to the BBB for mediation and be resolved in 8 minutes–if only the participants had the slightest interest in any meeting of the minds. The prevailing attitudes distress me.

    So, in a word, yes, I over-simplified.

  11. keiths as quoted by Walto:

    For energy to disperse, it needs to spread out over a larger volume

    and from Walto:

    I take it that “energy dispersal” is considered by the prevailing theory to be equivalent to dQ/T though, isn’t it?

    From Lambert’s writings:

    “dispersal of energy” in macro thermodynamics occurs in two ways:

    (1) “how much” involves heat transfer, usually ΔH, to or from system and thus from or to its surroundings;

    (2) “how widely” means spreading out the initial energy of a system in some manner, e.g., in gas expansion into a vacuum or mixing of ideal gases, the system’s volume increases and thereby the initial energy of any component is simply more dispersed in that larger three-dimensional volume without any change in motional or other energies. In chemical reactions, ΔG is the function that must be used to evaluate the net energy dispersal in the universe of system and surroundings. In macro thermodynamics, both spontaneous and non-spontaneous processes are ultimately measured by their relationship to ΔS = qrev/T, where q is the energy that is dispersible/dispersed.

    http://entropysite.oxy.edu/entropy_isnot_disorder.html#m3

    I’ve used capital “Q” to mean qrev (q-reversible).

    Contrast the words of a retired professional chemist and chemistry professor like Lambert with DNA_Jock’s school of thermodynamics:

[the Clausius version of entropy in terms of]

    dQ/T is rarely informative

    — DNA_Jock

    So, let’s take Lambert’s own words and apply them to the 20 gram melting ice cube.

    (1) “how much” involves heat transfer, usually ΔH, to or from system and thus from or to its surroundings;

    ΔH = change in enthalpy

    From wiki:

    Enthalpy (Symbol: H) is a measurement of energy in a thermodynamic system. It is the thermodynamic quantity equivalent to the total heat content of a system. It is equal to the internal energy of the system plus the product of pressure and volume.

In the case of condensed matter like liquid water and ice, the pressure-volume (PV) term can be mostly neglected in the computation of the enthalpy of fusion (even though there is a slight change in volume when ice melts).

    The enthalpy of fusion per gram is 333.55 J/g as attested here:
    https://en.wikipedia.org/wiki/Enthalpy_of_fusion

    Thus for a 20 gram ice cube melting at 273.15K, the enthalpy of fusion is

    ΔH = 20 g x 333.55 J/g = 6671 J

Since this is an isothermal, isobaric process (and very nearly isochoric, per the note above):

    ΔH = ΔQ = 6671 J

    thus the change of entropy at 273.15K is:

ΔS = Integral(dQ/T) = ΔQ/T = 6671 J / 273.15 K = 24.42 J/K

    I’ve thus shown one can take the sense of what Lambert actually said (versus Keiths’ misreading, mis-interpretation, and otherwise uncharitable manglings) and actually apply it to solve a practical question of entropy.

    Keiths could try to explain how he computes the change in his level of ignorance (aka “missing information”) when the ice melts. But shockingly he doesn’t seem capable without first resorting to what Lambert described as energy dispersal:

    (1) “how much” involves heat transfer, usually ΔH, to or from system and thus from or to its surroundings;

    Also, from a retired chemistry professor (Ernest Z) who taught organic chemistry for 33 years:

    https://socratic.org/questions/what-is-the-difference-between-entropy-and-enthalpy

    ΔS = q_rev/T = (ΔH)/T

  12. Walto:

    Sal is not going to respond to your corrections

In deference to you, Walto, I will respond to DNA_Jock’s “corrections”.

But frankly, when DNA_Jock says something as idiotic as this, it doesn’t really deserve many responses:

[the Clausius version of entropy in terms of]

    dQ/T is rarely informative

    — DNA_Jock

I already cited the way Lambert defines energy dispersal, but I’ll cite it again. I can apply Lambert’s definition both to melting ice cubes (above) and to mixing scenarios (done a buzzillion times already):

“dispersal of energy” in macro thermodynamics occurs in two ways:

    (1) “how much” involves heat transfer, usually ΔH, to or from system and thus from or to its surroundings;

    (2) “how widely” means spreading out the initial energy of a system in some manner, e.g., in gas expansion into a vacuum or mixing of ideal gases, the system’s volume increases and thereby the initial energy of any component is simply more dispersed in that larger three-dimensional volume without any change in motional or other energies. In chemical reactions, ΔG is the function that must be used to evaluate the net energy dispersal in the universe of system and surroundings. In macro thermodynamics, both spontaneous and non-spontaneous processes are ultimately measured by their relationship to ΔS = qrev/T, where q is the energy that is dispersible/dispersed.

    http://entropysite.oxy.edu/entropy_isnot_disorder.html#m3

I’ve said as much in several ways, but not as succinctly as that paragraph by Lambert, because frankly I hadn’t read that part of Lambert’s writings closely enough to know what he actually meant, versus Keiths’ and DNA_Jock’s distortions of what Lambert said.

  13. Thanks, Sal, but to adequately respond to Jock, I think you’ll need to take him off ignore and answer the specific queries in his prior couple of posts. They seem to me quite marginal complaints, calling for minor elaborations or constraints, but I’m hardly an expert.

  14. walto,

    Concepts matter in science, even when angry old insurance regulators disagree. There’s a reason scientists didn’t split the difference with the phlogiston theorists and “go have some pbjs”. The phlogiston idea was wrong, so they got rid of it. Completely. Even if the angry old insurance regulators of the time disagreed.

    Entropy is not a measure of energy dispersal. You have a hard time understanding why that matters. That’s understandable, because you describe yourself, accurately, as someone who is “scientifically ignorant” and who finds this stuff “complicated” and “difficult”.

    Pedant got it right:

    Walto, for whom I have affection, is out of his depth in this discussion.

    Fess up, Walto. You’ll feel so much better.

    You may not understand why the concept of entropy matters — why it’s not just a quibble — but that’s to be expected from someone who is out of his depth.

  15. walto,
I quite understand. My aim is merely to demonstrate that Sal shows no inkling that his understanding of these concepts goes beyond a high school introduction to entropy. My seemingly ‘marginal complaints’ are designed to guide him towards a better understanding, were he to actually engage: what is the get-out-of-jail card that allows dispersalists to square the almost-identical-gas mixing paradox; and experimentally determined heat capacities are a poor basis for understanding anything.
In this context, drawing an equivalence between dQ/T and dispersalism is letting him off the hook.
    And yes, I recognize that
    p(hell freezes over) > p(Sal engages) ~= P(keiths admits…)
    I think I left my copy of Don Quixote in my coat pocket…

  16. Walto:

    Thanks, Sal, but to adequately respond to Jock, I think you’ll need to take him off ignore and answer the specific queries in his prior couple of posts. They seem to me quite marginal complaints, calling for minor elaborations or constraints, but I’m hardly an expert.

OK, in deference to Walto, I’ll respond a little, since some readers want to see loose ends tied up and since DNA_Jock, unlike Keiths and Mung, has a chemistry background.

    Cool calculation, bro.
    ….
    B) you could apply Kirchhoff’s Law of Thermochemistry. (that is: ∂ΔH/∂T = ΔCp, the difference in heat capacities, FYI)

If ∂ΔH/∂T = ΔCp is non-zero, A and B are not the same scenario, so why would you expect the entropies to be the same?

    You could

    B) you could apply Kirchhoff’s Law of Thermochemistry. (that is: ∂ΔH/∂T = ΔCp, the difference in heat capacities, FYI)

Thermochemistry laws apply to reactions with products and reactants, which really doesn’t apply to H2O ice at 250 K that stays H2O ice at 250 K!

    A) assume the entropy change is temperature-independent, thus the ΔH will vary linearly with T.

    Clearly wrong, since in this case if “ΔH will vary linearly with T”

    ∂ΔH/∂T = ΔCp = 0 (another way of saying the second derivative is 0)

    thus

    ∂H/∂T = Cp = constant

    and entropy change in this case is

ΔS = Integral[ Cp dT/T ] = Cp ln(T_final/T_initial)

    Now consider….

    Scenario 1:

    T_final = 201K
    T_Initial = 200K

ΔS_1 = Cp ln(201/200)

    vs.

    Scenario 2:
T_final = 250K
T_Initial = 249K

ΔS_2 = Cp ln(250/249)

    Clearly ΔS_1 is not equal to ΔS_2.

Even though in both scenarios ΔH and Cp are the same, ΔS is not; therefore ΔS is temperature-dependent, which contradicts your premise, and your premise is nonsensical!
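    For anyone who wants to see the actual numbers, here is a quick Python check (note that Cp cancels out of the comparison):

    import math

    # The two log ratios differ, so dS_1 != dS_2 for the same Cp.
    r1 = math.log(201 / 200)  # ~0.004988
    r2 = math.log(250 / 249)  # ~0.004008
    print(r1, r2, r1 == r2)   # False: the entropy changes differ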

A) assume the entropy change is temperature-independent, thus the ΔH will vary linearly with T.

Because of the math, only a few readers might have realized you just made a nonsensical statement. But you know it now, don’t you?

    🙂 🙂

So now it’s your turn to do the calculation Keiths and Mung can’t do with “missing information” procedures.

Compute the entropy change of a 20 gram ice cube melting. And don’t use dQ/T, since you said it is rarely informative and since it isn’t defined in terms of missing information.

  17. For walto’s and colewd’s benefit, I will try to walk through Sal’s mangled prose.

    A) assume the entropy change is temperature-independent, thus the ΔH will vary linearly with T.

    Clearly wrong, since in this case if “ΔH will vary linearly with T”
    ∂ΔH/∂T = ΔCp = 0 (another way of saying the second derivative is 0)

    No, Sal. You seem to have confused the symbols ∂ and Δ.
    If you (Sal) assume that the entropy change (ΔS) is temperature-independent, then “ΔH will vary linearly with T”. Another way of saying this would be to note that “∂ΔH/∂T = ΔCp”, where ΔCp is constant, but non-zero. This is high school math: if ∂y/∂x is a non-zero constant, then we say “y varies linearly with x”; if ∂y/∂x = 0 then “y is independent of x”. Second derivatives don’t enter into it: ΔH is NOT a derivative, it’s the difference in enthalpies between reactants and products (water and ice in this case). Egads!

    B) you could apply Kirchhoff’s Law of Thermochemistry. (that is: ∂ΔH/∂T = ΔCp, the difference in heat capacities, FYI)

    If ∂ΔH/∂T = ΔCp = non-zero, A and B are not the same scenario so why would you expect entropies to be the same?

    You could

    B) you could apply Kirchhoff’s Law of Thermochemistry. (that is: ∂ΔH/∂T = ΔCp, the difference in heat capacities, FYI)

    Thermochemistry laws apply to reaction with products and reactants, which really doesn’t apply to H2O ice at 250K that stays H2O ice at 250K!

You are correct that A and B are not the same scenario; they are two different ways of using Clausius to calculate ΔH for the same scenario: a phase transition that occurs at a non-standard temperature. And they differ. This much you got right. But Kirchhoff is merely a specific case of Hess’s Law, which most certainly applies to phase transitions; it’s a restatement of the First Law.
    Not sure how much point there is in Fisking the rest of your comment, but I will note

    Even though in both scenarios ΔH and Cp are the same, ΔS is not, therefore ΔS is temperature dependent which shows a contradiction in your premise, therefore your premise is non-sensical!

    You haven’t even managed to get to the point where scenarios A and B differ, which is
    ΔCp is a function of temperature.

    More importantly, you entirely misconstrued my comment. I was illustrating an example that is problematic for your simple “plug and play” “∂Q/T is all that matters” Clausian view of entropy, and I was asking you (Sal) how you (Sal) would handle this problem using your Clausian approach: you could do A, or B or C.
    And yes, you (Sal) would get “non-sensical” results.
    As I asked in my previous post :
    How many different answers will you get and, most importantly, WHY?
    P.S. Have you figured out the get-out-of-jail card that allows dispersalists to square the almost-identical-gas mixing paradox yet? I don’t think you’re going to like the answer.
    P.P.S. Why does a mixture have entropy at absolute zero?

  18. DNA_Jock: For walto’s and colewd’s benefit, I will try to walk through Sal’s mangled prose. […]

    Thank you, Jock.

  19. No, Sal. You seem to have confused the symbols ∂ and Δ.

In the limit of small changes, Δ becomes d (as in a differential), and the partial derivative (∂) of a derivative is a second derivative, just as I said.

Your math is in error, and I showed it: the entropy change is not the same at all temperatures, therefore the change of entropy is not independent of temperature, contrary to your nonsense.

Show the readers the change of entropy for the same ΔH at 200 K and then at 249 K.

    Scenario 1:

ΔH = Cp ΔT = Cp × (1 K)

    T_final = 201K
    T_Initial = 200K

ΔS_1 = Cp ln(201/200)

    vs.

    Scenario 2:
ΔH = Cp ΔT = Cp × (1 K)

T_final = 250K
T_Initial = 249K

ΔS_2 = Cp ln(250/249)

Do I have to spoon-feed you the fact that

    ln (201/200) does not equal ln (250/249) ??

A) assume the entropy change is temperature-independent, thus the ΔH will vary linearly with T.

    I just showed this can’t be true in general.

Do I have to spoon-feed you elementary math too? Get it through your head: even with the same ΔH (one that varies linearly with T),

    ln(201/200) does not equal ln(250/249)

    or shorthand

    ln(201/200) != ln(250/249)

    So how can entropy change be invariant with temperature with the same amount of heat added to each scenario? It can’t. You got your math wrong.

Go ahead, show the entropy change at various temperatures for the same enthalpy change ΔH; you won’t get the same ΔS.

If you don’t like the temperatures I’ve chosen, pick your own, but make sure ΔH is the same for each scenario, and you’ll still be in error.

    Don’t bother asking more stupid questions until you fix the incoherence in your questions.

  20. DNA_Jock,

    There are two problems with Sal’s “practical” application of Clausius entropy concepts. 1) Clausius does NOTHING to explain WHY “heat cannot, of itself, pass from…” 2) Under certain circumstances, Clausius gives the wrong answer: you need to understand the Law of Large Numbers reasons that underpin entropy in order to be able to spot when the simple math could be leading you astray — the analogy to Newton’s Laws is apt — good enough for Apollo, but not for GPS, and you can bet your bottom dollar someone at NASA calculated γ for the moonshot, just to make sure.
    3) It’s a fallacy to judge a theory based on…

I see your analogy, and I have studied General Relativity. General Relativity has been experimentally validated, and the model can be used to calculate predictive values of spacetime curvature. How can Boltzmann’s equations be used so we can better understand the thermodynamics of a melting ice cube?

    I just don’t get the value of measuring ignorance.

  21. colewd:

    I just don’t get the value of measuring ignorance.

    Do you understand why the concept of entropy is valuable?

  22. You haven’t even managed to get to the point where scenarios A and B differ, which is ΔCp is a function of temperature.

    BULL!

For B, I pointed out:

    ΔCp = non-zero

For A, “ΔH will vary linearly with T”

    ∂ΔH/∂T = ΔCp = 0

    which implies

    Cp = constant

I got the point; you just can’t bring yourself to see that’s what I wrote!

    In scenario A, Cp is constant, but that won’t work because this leads to absurdities like saying

    ln(201/200) = ln(250/249)

which is false. Your math is in error.

  23. Sal, I have the sense that you and Jock are so close at this point that if you went to lunch and brought a tablet and pen you could iron your differences out in minutes.

    I think most of the trouble is the medium. It encourages cryptic ticklers like Jock’s “get-out-of-jail card.” And there’s the endless repetition (which may or may not be for the benefit of those imagined to have just awakened from a three-month doze). There are language issues too, and, of course there is the bad blood/history, which, presumably, is the reason for most of Mung’s pot-shotting. It’s too bad.

    Hence, lunch!

colewd: I see your analogy, and I have studied General Relativity. General Relativity has been experimentally validated, and the model can be used to calculate predictive values of spacetime curvature. How can Boltzmann’s equations be used so we can better understand the thermodynamics of a melting ice cube?

    As the analogy shows, you are asking the wrong question.
    Boltzmann/Gibbs has been experimentally validated (this paper is a particularly fun example of the failure of naive dispersalism)
    In the analogy,
    Clausius = Newton
    Boltzmann/Gibbs = Einstein

    Clausius is a satisfactory approximation for melting ice (at Standard Temperature and Pressure, heh), and Newton is a satisfactory approximation for Apollo 11 or the flight of terrestrial cannon-balls.
    However, if you are designing a GPS system or studying dipole traps, then Newton or Clausius will set you wrong.
Asking “How can Boltzmann’s equations be used so we can better understand the thermodynamics of a melting ice cube?” is analogous to asking
    “How can Einstein’s equations be used so we can better understand the flight of a terrestrial cannon-ball?”
    The answer in both cases is “With great difficulty. So what?”

  25. walto,

    ‘Fraid not walto. I’m never going to be on the same planet as someone who can repeat

For A, “ΔH will vary linearly with T”

    ∂ΔH/∂T = ΔCp = 0

    which implies

    Cp = constant

∂ΔH/∂T is non-zero, and equal to ΔCp (i.e. the difference in heat capacities between the reactants and products).
    Heat capacity is a function of temperature (Sal does not appear to have ever learnt this…). This means that ΔCp may or may not be a function of temperature
    (imagine that the reactants and products have heat capacities of {r + a.lnT} and {p + a.lnT} respectively; they both vary with temperature, but their difference does not).
    Under the assumption that ΔCp is constant, then “ΔH will vary linearly with T”. Sal is failing high school math…if ∂y/∂x is a non-zero constant, then we say “y varies linearly with x”.
    He actually is confusing the symbols ∂ and Δ.
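    (A quick numeric illustration of the parenthetical above; the coefficients r, p, a are made up:)

    import math

    # Two heat capacities that both vary with T, but whose difference is
    # constant: Cp(T) = const + a*ln(T)
    a, r, p = 10.0, 30.0, 75.0  # made-up coefficients, J/(mol*K)

    for T in (250.0, 273.15, 300.0):
        cp_react = r + a * math.log(T)
        cp_prod = p + a * math.log(T)
        print(T, cp_prod - cp_react)  # always p - r = 45.0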

  26. stcordova: Don’t bother asking more stupid questions until you fix the incoherence in your questions.

    Careful Jock. You’ll end up back on Ignore.

  27. stcordova: Also, from a retired chemistry professor (Ernest Z) who taught organic chemistry for 33 years:

    I like this guy!

    Entropy, S, is a measure of the disorder or randomness of a system.

    An ordered system has low entropy. A disordered system has high entropy.

  28. I was brought up to believe that there is no such thing as a stupid question.

    On the other hand, there are plenty of curious idiots.

    What depresses me is the high number of incurious idiots.
    Today of all days. Sigh.

  29. DNA_Jock: walto, ’Fraid not walto. I’m never going to be on the same planet as someone who can repeat […]

    Meh, I think if you explained this to him in person he’d get it. And you’d probably have a better sense where he’s coming from.

    Even people who’ve gotten stuff wrong on high school math tests need not be thought to be from other planets. I have a daughter in high school, and some of that stuff is really hard!

  30. DNA_Jock,

    See the derivation of dS here:

    https://engineering.ucsb.edu/~shell/che110a/heatcapacitycalculations.pdf

dS = Cp dT/T

    which implies

    dS/dT = Cp/T

    which means entropy change is dependent on temperature if ΔH changes linearly with T.

    dS/dT = Cp/T

falsifies this absurdity of yours:

    A) assume the entropy change is temperature-independent, thus the ΔH will vary linearly with T.

    You got caught making a mistake, and you won’t back down. I’ll keep reminding you of it.

    Sal, read to the end of my comment before responding…

    No point since your first option was wrong, namely:

    A) assume the entropy change is temperature-independent, thus the ΔH will vary linearly with T.

    look at dH/T = Cp dT/T

    which implies

    dH = Cp dT

    this will be a linear relationship if Cp is a constant, and the letter “C” often means constant, and Cp is in fact:

    ∂H/∂T = Cp

    as stated here

    https://en.wikipedia.org/wiki/Heat_capacity

    dH = Cp dT

    implies the linear relationship

    ΔH = Cp ΔT

    which looks a lot like

    y = mx + b

    where
    y is ΔH
    m = Cp
    x = T
    b= 0

    given all these facts,

    dS/dT = Cp/T

which falsifies your claim. Go ahead, DNA_Jock, do a Keiths and keep repeating yourself; it won’t change the fact that your math is wrong and this statement is absurd:

    A) assume the entropy change is temperature-independent, thus the ΔH will vary linearly with T.

Wrong; “ΔH will vary linearly with T” implies the entropy change is temperature-DEPENDENT, not independent. It’s right here:

    dS/dT = Cp/T

    The change in entropy (dS) with respect to change in temperature (dT) is dependent on temperature (T).

    No amount of verbosity on your part will change that. I’ve provided ample links supporting my derivation, and you only reference yourself for the most part.

  31. jock: Heat capacity is a function of temperature….This means that ΔCp may or may not be a function of temperature (imagine that the reactants and products have heat capacities of {r + a.lnT} and {p + a.lnT} respectively; they both vary with temperature, but their difference does not).

    Under the assumption that ΔCp is constant, then “ΔH will vary linearly with T”….if ∂y/∂x is a non-zero constant, then we say “y varies linearly with x”.

    sal: look at dH/T = Cp dT/T

    which implies

    dH = Cp dT

    this will be a linear relationship if Cp is a constant, and the letter “C” often means constant, and Cp is in fact:

    ∂H/∂T = Cp

    dH = Cp dT

    implies the linear relationship

    ΔH = Cp ΔT

Wrong; “ΔH will vary linearly with T” implies the entropy change is temperature-DEPENDENT, not independent. It’s right here:

    dS/dT = Cp/T

    The change in entropy (dS) with respect to change in temperature (dT) is dependent on temperature (T).

    ***************************************************
    Hmmmmm.

    IS vs. May Or May Not Be

I think it may be time to procure seconds and start talking about how many paces until one may fire one’s pistol.

  32. Sorry walto,

    But Sal still hasn’t figured out that my three options were options that Sal could potentially use to do his Sal-calculation, nor has he fathomed the difference between ∂H/∂T and ∂ΔH/∂T, or that Cp is not a constant. Apparently he thinks it is constant, because it starts with the letter “C”.
    You really couldn’t make this stuff up.

  33. Cp is not a constant

    It is if ΔH varies linearly with T.

    Linear variation is described by:

    y = mx + b

    where m is a constant

    let

    y = ΔH
    m = Cp
    x = T
    b= 0

    Thus,

    y = mx + b

    becomes this

    ΔH = CpT

    therefore, Cp must be a constant. If Cp is a constant, ΔCp = 0, just as I said.

    Cp is not a constant

    It is under option A where DNA_Jock said:

    ΔH will vary linearly with T.

    I showed linear variation of ΔH implies Cp is constant and thus ΔCp = 0.

    ΔCp = non-zero for option B. DNA_Jock is conflating all 3 options, when I’ve provided ΔCp for option B as

    ΔCp = non zero

    and for option A as

    ΔCp = 0, which implies Cp = constant

  34. Walto:

    Sal, I have the sense that you and Jock are so close at this point that if you went to lunch and brought a tablet and pen you could iron your differences out in minutes.

Well, thanks for the positive thoughts anyway. To quote Rodney King during the LA riots: “Can’t we all just get along?”

    To which I respond, “but this is TSZ”.

    Anyone with requisite knowledge can examine the math and computations. I’ve tried to be verbose and give many examples to drive home my points.

    This is an example of a tiresome misrepresentation:

    But Sal still hasn’t figured out that my three options were options that Sal could potentially use to do his Sal-calculation, nor has he fathomed the difference between ∂H/∂T and ∂ΔH/∂T, or that Cp is not a constant

    Cp is constant if ΔH varies linearly with T (option A). That is his premise for option A, not mine.

    Cp is not constant otherwise (as would be the case in general with option B).

DNA_Jock isn’t correcting me; he’s attributing to me ideas that I never stated nor intended. As long as he’s willing and eager to do that, we won’t cooperate.

Option A and Option B were not different ways to calculate entropy but entirely different thermodynamic systems. I provided the derivations above demonstrating that Option A implies a constant Cp.

  35. walto:

    What is the entropy of a tempest in a teapot?

    A comment worth repeating:

    walto,

    It’s a bit odd for you to be arguing that this is all just a quibble. You certainly didn’t think so earlier in the thread, when you characterized it in these breathless terms:

    Rather than giving opportunities, you might consider giving a single reason for the entire scientific establishment to shift its paradigm.

    And:

    Quick, rewrite all the science books!

    And now that you’ve reversed yourself on energy dispersal and the missing information view, why not take the final step and acknowledge your error regarding observer dependence?

  36. Mung,

    While the Sal/Jock sideshow continues, perhaps we can make some progress in our discussion of the dice. Do you understand why you need an epistemic probability distribution over the microstates, not over the sums, in order to do the calculation you are attempting?
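    For what it’s worth, here is the point in miniature for two fair dice (an illustration only; our actual setup may differ). The epistemic distribution over the 36 microstates is uniform, while the distribution over the 11 sums is not, so the two entropies differ:

    import math
    from collections import Counter

    # All 36 equiprobable microstates (a, b) for two fair dice
    microstates = [(a, b) for a in range(1, 7) for b in range(1, 7)]

    def shannon_bits(probs):
        return -sum(p * math.log2(p) for p in probs if p > 0)

    # Uniform over microstates: log2(36) ~ 5.17 bits
    H_micro = shannon_bits([1 / 36] * 36)

    # The sums 2..12 are NOT equiprobable, so their entropy is lower
    sum_counts = Counter(a + b for a, b in microstates)
    H_sums = shannon_bits([n / 36 for n in sum_counts.values()])

    print(H_micro, H_sums)  # ~5.17 bits vs ~3.27 bits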

  37. Another demonstration that Cp is constant for Option A.

“ΔH varies linearly with T”

    follows the form

    y = mx + b

    where m is a constant. To emphasize m is constant, I will say m_constant.

    let

    y = ΔH
    x = T
    m = m_constant
    b = some other constant

    ΔH = m_constant T + b

which, absorbing constants into b, we can write as

    H = m_constant T + b

    now the definition of Cp is

    ∂H/∂T

    from
    https://en.wikipedia.org/wiki/Heat_capacity

    taking the partial derivative with respect to T

    Cp = ∂H/∂T = m_constant

  38. keiths,

As already explained (twice? thrice?), if one didn’t understand that this was a quibble (as you apparently still don’t) and one wanted to insist on the thermodynamic sense of your absurd Damon entropy calculations, one WOULD have to rewrite all the books, making them useless.

So your choice is clear: start rewriting, or start to understand the entirely quibbling nature of your thousand posts on this thread.

  39. keiths: While the Sal/Jock sideshow continues,

    Sorry, but it’s your blind dice-thrower event that’s the sideshow.

    Let ‘er rip!
