The basic biochemistry textbook I study from is *Lehninger Principles of Biochemistry*. It's a well-regarded college textbook. But there's a minor problem regarding the book's Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy.

Nelson, David L.; Cox, Michael M., Lehninger Principles of Biochemistry (p. 23), W.H. Freeman, Kindle Edition

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system,

entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto

H. Robert Horton, North Carolina State University

K. Gray Scrimgeour, University of Toronto

Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue in his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell's essay "Evolution's Thermodynamic Failure" appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates "entropy", "randomness", "disorder", and "uniformity". The third paragraph claims that "Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it." Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

…

"Disorder" is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say "My love is a red, red rose", I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?

Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (i.e., small shoe size suggests we’re dealing with a toddler, therefore non-reader). Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga

2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger's torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

"In order to explain the fact that the calculations based on this assumption ["…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…"] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state."

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on "entropy is a measure of disorder". Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving 'disorder' and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn't. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:

EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy

…..

Lehninger and Moran's books aren't on that list. Their books, however, did make the list of biochem books judged by some Australians as decreasing in fitness:

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.

..

Lehninger, Nelson and Cox

….

Garrett and Grisham

…

Moran and Scrimgeour

….

Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

…

A few creationists got it right, like Gange:

Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where

S = entropy

k = Boltzmann's constant

W = number of microstates
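As a minimal sketch (my own, not from any of the textbooks quoted here), the Boltzmann–Planck formula can be evaluated directly. Note that Planck's "log" is the natural logarithm:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in J/K (exact SI value)

def boltzmann_entropy(W: int) -> float:
    """S = k ln W for a system of W equally probable microstates."""
    return K_B * math.log(W)

# A system with exactly one accessible microstate has zero entropy:
print(boltzmann_entropy(1))  # 0.0
# Doubling the number of microstates adds k ln 2 of entropy:
print(boltzmann_entropy(2) - boltzmann_entropy(1))
```

Notice that nothing about order or disorder appears anywhere in the formula; only a count of microstates.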

Also there is Clausius:

ΔS = ∫ (dq/T)

where

ΔS = change in entropy

dq = inexact differential of q (heat)
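For a concrete feel for the Clausius definition, here is a sketch of my own with round illustrative numbers: along a reversible heating path with constant heat capacity C, dq = C dT, so the integral collapses to ΔS = C ln(T2/T1). A numeric integration agrees with the closed form:

```python
import math

def delta_S_numeric(C, T1, T2, steps=100_000):
    """Numerically integrate dS = dq/T = C dT / T along a reversible path."""
    dT = (T2 - T1) / steps
    return sum(C / (T1 + (i + 0.5) * dT) * dT for i in range(steps))

# Illustrative numbers: ~1 kg of water (C ≈ 4184 J/K) heated from 300 K to 350 K
C, T1, T2 = 4184.0, 300.0, 350.0
closed_form = C * math.log(T2 / T1)  # ΔS = C ln(T2/T1) ≈ 645 J/K
print(closed_form, delta_S_numeric(C, T1, T2))
```

Again, the definition involves only heat and temperature; no order, disorder, or "information" in sight.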

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

I also liked Larry Moran’s remark. It made clear from the outset of this thread that this dispute is 1) old news, and 2) not terribly important.

And now some more humiliation for DNA_Jock:

ΔH = L_f = L_effective = L_f(Tm) – ∫ [(c_w – c_i) dT]

So what did DNA_Jock have to say about ΔH?

If that were true as DNA_Jock claims, then the graph of L_effective (which is the theoretical value of ΔH from the Kirchhoff relations of thermochemistry) would be a straight line.

I told DNA_Jock from the very first that that was a ridiculous claim, since ΔH vs. T won't be a straight line. So now I invite the readers to examine for themselves whether the L_effective curve (which is the theoretical ΔH curve) is in fact a straight line. I took the liberty of drawing a straight solid blue line to contrast it with the curved solid black line of L_effective (the theoretical ΔH from the Kirchhoff relations). Is the L_effective curve a straight line? I think not.

Of course the painful way to demonstrate this is to take the first derivative with respect to T and see if we get a constant. Not likely.

ΔH = L_f = L_effective = L_f(Tm) – ∫ [(c_w – c_i) dT]

But we don't have to go through that agony, since a picture is worth a thousand words and we can clearly see the L_effective curve is not a straight line; therefore DNA_Jock's claim "ΔH will vary linearly with T" is false.
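The point can also be checked numerically. The sketch below is mine, with invented, merely illustrative heat-capacity fits (real values would come from published data); the only thing it illustrates is that when c_w – c_i depends on T, the Kirchhoff integral makes ΔH a curve, not a line:

```python
# Hypothetical heat-capacity fits in J/(kg·K) -- illustrative only.
def c_w(T):
    return 4220.0 + 0.5 * (273.15 - T)   # supercooled water

def c_i(T):
    return 2050.0 + 7.0 * (T - 273.15)   # ice

TM = 273.15          # melting point, K
L_F_TM = 333_550.0   # latent heat of fusion at TM, J/kg

def delta_H(T, steps=10_000):
    """Kirchhoff relation: ΔH(T) = L_f(TM) - ∫_T^TM (c_w - c_i) dT."""
    dT = (TM - T) / steps
    integral = sum(
        (c_w(T + (i + 0.5) * dT) - c_i(T + (i + 0.5) * dT)) * dT
        for i in range(steps)
    )
    return L_F_TM - integral

# If ΔH were linear in T, equal temperature steps would give equal
# differences in ΔH.  They don't:
temps = [253.15, 258.15, 263.15, 268.15, 273.15]
vals = [delta_H(T) for T in temps]
print([round(b - a) for a, b in zip(vals, vals[1:])])
```

The successive differences are unequal, so dΔH/dT is not constant and the curve is not a straight line.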

Talk about bringing a gun to a knife fight and then shooting yourself in the head, DNA_Jock. Too funny. Ah, the gift that keeps on giving. Hehehe!

Oh Sal, I never made any such claim. I wrote:

to which you responded:

Each and every time that you state this blatant falsehood, it makes you look bad.

Come on, Sal. DNA_Jock strikes me as taking a generally reasonable approach with you. I’m sure communication would improve if you could resist succumbing to these little outbursts of hubris.

I disagree with Salvador. If you adopt the information theory approach there is something in our everyday experience to familiarize it [entropy] to our intuitions.

I disagree with Salvador. I think he takes an overly simplistic view of counting in this context. It’s not something we do on our fingers.

I disagree with Salvador. I’ve posted a number of texts in this thread that use the information theory approach. It’s been in use for decades.

Sal,

Those aren't metaphors. The people who take the "disorder" view believe that entropy is an *actual* measure of *actual* disorder. The people who take the "energy dispersal" view, including Lambert and "Dr. Mike", believe that entropy is an *actual* measure of the *actual* dispersal of *actual* energy. The people who take the "missing information" view, including me, hold entropy to be an *actual* measure of the *actual* information gap between the macrostate and the microstate. They aren't metaphors; they are *characterizations* of entropy.

The first two characterizations are incorrect. Entropy is not a measure of disorder, and it is not a measure of energy dispersal. It *is* a measure of missing information.

You rejected the "entropy as disorder" view in your OP, using words like "Gag!" and "Choke!", and urged us to embrace the "energy dispersal" view instead, not realizing that the latter was mistaken. In other words, you unwittingly urged us to replace one misconception with another. Don't you see the irony, particularly when there is a third alternative — the missing information view — that is actually correct?

Sal,

Entropy *is* an objective property of the macrostate, but the macrostate itself is observer-dependent. To the extent that their macrostates match, people will agree on the entropy value. To the extent that their macrostates mismatch, they will disagree.

When scientists agree on the macrostate, as they do when they compile a table of standard molar entropies, for instance, they will get the same entropy values.

walto,

No, not at all. Disorder and uncertainty are separate concepts.

Entropy is a measure of uncertainty (or equivalently, of missing information). It is not a measure of disorder.

And what does disorder mean?

I’m all mixed up!

walto,

LMGTFY

walto, today:

walto, Tuesday:

You have a lot of confusion to sort through, walto.

500 coins all heads up on a table is more ordered than 500 coins tossed in the air. I think this is one of those things Sal is trying to teach budding young IDists.

In physics, the terms order and disorder designate the presence or absence of some symmetry or correlation in a many-particle system.

In condensed matter physics, systems typically are ordered at low temperatures; upon heating, they undergo one or several phase transitions into less ordered states.

Examples for such an order-disorder transition are:

the melting of ice: solid–liquid transition

https://en.wikipedia.org/wiki/Order_and_disorder

I just had to laugh at that one.

Based on your last two posts you understand even less than I thought. And that wasn’t much to begin with!

Yes, you are extremely confused. But I’m tired of trying to sort through this for you.

Which is it, walto?

walto, today:

walto, Tuesday:

Right. The absence of "some symmetry or correlation." Exactly. That's what would allow for the determination of the exact microstate. Explain this to keiths, will you? I'm too tired.

That you think there’s a contradiction there is exactly what you’re missing. It’s funny. But as I told both you and mung, I’m too tired to spend any more time with your silliness.

walto,

Here’s a hint: Both of your statements are false.

For Salvador, here’s how professionals count:

https://www.probabilitycourse.com/chapter2/2_1_0_counting.php

keiths:

Mung:

Sure you did, just yesterday:

Mung:

And then you contradicted yourself, claiming that you were right all along, as I just showed. You’re making stuff up as you go, trying to hide your mistakes.

Your childishness is tiring and a waste of time, Mung. You made a mistake. I pointed it out. Deal with it and move on.

Mung, quoting Wikipedia:

walto:

No, walto. The presence or absence of symmetry or correlation is *not* what allows for the determination of the exact microstate.

Flip 500 distinguishable coins randomly, without looking at the result. The entropy — the amount of additional information needed to determine the exact microstate — is the same regardless of how the coins land. There are 2^500 possible microstates, each with the same epistemic probability. The entropy is therefore

S = log2 2^500 = 500 bits

…and that remains true whether the microstate is highly symmetric and correlated — say, all heads — or a jumbled and disordered mix of heads and tails.
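The arithmetic is trivial to confirm (a sketch of my own in Python):

```python
import math

n = 500                  # distinguishable fair coins
microstates = 2 ** n     # every heads/tails sequence is one microstate

# With no information about the outcome, all microstates are equally
# probable, so the missing information is log2 of their count:
S = math.log2(microstates)
print(S)  # 500.0 bits

# Nothing here depends on WHICH microstate actually obtains;
# "all heads" is just one of the 2^500 equally likely possibilities.
```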

walto, Sal, Mung:

Here’s why entropy is observer-dependent:

If you disagree, then let’s hear your analysis of my earlier scenario involving a pair of dice:

A blindfolded walto randomly throws a pair of fair dice, one red and one green. The exact microstate is some ordered pair (r, g), where r is the number on the red die and g is the number on the green.

Like walto, I am blindfolded. Neither of us has any information about the result of the dice throw.

An honest, accurate witness whispers to walto that the sum of the dice is greater than eight. To me she whispers that the sum of the dice is greater than ten.

For walto, the ten possible microstates are

(3,6)

(4,5), (4,6)

(5,4), (5,5), (5,6)

(6,3), (6,4), (6,5), (6,6)

For me, the three possible microstates are

(5,6)

(6,5), (6,6)

We have different macrostates, but both are correct. It is *true* that the sum is greater than eight, and it is also *true* that the sum is greater than ten.

Based on his correct macrostate, walto computes an entropy of

S = log2 10 ≈ 3.32 bits

Based on my correct macrostate, I compute an entropy of

S = log2 3 ≈ 1.58 bits

Both of us are correct. The macrostates are observer-dependent, and so is the entropy.
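A quick enumeration (my own sketch; the variable names are mine) confirms the microstate counts and the two entropy values:

```python
import math

# All 36 equally probable microstates (r, g) for a red and a green die.
microstates = [(r, g) for r in range(1, 7) for g in range(1, 7)]

walto_states = [m for m in microstates if sum(m) > 8]    # told "sum > 8"
keiths_states = [m for m in microstates if sum(m) > 10]  # told "sum > 10"

print(len(walto_states), math.log2(len(walto_states)))    # 10, ~3.32 bits
print(len(keiths_states), math.log2(len(keiths_states)))  # 3, ~1.58 bits
```

Same throw, same dice, two correct macrostates, two different entropy values.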

Probability theory as logic shows how two persons, given the same information, may have their opinions driven in opposite directions by it, and what must be done to avoid this.

– E.T. Jaynes

An argument which makes it clear intuitively *why* a result is correct is actually more trustworthy, and more likely of a permanent place in science, than is one that makes a great overt show of mathematical rigor unaccompanied by understanding.

– E.T. Jaynes

Indeed. Math is just modelling.

walto:

In this scenario, what is the “one correct macrostate”? What are the associated microstates?

I’d love to answer your questions. I’d also love for you to answer mine. What do you say? Seems fair to me.

What is her macrostate? Is it also the correct macrostate?

ETA: It seems to me that in this scenario we only have one observer. What exactly are you and walto observing?

Macrostate : Microstates : Haw Entropy

(1,1) : (1,1) : 0

(1,2) : (1,2) : 0

(1,3) : (1,3) : 0

(1,4) : (1,4) : 0

(1,5) : (1,5) : 0

(1,6) : (1,6) : 0

(2,1) : (2,1) : 0

(2,2) : (2,2) : 0

(2,3) : (2,3) : 0

(2,4) : (2,4) : 0

(2,5) : (2,5) : 0

(2,6) : (2,6) : 0

(3,1) : (3,1) : 0

(3,2) : (3,2) : 0

(3,3) : (3,3) : 0

(3,4) : (3,4) : 0

(3,5) : (3,5) : 0

(3,6) : (3,6) : 0

(4,1) : (4,1) : 0

(4,2) : (4,2) : 0

(4,3) : (4,3) : 0

(4,4) : (4,4) : 0

(4,5) : (4,5) : 0

(4,6) : (4,6) : 0

(5,1) : (5,1) : 0

(5,2) : (5,2) : 0

(5,3) : (5,3) : 0

(5,4) : (5,4) : 0

(5,5) : (5,5) : 0

(5,6) : (5,6) : 0

(6,1) : (6,1) : 0

(6,2) : (6,2) : 0

(6,3) : (6,3) : 0

(6,4) : (6,4) : 0

(6,5) : (6,5) : 0

(6,6) : (6,6) : 0

Average Haw Entropy = 0

There’s something wrong with this picture.

What would the average entropy be for keiths and walto?
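The answer depends on the macrostate scheme, which the thread never pins down. Under one illustrative scheme of my own choosing (the witness always reports the exact sum), the average missing information works out to about 1.9 bits:

```python
import math
from collections import Counter

# Macrostate scheme (my assumption): the observer learns only the SUM.
microstates = [(r, g) for r in range(1, 7) for g in range(1, 7)]
count_by_sum = Counter(sum(m) for m in microstates)

# Average entropy = sum over sums s of p(s) * log2(number of microstates
# consistent with s), with p(s) = count(s)/36 since throws are equiprobable.
avg_S = sum((n / 36) * math.log2(n) for n in count_by_sum.values())
print(round(avg_S, 3))  # ~1.896 bits
```

A coarser macrostate scheme (such as "sum > 8" vs. the rest) would give a different average, which is exactly the observer-dependence at issue.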

Mung,

The journey needs to be guided by someone who knows the territory, not someone who is lost. You are confused about entropy, and I’m trying to lead you to understanding.

When you ask questions that I judge to be useful and relevant, I’ll answer them. If you challenge my position with an actual argument, I’ll respond. Otherwise, no, I’m not interested in answering every question you raise. There’s a reason the teacher sets the syllabus, not the student.

keiths:

Mung:

She knows the exact microstate, whatever it happens to be.

There is no such thing as “the correct macrostate”. That’s walto’s mistake.

Her macrostate is *a* correct macrostate, not *the* correct macrostate.

Since she knows the exact microstate, only one microstate is epistemically possible for her. She sees an entropy of

S = log2 1 = 0 bits

Walto and I are observing the system. It’s just that we’re observing it indirectly via the witness. From our perspective, she is an indirect source of information about the system, just as a thermometer might be an indirect source of information in a thermodynamic scenario.

She is also an observer in her own right, of course.

What a load of utter bullshit, oh wise guru. Keep leading him!

walto:

In this scenario, what is the “one correct macrostate”? What are the associated microstates?

walto,

Your position makes no sense. If the “one correct macrostate” were the one corresponding to a single microstate — the actual microstate of the system — then the entropy would always be zero.

And so it is.

My answer is the same as walto’s. The correct macrostate is the one observed by your “honest, accurate witness.” After all, she must have some basis in fact for telling walto that the sum is greater than eight and telling you that the sum is greater than ten, and this can only come from knowing the precise macrostate.

So according to you, entropy is always zero, rendering it useless.

That's a hint that there's something *very* wrong with your understanding of entropy.

Entropy is a measure of missing information, remember? If she's got all of it, there's none missing.

walto,

Right. She isn’t missing any information about the microstate, so the entropy is zero for her. She knows the exact microstate, so

S = log2 (1) = 0 bits

I'm missing *some* information about the microstate. I know that the sum is greater than ten, so I can narrow things down to three possible microstates, but I don't know which of the three is the actual microstate. I see an entropy of

S = log2 (3) ≈ 1.58 bits

You are missing the most information about the microstate. You know only that the sum of the dice is greater than eight. So for you, the entropy is higher than for me — you can only narrow things down to ten possible microstates. You see an entropy of

S = log 2 (10) ≈ 3.32 bits

All three macrostates are correct, and all three entropy values are correct. They differ because each of us is missing a different amount of information about the exact microstate.

Macrostates are observer-dependent because the missing information can differ from observer to observer. Entropy is a function of the macrostate, so it too is observer-dependent.

Your mistake is in thinking that there is “one correct macrostate”. There isn’t.

The witness knows the exact microstate, and she’s right about that. I know that the sum is greater than ten, and I’m right about that. You know that the sum is greater than eight, and you’re right about that. All three macrostates are correct, and so are the corresponding entropy values.

Macrostates are observer-dependent, and therefore so is entropy.

One problem is that no one knows the precise one and only correct microstate from the macrostate, not even an honest, accurate witness.

How does Ms. Haw know all the possible macrostates, given that she only knows the exact microstate? Who told her that dice sums were legitimate macrostates? Why are dice sums the only legitimate macrostates?

Mung,

Sure they do. Any normally-sighted person can discern the microstate in this case. All you have to do is read the numbers off the dice and distinguish red from green. Any person capable of doing so can be a Damon in the world of dice entropy, in other words. If I removed my blindfold I could do so as well, and so could walto, assuming that he is capable of reading dice numbers and distinguishing red from green.

She doesn’t need to know all the possible macrostates, and in any case I am assuming that she’s smarter than you.

Again, I’m assuming that she’s smarter than you and was able to figure that out on her own.

They aren’t, as I’ve already explained.

Yes, there is.

Sure, if you’re not wearing a blindfold.

keiths:

walto:

If you believe that, then you don’t grasp the concept of a macrostate.

To understand entropy, you need to understand macrostates.

To understand entropy, you need to understand knowledge.

To understand knowledge you need to understand truth. 😉