The basic biochemistry textbook I study from is *Lehninger Principles of Biochemistry*. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M., Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system,

entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto

H. Robert Horton, North Carolina State University

K. Gray Scrimgeour, University of Toronto

Marc D. Perry, University of Toronto

Choke!

Thankfully Dan Styer almost comes to the rescue as he criticized ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

…

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?

Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (i.e., small shoe size suggests we’re dealing with a toddler, and therefore a non-reader). Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga

2lot trouble

Hear, hear.

But to single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:

EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy

…..

Lehninger and Moran’s books aren’t on that list. Their books, however, did make the list of biochem books judged by some Australians to be decreasing in fitness:

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.

..

Lehninger, Nelson and Cox

….

Garrett and Grisham

…

Moran and Scrimgeour

….

Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

…

A few creationists got it right, like Gange:

Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where

S = entropy

k = Boltzmann’s constant

W = number of microstates
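The Boltzmann relation is easy to play with numerically. A minimal Python sketch (using the CODATA value of k), showing that doubling the number of microstates adds exactly k ln 2 of entropy, i.e. one bit’s worth:

```python
import math

kB = 1.380649e-23  # Boltzmann's constant, J/K (CODATA)

def boltzmann_entropy(W: float) -> float:
    """S = kB * ln(W) for W equally probable microstates, in J/K."""
    return kB * math.log(W)

# Doubling the number of accessible microstates adds exactly kB*ln(2):
print(boltzmann_entropy(2))                          # kB*ln(2), about 9.57e-24 J/K
print(boltzmann_entropy(4) - boltzmann_entropy(2))   # the same increment again
```

For real thermodynamic systems W is astronomically large, so in practice one works with ln W (or with Clausius measurements) rather than W itself.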

Also there is Clausius:

delta-S = Integral (dQ/T)

where

delta-S = change in entropy

dQ = inexact differential of Q (heat)

T = absolute temperature

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

Mung,

No they don’t, and they don’t need to.

Let r be the number on the red die and g the number on the green. The following are all legitimate macrostates:

1. r and g are the last two digits, in order, of the year in which the Japanese surrendered to the US at the end of WWII.

2. g is prime.

3. r raised to the g is exactly 32.

4. g minus r is exactly one.

Suppose I don’t know how honest my friend is, or whether she has a good vantage and good light when she divulges this apparent information to me. Or maybe she’s tired or tired of this game. What are the exact macrostates in those cases? Is there no exact macrostate in all those cases? In fact, as macrostates seem to be a function of knowledge in your view, is there never an exact macrostate for one who doesn’t believe people ever know anything but can only sometimes know* them? (And does it matter that “know*” can’t actually be defined without ellipses, and nobody is in a position to tell whether they ever actually know* anything, but can only sometimes know* that they don’t know things?)
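walto’s macrostates lend themselves to direct counting. A hypothetical sketch (assuming fair six-sided dice, and taking 1945 as the surrender year, so r = 4 and g = 5) that counts the microstates compatible with each macrostate and the corresponding missing information in bits:

```python
import math
from itertools import product

microstates = list(product(range(1, 7), repeat=2))  # all 36 (r, g) pairs

# walto's macrostates expressed as predicates on (r, g):
macrostates = {
    "last two digits of 1945": lambda r, g: (r, g) == (4, 5),
    "g is prime":              lambda r, g: g in (2, 3, 5),
    "r**g == 32":              lambda r, g: r ** g == 32,   # only (2, 5)
    "g - r == 1":              lambda r, g: g - r == 1,
}

for name, pred in macrostates.items():
    W = sum(1 for r, g in microstates if pred(r, g))
    # With equiprobable microstates, missing information = log2(W) bits
    print(f"{name}: W = {W}, missing information = {math.log2(W):.3f} bits")
```

The first and third macrostates pin down a unique microstate (zero bits missing), while “g is prime” still leaves 18 candidates.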

walto,

Did you miss the “accurate and honest” part? You even quoted it:

Yeah I noticed that. Now answer my questions.

Learn to read for comprehension, walto.

I’m comprehending now that you can’t answer my questions.

“If a tree falls in a forest and no one is around to hear it, does it make a sound?”. Perhaps macrostates are simply a human mathematical construct and only microstates exist.

I think they supervene on microstates. But that doesn’t entail that they change when we’re drunk or sleepy — any more than, say, apples do when they look blurry.

Just because entropy is a measure of paucity of knowledge doesn’t mean macrostates exist only in the mind. What we KNOW about them exists only in minds.

When trees fall in the forest, they make sounds. You can leave your phone there to record them if you like. Making a sound isn’t the same thing as being heard.

FWIW,

Good examples of the main macrostate variables of interest are:

Temperature

Volume

Quantity of Substance

Changes in temperature are used in calorimetry to determine heat flow, from which we determine dQ (heat transferred into or out of a system).

So those are the 3 major macrostate variables for thermodynamics. There are probably others, but many of the examples I worked out in this discussion use only those three, as illustrated in the orange cells of my Sackur-Tetrode Excel spreadsheet:

http://www.creationevolutionuniversity.org/public_blogs/skepticalzone/absolute_entropy_helium.xls
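The spreadsheet may not be available to every reader, so here is a hedged Python sketch of the Sackur-Tetrode calculation it describes: the absolute molar entropy of helium computed from exactly those macrostate variables (temperature, pressure fixing the volume per particle, and quantity of substance):

```python
import math

# Physical constants (SI, CODATA)
h  = 6.62607015e-34   # Planck constant, J*s
kB = 1.380649e-23     # Boltzmann constant, J/K
NA = 6.02214076e23    # Avogadro's number, 1/mol

def sackur_tetrode_molar(M_kg_per_mol: float, T: float, P: float) -> float:
    """Absolute molar entropy of a monatomic ideal gas, J/(mol*K),
    from the Sackur-Tetrode equation: S/NkB = ln(V/(N*lambda^3)) + 5/2."""
    m = M_kg_per_mol / NA                           # mass of one atom, kg
    lam = h / math.sqrt(2 * math.pi * m * kB * T)   # thermal de Broglie wavelength, m
    v_per_particle = kB * T / P                     # volume per atom, m^3
    return NA * kB * (math.log(v_per_particle / lam ** 3) + 2.5)

S_He = sackur_tetrode_molar(4.0026e-3, 298.15, 1.0e5)
print(f"S(He, 298.15 K, 1 bar) = {S_He:.2f} J/(mol*K)")  # about 126 J/(mol*K)
```

The result lands very close to the tabulated standard molar entropy of helium (126.15 J/(mol·K)), which is the usual check on the equation.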

Incidentally, I think all the criticism of the Clausius notion of entropy is ridiculous. Much of real-world entropy measurement is made in terms of the macrostate variables used by Clausius:

Temperature

Quantity of Substance

Volume

It’s not made via direct counting of microstates or “missing information” meters! We usually infer the number of microstates or “missing information” by taking the 3 macrostate variables and computing the number of microstates or “missing information.”

As I said, the “missing information” definition is not of much practical use. The computation of the number of microstates is useful for estimating molecular structure based on macrostate data — an example was Pauling using macrostate data to determine the approximate molecular structure of ice. But one doesn’t usually calculate microstates first, because one needs to know molecular structure in advance to do the computation, and that structure is usually what’s in question, not the measured Clausius entropy!
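Pauling’s ice example is compact enough to sketch. His counting argument gives roughly W = (3/2)^N hydrogen-bond configurations for N water molecules, hence a residual molar entropy of R ln(3/2):

```python
import math

R = 8.314462618  # gas constant, J/(mol*K)

# Pauling's estimate: 3/2 allowed proton configurations per water molecule
# in ice, so the residual molar entropy is S = R * ln(3/2).
S_residual = R * math.log(1.5)
print(f"Residual entropy of ice: {S_residual:.2f} J/(mol*K)")
```

The computed value, about 3.37 J/(mol·K), agrees with the discrepancy between calorimetric and spectroscopic entropies of water (roughly 3.4 J/(mol·K)), which is exactly the macrostate data Pauling was explaining.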

Sal,

The “entropy as missing information” view is correct. The “entropy as disorder” and “entropy as energy dispersal” views are incorrect. They don’t work.

The choice is easy, unless you’re Sal.

On Mung’s thread, Patrick writes:

Mike asks:

No one here believes that entropy is a measure of disorder, as far as I can tell. Regarding information, the answer is simple, and it’s the same one I’ve been giving throughout the thread: Entropy is a measure of missing information — the gap between what the macrostate tells you and what is needed to pin down the exact microstate.

The problems with the “entropy as energy dispersal” view have already been pointed out.

And there you have it, patrick! Give him a harder one!

keiths,

Can you challenge Sal’s argument that the missing information definition is not of much practical use?

In one of the few times I un-ignored them by logging out, I ventured to look at Keiths’ and Mung’s latest drivel.

I never said the “missing information” definition is inherently wrong, it’s just about next to useless.

How does Keiths calculate the entropy of a melting ice cube without using those equations NOT framed in terms of missing information — like, eh, dQ/T?

Keiths is relegated to using equations like

delta-S = Integral (dQ/T)

that have no regard for information theory in order for Keiths to calculate his missing information. Too funny. He proves again the near total lack of utility of this missing information definition.

I was one of the first at TSZ to explicitly and formally relate information-theory Shannon bits to Joules/Kelvin. If bits were useful, they’d be used as a measure of entropy in industry, but they are not. Entropies are usually given in Joules/Kelvin to reflect the practical importance of the Clausius definition, not theoretical abstractions.
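The bits-to-Joules/Kelvin relation alluded to here is just a constant conversion factor, k ln 2 of thermodynamic entropy per bit. A minimal sketch:

```python
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def jk_to_bits(S_jk: float) -> float:
    """Convert thermodynamic entropy (J/K) to Shannon bits: S / (kB * ln 2)."""
    return S_jk / (kB * math.log(2))

# One bit of missing information is a tiny amount of thermodynamic entropy:
one_bit = kB * math.log(2)
print(f"1 bit = {one_bit:.3e} J/K")                    # about 9.57e-24 J/K
print(f"24.39 J/K = {jk_to_bits(24.39):.3e} bits")     # about 2.5e24 bits
```

The second line converts the melting-ice-cube entropy discussed later in the thread, illustrating why industry reports J/K rather than multi-septillion bit counts.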

The Boltzmann definition is important because it allows Clausius measurements to give possible insight into atomic and molecular details (like Pauling deducing the molecular details of ice).

Let Keiths cite a major industrial application in chemistry or material science or protein science that uses his “missing information” definition and does the calculation in bits. In contrast I could cite thousands of papers that are rooted in the Clausius definition (aka energy dispersal).

How about Keiths show where the Gibbs free energy equation is used by chemists with the “missing information” definition. Absurdity!

How about Keiths start with college engineering, chemistry, and physics textbooks.

Let Mung calculate the entropy of a melting ice cube using his missing information approach instead of the Clausius approach he disparages.

Keiths is relegated to using equations that don’t have regard to “missing information” in order to calculate the “missing information” he’s so fixated on. Too funny.

Despite being told this repeatedly, he can’t seem to realize this is testament to the fact that “missing information” is a superfluous and gratuitous add-on of little utility, except to give Keiths and Mung excuses to bloviate about stuff they can’t even calculate from their own definitions.

Calculate the entropy of melting ice in such a way that shows the necessity of information theory. Can’t be done, because the counter example of

dS = dQ/T

suffices, and it has no need of information theory. Hence, information theory is not absolutely essential to define entropy in practical applications.

colewd,

Sure. Sal’s making the same mistake he’s been making throughout the thread.

How you define entropy does not determine how you measure it. Those are separate questions. Sal’s babbling about “missing information meters” is just that — babbling.

You’ll notice that he keeps trying to portray this as some kind of contest between Clausius and Boltzmann/Gibbs entropies. It’s not. As Sal himself has pointed out, they’re linked. Statistical mechanics provides the theoretical underpinning that classical thermodynamics lacks. Statistical mechanics explains Clausius entropy.

colewd,

Why are you afraid to quote this comment to Salvador?

You don’t have much confidence in him, do you?

Clausius defined entropy through:

dS = dQ/T

He also was the one who coined the word “entropy”, not Shannon, not Boltzmann!

You got a problem with that?

That definition doesn’t use “missing information” notions. Keiths can’t get it through his head that one definition (Clausius) doesn’t exist to the exclusion of other definitions (Boltzmann, Shannon, etc.).

Sal,

Have you taken me off ‘ignore’? 🙂

So Keiths, show the readers how you calculate the “missing information” entropy of a 20 gram melting ice cube and get

24.39 J/K

What? You have to use the

dS = dQ/T

definition which has no reference to your “missing information” definition, in order to compute your “missing information” entropy? Too funny.

Like colewd pointed out, you have yet to prove the practical utility of the “missing information” definition vs. the Old School definition in terms of heat and temperature, as in:

dS = dQ/T

where

S = entropy

Q = heat

T = temperature

Clausius coined the word “entropy”, so don’t be too quick to diss his definition, especially since the Keiths approach can’t answer trivial entropy problems without resorting first to the Clausius definition.
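For reference, the Clausius calculation for the 20 gram ice cube is a one-liner, since melting is isothermal and the integral collapses. A sketch assuming the tabulated latent heat of fusion of water (which is why the result lands near, not exactly on, the 24.39 J/K figure quoted above):

```python
# Entropy of melting for a 20 g ice cube via the Clausius definition,
# dS = dQ/T. Melting is isothermal at T = 273.15 K, so
# delta_S = Q / T = m * L_f / T.
m   = 20.0      # mass of ice, g
L_f = 333.55    # latent heat of fusion of water, J/g (tabulated value)
T   = 273.15    # melting point, K

delta_S = m * L_f / T
print(f"delta_S = {delta_S:.2f} J/K")  # about 24.4 J/K
```

A slightly smaller tabulated L_f (333 J/g) reproduces 24.39 J/K almost exactly.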

Sal:

Science progresses, Sal. We now understand that temperature corresponds to average kinetic energy, and that entropy is a measure of missing information. You yourself have admitted that the “missing information” interpretation is correct.

All that’s left is for you to acknowledge that the “energy dispersal” view of entropy is incorrect.

Poor Salvador has absolutely no answers, not even ones that ought to be very easy for him. He should go back to Ignoring us.

That’s the outcome of the experiment. I would not call it a microstate, though I might be willing to take the sum of the two dice and call that a macrostate. But I don’t think it matters, because it’s not relevant to calculating the entropy.

Patrick asks:

My position is not that entropy is subjective, but rather that it is observer-dependent. It’s a measure of the gap between the macrostate — what an observer knows about the exact state of the system — and what would be required to pin down the exact microstate.

Can you think of any scenario, including Mike’s, in which entropy is not a measure of the missing information as described above? If so, please present it along with an explanation of why you think the missing information interpretation cannot successfully be applied to it.

Heck, for me, it’s as if the dice haven’t even been tossed at all, right?

I disagree. I gave you the value I think is correct for the entropy. The entropy is 3.27 bits/symbol.

There are 36 ways to arrange the two dice, but that doesn’t make the probability distribution uniform, and the entropy is calculated from the distribution.
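The two candidate distributions can be compared directly. A quick sketch computing the entropy over the 36 equiprobable microstates versus the entropy over the non-uniform distribution of sums:

```python
import math
from collections import Counter
from itertools import product

# All 36 equiprobable microstates (r, g) of two fair dice
microstates = list(product(range(1, 7), repeat=2))

# Entropy over microstates: uniform, so simply log2(36)
H_micro = math.log2(len(microstates))

# Entropy over the *sums* (the distribution Mung is using): non-uniform
sum_counts = Counter(r + g for r, g in microstates)
H_sum = -sum((n / 36) * math.log2(n / 36) for n in sum_counts.values())

print(f"H over microstates = {H_micro:.2f} bits")  # 5.17 bits
print(f"H over sums        = {H_sum:.2f} bits")    # 3.27 bits
```

Both numbers are correct Shannon entropies; they simply answer different questions, which is the crux of the disagreement in this exchange.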

Mung,

Sure it does. If the dice are fair and thrown randomly, as we’ve specified, then the 36 microstates are equiprobable.

keiths:

Mung:

Why not?

keiths:

Mung:

You had it right originally, but you changed your answer.

What you’re quoting there — the 3.27 bits/symbol number — is the average number of bits per symbol for a code where each symbol represents the sum of two fair dice thrown together.

keiths,

I understand Sal’s arguments, however I am struggling with yours. I understand defining entropy as missing information, however I still have no idea what you do with the understanding of how much missing information is there from an observer’s point of view. I keep asking because it may be just my lack of understanding.

colewd:

We use it in all of the ways we currently use entropy. Nothing changes except our understanding of what entropy actually is. It isn’t a measure of disorder, and it isn’t a measure of energy dispersal. It’s a measure of missing information, which means that it can vary from observer to observer. The Second Law still holds, the world continues to turn, everything is fine — we’ve just jettisoned an incorrect understanding of entropy and replaced it with an accurate one.

Because we’ve already agreed there are 36 ‘microstates’, not one. Tossing a pair of dice does not change the probability of rolling a seven from 6/36 to 36/36.

Why not call “there’s a six on the red die and a three on the green die” the macrostate?

By the way, in the game of craps, the dice are the same color, and the color doesn’t matter when calculating the odds, the payoffs, or the expected value from any given wager. But you know that, I think.

🙂

Yes. Exactly. Thank you for listening.

I know the figure and I know how to calculate it and I know why I am performing the calculations. And that’s how I am deciding to calculate the entropy. The Shannon entropy.

Why am I wrong?

I don’t understand Salvador’s arguments, and I am a bona fide genius.

Salvador says:

Yeah, I’ve been pointing out the relevance of the macrostate variables throughout this thread. Somehow Salvador manages to turn that into a claim that there is “criticism of the Clausius notion of entropy” going on.

I don’t understand that argument. Do you?


Yes, and (red=6, green=3) is one of them. It’s a microstate.

This is not that difficult, Mung.

Mung,

Because you’ve been cribbing from a source that you don’t understand.

As I said:

We are dealing with a situation where two microstates — say, (red=5, green=2) and (red=3, green=4) — are distinct even if they sum up to the same value — in this case 7.

Oh good. So you won’t be telling me it is THE microstate. It’s one of 36 possible microstates and that doesn’t change just because I tossed a pair of dice and someone else observed the outcome.

The entropy (uncertainty) associated with the outcome of the toss of a pair of dice has not changed just because a pair of dice was tossed.

By the way, why not try to teach your concept of entropy from the toss of a single coin or the toss of a single die, or some other scenario where all outcomes are equally likely?

Sigh.

Read it again, Mung:

I’m sorry, but that answer just doesn’t cut it. You seem to agree with the value I arrived at and the process of reasoning I used to arrive at that value. So all that’s left to dispute is whether the value given is the Shannon entropy.

For that you need to make an argument.

You could start by contributing your own code and tests for calculating Shannon entropy in the Dice Entropy thread, or at least explaining there why mine is wrong.

I understand the requirements well enough to code them. You need to explain why my code does not calculate Shannon entropy.

Mung,

Right. What reduces the entropy is not that the dice have been tossed, but rather that your friend has given you feedback about the microstate. For example, if she (accurately and honestly) tells you that the number on the red die is greater than the number on the green die, then you have gained information about the actual microstate, because some of the 36 microstates have been ruled out. It is this gain in information — a reduction in the missing information — that constitutes a reduction in the entropy.
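This reduction can be made concrete with a short sketch: the report that red exceeds green rules out 21 of the 36 microstates, and the missing information drops accordingly:

```python
import math
from itertools import product

microstates = [(r, g) for r, g in product(range(1, 7), repeat=2)]

# Before any feedback: 36 equiprobable microstates
H_before = math.log2(len(microstates))                 # about 5.17 bits

# Friend (accurately and honestly) reports r > g: only these survive
remaining = [(r, g) for r, g in microstates if r > g]
H_after = math.log2(len(remaining))                    # log2(15), about 3.91 bits

info_gained = H_before - H_after                       # about 1.26 bits
print(f"{len(remaining)} microstates remain; entropy drops by {info_gained:.2f} bits")
```

The gain in information and the drop in entropy are the same number, which is the point being made above.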

Um, that’s what I’m doing. When you randomly toss two fair dice, each of the 36 microstates is equally likely. Each has a probability of 1/36.

Slow down and think about this for a while, Mung.

Yes, if the ‘microstates’ were not distinct they would not be distinct “microstates”. And the ‘microstates’ that do sum up to the same value are not the same, just as in actual thermodynamics not all microstates are equally probable.

You need to find a reason to reject the probability distribution I have chosen based upon the information I have. For example, I know that 5+2=7 and I know that 3+4=7, and I know that there are four other combinations of a pair of dice that can sum to 7 ({1,6},{6,1},{2,5},{4,3}).

By the way, my version of the entropy of a pair of dice isn’t observer-dependent. Is that why you object to it so much?

LoL. What you are telling me is that I need to recalculate the entropy when someone else tells me what they observed.

Slow down and think about this for a while, Keith.

Was my initial calculation wrong? Was the observation I used to calculate the entropy wrong? Was Xavier wrong? Was Yolanda wrong? Was Damon wrong?

None of them were wrong.

Seriously keiths, do you not see the irony?

Mung,

Entropy is a measure of missing information, so of course it changes when you gain relevant information. The missing information is reduced, and therefore so is entropy.

Mung,

Think, Mung. The fact that entropy is observer-dependent does not mean that it cannot be miscalculated.

So all sums of a pair of dice are not equally probable, and not everyone knows this. But the sums of a pair of dice are available to all (and objective), and that the sums are not all equally probable is apparent to anyone who has played the game of craps (objective).

So why is someone who does not know that all sums of a pair of dice are not equally probable wrong when they calculate the entropy?

My initial calculation of the entropy was based on my lack of information regarding whether the values could be summed. Silly me. Was I wrong?

That the values can be summed hardly seems to be observer-dependent, and what those sums are hardly seems to be observer-dependent. Nor is the probability distribution observer-dependent.

I gained more information, and my calculation of the entropy was reduced.

So why was my new calculation of the entropy “wrong” when my initial calculation of the entropy was “right”?

Xavier, Yolanda, Mung, Damien.

Yolanda, Xavier and Damien all managed to be “not wrong” when they calculated entropy. You need to present an argument for why my calculation is wrong.

So? It’s based on the available information. How is it a miscalculation?

Mung,

There are 36 possible microstates. Each microstate is an ordered pair of two numbers, r and g, where r is the number on the red die and g is the number on the green die.

(3,4) is a microstate. 7 — the sum of r and g — is not a microstate.

To calculate the entropy, you need an epistemic probability distribution over the microstates. Instead, you are using an epistemic probability distribution over the sums. You are solving the wrong problem.

I never claimed that the sum is a microstate.

There are 36 possible ‘microstates’. All 36 possible ‘microstates’ are equally probable. But the sums of the values are not equally probable. Every craps player knows this.

Given that we can calculate the sums we can also determine the probability distribution. Given that we can determine the probability distribution we can calculate the entropy.

This leaves you off somewhere in no man’s land.

My position is objective, and is not “observer-dependent.”

Mung,

If you want to calculate the entropy, you need an epistemic probability distribution over the microstates, not over the sums.

If you can’t understand macrostates, microstates, and the roles that they play, you are never going to get this.

But not to the exclusion of other definitions like you’re doing. In fact, it is highly important that entropy can be defined in disparate ways. There are some classic analogies of disparate definitions of the same thing.

e.g.

Energy can be equated with mass times the speed of light squared:

E= mc^2

or optics related to electricity and magnetism through Maxwell’s Equations:

https://en.wikipedia.org/wiki/Maxwell%27s_equations

Or entropy defined in terms of heat and temperature (Clausius) related to the motion of molecules and the possible microstates they can be found in (Boltzmann/Gibbs):

delta-S = Integral (dQ/T) = kB (ln W_final – ln W_initial)

The above relation has a lot of physical significance, but tacking on Shannon:

I = delta-S / (kB ln(2)) = (ln W_final – ln W_initial) / ln(2)

is a superfluous transformation to a different log base divided by a constant. It is little more than a conversion factor applied to Boltzmann’s original equation.

Worse, your lack of responses has demonstrated you can’t even calculate “missing information” most of the time without resorting to Clausius:

dS = dQ/T

despite your boasting that science has moved on. Indeed science has moved on, so it is all the more amazing that the Clausius definition is still essential to computing the Boltzmann and Shannon microstate counts W in most cases.

You’ve yet to show you can compute Shannon entropy for a simple system like a melting ice cube. That’s because your approach has next to no practical utility, like I said. If it did, you’d have provided a worked-out example using your approach rather than repeatedly embarrassing yourself with an inability to calculate “missing information” without first resorting to the Clausius definition, which has no regard for your definition of entropy in terms of “missing information”.

Go ahead, show a practical computation of the entropy of a melting ice cube, a 1555 gram block of copper, the standard molar entropies of common chemical substances, the Gibbs free energy, the absolute entropy of a gas, etc. using your “missing information” approach. Many of these I worked out through energy dispersal, which is formally:

dS = dQ/T

which is more than I can say for what you and Mung and DNA_jock have done in this thread after almost 1,500 comments and counting.
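For comparison, a hypothetical Clausius-style calculation for the 1555 gram copper block. The specific heat and the temperature interval here are illustrative assumptions, not figures from the thread:

```python
import math

# Entropy change of a 1555 g copper block heated from T1 to T2,
# via dQ = m*c*dT, so delta_S = Integral(m*c*dT/T) = m*c*ln(T2/T1).
m = 1555.0       # mass, g
c = 0.385        # specific heat of copper near room temperature, J/(g*K)
T1, T2 = 298.15, 373.15  # assumed heating interval, K

delta_S = m * c * math.log(T2 / T1)
print(f"delta_S = {delta_S:.1f} J/K")
```

Treating c as constant over this range is the usual textbook simplification; a more careful calculation would integrate a temperature-dependent c(T).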