The basic biochemistry textbook I study from is *Lehninger Principles of Biochemistry*. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy.

– Nelson, David L.; Cox, Michael M., Lehninger Principles of Biochemistry (p. 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.

entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto

H. Robert Horton, North Carolina State University

K. Gray Scrimgeour, University of Toronto

Marc D. Perry, University of Toronto

Choke!

Thankfully Dan Styer almost comes to the rescue as he criticized ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

…

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?

Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (small shoe size suggests a toddler, hence a non-reader), but shoe size is not reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga

2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:

EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy

…..

Lehninger and Moran’s books aren’t on that list. Their books however did make the list of biochem books judged by some Australians as decreasing in fitness:

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.

..

Lehninger, Nelson and Cox

….

Garrett and Grisham

…

Moran and Scrimgeour

….

Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

…

A few creationists got it right, like Gange:

Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where

S = entropy

k = Boltzmann’s constant

W = number of microstates
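As a quick sketch (my illustration, not part of the original discussion), the Planck form is trivial to evaluate numerically; note that the “log” in S = k log W is the natural logarithm:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """S = k log W, where W is the number of microstates and log is the natural log."""
    return k_B * math.log(W)

# One microstate (W = 1) means zero entropy: a perfectly known system.
s_single = boltzmann_entropy(1)

# Doubling W adds exactly k ln 2 of entropy, i.e. one "bit" in thermodynamic units.
delta = boltzmann_entropy(2 * 10**6) - boltzmann_entropy(10**6)
```

Notice there is no “disorder” parameter anywhere: only a count of microstates.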

Also there is Clausius:

delta-S = Integral (dq/T)

where

delta-S = change in entropy

dq = inexact differential of q (heat)

T = absolute temperature

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

Which one?

Just when things were starting to become clear. =P

Poor Sal. This discussion has been going on for almost a month, and he still doesn’t get that the dispute is over the interpretation of entropy, not its calculation.

colewd,

Since Sal has put himself in the awkward position of ignoring (or pretending to ignore) my comments, how about quoting this one to him?

If entropy is a measure of energy dispersal, he should be able to

1) respond to each of my six points against dispersalism, including the rather obvious problem that entropy has the wrong units for energy dispersal;

2) explain why Xavier and Yolanda see different entropy values despite looking at the same physical system with the same physical energy distribution;

3) explain why entropy increases in Denker’s grinding wheel example, though energy dispersal does not; and

4) explain why entropy “cares” about the distinguishability of particles, when energy dispersal does not.

keiths,

Ok for the global view. Can you give a more specific example?

colewd,

Why are you asking to be spoon-fed? Google is your friend.

Entropy is a central concept in thermodynamics. You’ll find plenty out there.

Statistical Thermophysics

From the back cover:

A used copy of this book costs only a few dollars.

keiths,

First help me understand the substance of your argument. So far I am not convinced that thermodynamic properties can be expressed in bits like binary bits 0 and 1. If you want to define entropy as missing information, that is fine as an intellectual exercise. What you have not convinced me of is whether that definition has any use in science.

colewd,

Do non-binary bits seem more promising to you?

hi keiths,

The second law of thermodynamics is based on the concept of entropy. You say it establishes an upper bound on the efficiency of heat engines.

How can this upper bound be observer-dependent? Do Yolanda, Xavier, and Damon get different answers for this upper bound?

Mung,

The upper bound on efficiency isn’t observer-dependent. The expression for this efficiency, 1 – Tcold/Thot, doesn’t even have an entropy term in it — only temperatures.

The derivation of that formula, however, depends on the fact that the net entropy change of the system is zero in the reversible (maximum efficiency) case. After a cycle is complete, the entropy of the heat reservoir has decreased, and the entropy of the cold reservoir has increased by an equal amount. The entropy of the system itself is unchanged.

– Robertson p. 467
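As a numerical sketch of the point (the temperatures are illustrative, not from the thread), the bound really does depend only on the two reservoir temperatures, and the reversible entropy bookkeeping balances:

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum (reversible) heat-engine efficiency: 1 - Tcold/Thot, temperatures in kelvin."""
    return 1.0 - t_cold / t_hot

# 500 K hot reservoir, 300 K cold reservoir: efficiency capped at 40%
# for every observer, however they choose their macrostate variables.
eta = carnot_efficiency(500.0, 300.0)

# Reversible case: entropy given up by the hot reservoir equals entropy
# gained by the cold one, so q_cold / T_cold = q_hot / T_hot.
q_hot = 1000.0                    # J drawn from the hot reservoir
q_cold = q_hot * (300.0 / 500.0)  # J rejected to the cold reservoir
work = q_hot - q_cold             # J of work extracted
```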

keiths,

Yes, but that is not what Bekenstein is describing when he was calculating the entropy of a black hole. If the bit takes the form of a Tensor then you are now moving toward Sal’s description. Three position and three momentum variables.

He wasn’t calculating thermodynamic entropy.

Mung,

Wasn’t he referencing the second law?

Mung,

From Wiki

Beckenstein? Not that I know of.

Such coins would be worthless. value = 0.

DNA_Jock: What would the value be if the coins were made of ununtrium, rather than copper?

Oh, I think they’d have quite the scarcity value…

OTOH, I doubt that they would qualify as collectible 😉

Still, I’d be curious as to how Sal would calculate their entropy…

Do you think Sal knows that Clausius doesn’t always work?

I think it’s pretty clear from his claim that

“Claussius -> Boltzmann -> Shannon uncertainty (aka missing information)”

that Sal has no clue whatsoever.

“Certainly different people have different amounts of ignorance. The entropy of a thermodynamic system is a measure of the degree of ignorance of a person whose sole knowledge about its microstate consists of the values of the macroscopic quantities Xi which define its thermodynamic state.” – E.T. Jaynes

I still maintain that DamonEntropy is not thermodynamic entropy and that Jaynes can rightfully be cited in support of that position and against the position that DamonEntropy is thermodynamic entropy.

IOW, I agree with walto. I disagree with keiths.

“There seems to be a breakdown in communication that leads sincere and thoughtful people to disagree, even when they try to understand each other.”– Harry S. Robertson

heh

“The reactions of some readers to my use of the word ‘subjective’ in these articles was astonishing….There is something patently ridiculous in the sight of a grown man recoiling in horror from something so harmless as a three-syllable word. ‘Subjective’ must surely be the most effective scare word yet invented. Yet it was used in what still seems a valid sense: ‘depending on the observer’.”– E.T. Jaynes

So I still say I don’t see a difference between ‘observer-dependent’ and ‘subjective.’

Mung,

Based on this, can you describe a scenario where someone’s ignorance is less than 100%?

colewd,

FWIW, I was introduced to the Bekenstein entropy on the last day of class, after we had all submitted our take-home final. The professor pointed out that it was something we wouldn’t be tested on; the students sat in class as a learning experience and a courtesy to him.

Unlike the entropy of Clausius and Boltzmann, which is based on heat and classical mechanics, and the revision of Boltzmann that includes quantum mechanics, the Bekenstein entropy is based on general relativity.

http://scholarpedia.org/article/Bekenstein-Hawking_entropy

Let Keith compute the entropy change of a melting 20 gram ice cube using his information theory as a starting place rather than using energy dispersal parameters of heat and temperature which he so disparages!

How about I make it easier: let him start from Boltzmann microstates. Let him show how he calculates the number of microstates without the energy dispersal parameters of heat and temperature, working instead from first principles of the kinetic energy of each water molecule. 🙄

Kind of hard for him to do that since he needs to sneak in the parameters that specify internal energy, like, temperature!

If he did that, he’d be almost convincing, but he can’t even do that!

One can do so for most practical systems only after calculating entropy in the traditional way and then using the conversion factors I’ve linked to, namely dividing the answer in J/K by kB and then dividing by ln(2).

I did so in the examples above.
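A sketch of that unit conversion, using the standard value of Boltzmann’s constant (the 1 J/K input is just an example value, not one of the examples above):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def entropy_in_bits(s_joules_per_kelvin):
    """Convert a thermodynamic entropy in J/K to bits: divide by k_B, then by ln 2."""
    return s_joules_per_kelvin / k_B / math.log(2)

# 1 J/K works out to roughly 1.0e23 bits of missing information.
bits_per_jk = entropy_in_bits(1.0)
```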

Keiths, for all his wailing, can’t even do a basic calculation without energy dispersal data of heat and temperature, say for a melting ice cube. He keeps framing thermodynamic entropy in terms of information theory, but he can’t actually begin an entropy estimate with information theory as a starting point. In many cases he has to resort first to energy dispersal data (heat and temperature).

In contrast, I’ve provided many examples of calculating entropy from a variety of approaches for solids, liquids, and gases.

You’ll note, in almost all cases, implicitly the energy dispersal data is specified by:

Temperature

Number of Particles

Volume

Heat

In the case of melting ice or heating copper coins in the solid phase, one only needs Temperature and Heat data.

In no case did I compute entropy from information theory alone, and neither can Keiths, but he won’t admit he can’t compute entropy from information theory alone.
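For reference, the melting-ice calculation referred to above needs nothing beyond heat and temperature. A sketch using standard handbook values for water:

```python
# Clausius: delta_S = q / T for an isothermal process such as melting.
latent_heat_fusion = 334.0  # J/g for water ice (handbook value)
mass_grams = 20.0           # the 20 gram ice cube from the challenge
T_melt = 273.15             # K, melting point at atmospheric pressure

q = latent_heat_fusion * mass_grams  # J of heat absorbed by the ice
delta_S = q / T_melt                 # entropy increase of the ice, J/K
```

That comes to roughly 24.5 J/K, obtained entirely from heat and temperature data.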

Sorry, I don’t understand the question. Or perhaps I should say that I don’t see how the Jaynes quote gives reason to think that knowing the values of the macroscopic quantities Xi which define its thermodynamic state counts as 100% ignorance.

The point is not that they have no knowledge, but rather that it is their sole knowledge.

This is just ignorant and false. It’s ignorant because Salvador has keiths on ignore and so can’t see what keiths says, and it’s false because keiths has specifically denied Salvador’s claims about his views on thermodynamics.

ETA: It’s also an absurd challenge because Salvador fails to specify the thermodynamic system, its initial and final equilibrium states, and its constraints.

stcordova,

Thanks for the link.

If we look at this analysis, we see that based on Bekenstein’s computation, the entropy of a black hole of one solar mass is 20 orders of magnitude larger than that of the sun. This does not seem logical at all. Thoughts?

Mung,

How do you ever really know the true conditions of the microstate except through what we learn by measuring the macrostate? If this is true, then 100% ignorance is the only possible condition.

colewd:

keiths:

colewd:

The laughter you hear is coming from folks who understand that bits — binary digits — are binary by definition.

You are confusing the representation of entropy with the representation of the microstate. They are not the same, by any stretch of the imagination.

Mung,

If Sal actually wants to convince anyone that he’s right, he’ll need to respond to his opponents’ arguments. Putting people on ignore — or pretending to — is self-defeating. It just demonstrates his lack of confidence.

It’s no coincidence that Sal put me on ignore just when I asked him to respond to my six points against dispersalism. Interestingly, colewd seems to have no more faith in Sal than Sal does. Otherwise he wouldn’t have hesitated to quote my comment so that Sal could respond.

colewd, to Mung:

colewd,

Mung is explaining that we learn something about the microstate when we measure macroscopic variables like temperature. We still don’t know the exact microstate, but we’ve narrowed down the possibilities by making a measurement.

Salvador’s behavior in this thread bothers me, but I keep reminding myself that it also creates a record. If this were Salvador@UD the posts by keiths and DNA_Jock would be altered by Sal or deleted by Sal. Salvador has me on Ignore because I’m a troll.

Well, now we know just what that analysis of his is worth. Go Sal!

We’re talking about a reduction in the uncertainty of the microstate. And we can reduce our uncertainty. But the absolute precise exact one-and-only microstate can probably not ever be known by any human due to quantum indeterminacy.

The system is dynamic and constantly changing, even at/near equilibrium.

But the fact that we can’t precisely determine something doesn’t mean we can’t narrow things down.

What the information-theory approach and the Boltzmann approach offer is that we are talking probabilities, and that a statistical approach is close to the reality of the situation.

If you’re saying that we can never know the exact microstate, I tend to agree. I don’t see that as 100% ignorance. I think that the claim that if we are not 100% certain then we must be 100% ignorant is a false dichotomy.

Take the analogy of throwing a pair of dice. We cannot be 100% certain that a seven will appear when the dice are thrown, nor can we be 100% certain that a four will not appear. I offer to play a game of dice with you where every time a seven is thrown you pay me 10 dollars and every time a four is thrown I pay you 18 dollars. Would you play this game?
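The proposed game can be checked with a quick expected-value computation (my sketch; the payoffs are the ones stated above):

```python
from fractions import Fraction

# Two fair dice: 36 equally likely outcomes.
p_seven = Fraction(6, 36)  # ways to roll a 7: (1,6),(2,5),(3,4),(4,3),(5,2),(6,1)
p_four = Fraction(3, 36)   # ways to roll a 4: (1,3),(2,2),(3,1)

# The proposer receives $10 on a seven and pays $18 on a four.
proposer_gain_per_throw = 10 * p_seven - 18 * p_four
```

It comes to 1/6 of a dollar per throw in the proposer’s favor: the seemingly generous $18 payout doesn’t offset the seven’s higher probability. Partial knowledge of the probabilities is exactly what makes the bet assessable.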

Mung,

To use ‘$’ in a comment, you can type $ .

ETA: Haha… Now a bug in the plugin is preventing me from spelling out the code. It expands ampersand expressions recursively!

Type & followed by #36; to get the ‘$’ in a comment.

Mung:

The mistake you and walto are making is in thinking that macroscopic variables are the only “admissible” information when calculating entropy. They aren’t. Any information that helps narrow down the possible microstates is admissible.

In the card deck example, “odds before evens” counts as a “macroscopic” description of the deck’s state, and knowing that the deck has “odds before evens” reduces the entropy from what it would be if all you knew was that the deck had been shuffled.

However, macroscopic descriptions are not the only admissible ones when calculating “card deck entropy”. Suppose I shuffle the cards thoroughly and randomly so that all possible orderings are equiprobable. At this point there are 120 possible orderings — 120 possible microstates, in other words. All orderings are possible, and entropy is at a maximum.

I look at the first card and see that it is the ‘4’. How many possible microstates are there now? Only 24. The ‘4’ is now known, and there are only 4! (4 x 3 x 2 x 1) ways of completing the sequence. The entropy has been reduced.

Now I look at the second card and see that it’s the ‘1’. How many possible microstates are there now? Only 6. The ‘4’ and the ‘1’ are now known, and there are only 3! (3 x 2 x 1) ways of completing the sequence. The entropy has been reduced again.

Suppose I keep going until I’ve turned over all the cards. At that point, like Damon, I know the exact sequence. There is only one possible microstate, and the entropy is therefore zero.

It all makes sense: as I look at cards, one by one, I gain information about the state of the deck. As I gain information, the amount of missing information decreases. As the amount of missing information decreases, the entropy decreases, because entropy is a measure of the missing information. When I know the exact microstate — the exact sequence of cards — the entropy is zero.

Anything that increases our information about the microstate reduces the entropy, and this is just as true of thermodynamic entropy as it is of “card deck entropy.”
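The counts in the deck example translate directly into bits: with equiprobable microstates, the entropy is just log2 of the number of remaining possibilities. A sketch:

```python
import math

def equiprobable_entropy_bits(n_microstates):
    """Entropy in bits when all remaining microstates are equally likely."""
    return math.log2(n_microstates)

# Five cards: 5! = 120 orderings when fully shuffled, then 4!, 3!, 2!, and
# finally 1 as each successive card is revealed.
microstate_counts = [120, 24, 6, 2, 1]
entropies = [equiprobable_entropy_bits(n) for n in microstate_counts]
# Entropy falls as information is gained and reaches zero at full knowledge.
```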

&36; priceless

One of these days I should learn how to use latex for some other purpose.

ETA: $

You consistently conflate Shannon entropy and thermodynamic entropy. When calculating thermodynamic entropy, the entropy is defined in terms of the macrostate variables, for example S(E,V,N). Any other “entropy” is not thermodynamic entropy.

Mung,

Look at the full context of the Jaynes quote from the previous comment:

The choice of the Xi is observer-dependent. Once they have been chosen, however, the entropy calculation is objective.

Yolanda’s Xi include the isotopic concentrations. Xavier’s Xi do not. Both are legitimate macrostates leading to legitimate entropy values. Entropy is observer-dependent but not subjective.

Apart from the macrostate variables of thermodynamics, what other knowledge (information) of the microstate is available to us mere mortals?

We are limited to assigning probabilities and dealing with well-known probability distributions. Our only access to the system is macroscopic.

🙂

It’s subjective in the sense that it is observer-dependent. If that were not the case, two different observers would not arrive at two different values.

Which of the two, Yolanda or Xavier, should revise their calculation in light of new information which they previously lacked, and why? Or why not?

Mung,

Thermodynamic entropy is a Shannon entropy.

No, it isn’t. You don’t see E, V, or N in

S = k log W

and you don’t see them in the Gibbs equation either.

In the case of Boltzmann, E, V, and N exert their influence via W. Whatever W is — and it can range all the way from one up to some unfathomably large value — determines the entropy. Macroscopic variables affect the value of W, but to argue that W can only be based on macroscopic variables is silly and arbitrary.

When Ben-Naim says that thermodynamic entropy is a special case of SMI, I understand what he means, because he explains what he means when he says that thermodynamic entropy is a special case of SMI.

When you say that “thermodynamic entropy is a Shannon entropy” I have no idea what you mean because you do not explain what you mean.

Let’s assume that not all Shannon entropy is thermodynamic entropy. How do you explain the difference? How is thermodynamic entropy different from Shannon entropy?

Let me explain further. Ben-Naim asserts that SMI can be defined for any probability distribution (e.g., your ‘deck of cards’ examples). But thermodynamic entropy is not defined for just any probability distribution. Not every probability distribution is relevant to thermodynamics.

Your response?

Please elaborate on this statement.

https://en.wikipedia.org/wiki/Entropy_(statistical_thermodynamics)#Gibbs_entropy_formula

Mung,

The ‘&’ followed by ‘#36;’ trick is html, not latex, so you can use it wherever html is accepted. No latex plugin needed.

In fact, it’s only because of the latex plugin that you need the trick in the first place.

Some Misconceptions about Entropy

Mung,

I assumed that you understood what a Shannon entropy is. If not, you could look it up.

Simple. A thermodynamic entropy is an entropy in which the microstates are thermodynamic microstates (for example, an ideal gas microstate that specifies the position and momentum of each molecule). An observer possesses a certain amount of information about the exact microstate of the system. The gap that exists between that information and the information required to pin down the exact microstate is the entropy.

Thermodynamic entropy is defined for any epistemic probability distribution over thermodynamic microstates, just as “card deck entropy” is defined for any epistemic probability distribution over card deck microstates.

Remember, the system is really in just one microstate at a time. The uncertainty is all in our heads — that is, the probability distribution is epistemic. Any experiment that changes the probability distribution changes the entropy. We gain information, the probability distribution changes, the entropy decreases.

Whether the information is gained via “macroscopic” or “microscopic” observations is irrelevant. Any information that helps us zero in on the exact microstate is perfectly fine.

And if we happen to discover the exact microstate, as Damon does in the thermodynamic case and as we can do in the card deck scenario, then the entropy is zero.

Mung,

You are very close to getting it.

The reason thermodynamic entropy calculations are usually based on macroscopic variables is not because other information is illegitimate. It’s because other information is damn hard to get!

Contrast that to the card deck example, where “microscopic” information is easily available. We can all be Damons in the card deck world, as I explained above:

There is no rule against “microscopic” information in the card deck scenario, and likewise there is no rule against microscopic information in case of thermodynamic entropy. It’s just that microscopic information is far easier to get in the card deck case than in the thermodynamic case. Scientists use the information they have, and typically the only information they have is macroscopic.

keiths:

Mung:

Calling it ‘subjective’ just invites misunderstanding, which is exactly what Jaynes experienced:

I anticipated the problem and avoided it. Jaynes had to learn the hard way.

It depends on what they are trying to achieve. They both calculated the entropy correctly, so if that is all they were trying to accomplish, then they’ve achieved the goal and nothing more needs to be done.

Entropy can vary with time, due both to the evolution of the system and changes in the observer’s epistemic status. If the goal is to keep the entropy value up to date, then of course it needs to be recalculated when relevant changes occur.

Mung,

Thanks. Can you show me how to estimate the probability of a given microstate?