The basic biochemistry textbook I study from is *Lehninger Principles of Biochemistry*. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M., *Lehninger Principles of Biochemistry* (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system,

entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

*Principles of Biochemistry*, 5th Edition, Glossary

Laurence A. Moran, University of Toronto

H. Robert Horton, North Carolina State University

K. Gray Scrimgeour, University of Toronto

Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue with his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

…

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?

Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder, just as shoe size is correlated with reading ability (small shoe size suggests we’re dealing with a toddler, therefore a non-reader) without being the same thing as reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga

2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:

EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy

…..

Lehninger and Moran’s books aren’t on that list. Their books however did make the list of biochem books judged by some Australians as decreasing in fitness:

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.

..

Lehninger, Nelson and Cox

….

Garrett and Grisham

…

Moran and Scrimgeour

….

Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

…

A few creationists got it right, like Gange:

Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where

S = entropy

k = Boltzmann’s constant

W = number of microstates
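As a quick sanity check of that formula, here is a minimal Python sketch (the function name is mine, and the 2019 SI value of Boltzmann’s constant is assumed):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in J/K (exact since the 2019 SI redefinition)

def boltzmann_entropy(W):
    """S = k ln W: entropy in J/K for W equally probable microstates."""
    return K_B * math.log(W)

# A single microstate (W = 1) has zero entropy, since ln 1 = 0.
print(boltzmann_entropy(1))       # 0.0
# Entropy grows only logarithmically with the microstate count.
print(boltzmann_entropy(10**23))
```

Note that W = 1 giving S = 0 is exactly the Damon situation argued over later in the thread.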

Also there is Clausius:

delta-S = Integral (dq_rev / T)

where

delta-S = change in entropy

dq_rev = inexact differential of q (heat), exchanged along a reversible path

T = absolute temperature
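When the heat capacity is constant, the Clausius integral has a closed form, m·c·ln(T2/T1), which makes a nice check on a direct numerical evaluation. A sketch (the 4186 J/(kg·K) heat capacity of water is an assumed textbook value, not from the post):

```python
import math

def clausius_delta_s(m_kg, c, t1, t2, steps=100_000):
    """Numerically evaluate dS = dq_rev / T for reversible heating
    at constant heat capacity: dq_rev = m * c * dT."""
    dT = (t2 - t1) / steps
    s = 0.0
    t = t1
    for _ in range(steps):
        s += m_kg * c * dT / (t + dT / 2)  # midpoint rule
        t += dT
    return s

# Heating 1 kg of water from 273.15 K to 373.15 K, reversibly:
numeric = clausius_delta_s(1.0, 4186.0, 273.15, 373.15)
exact = 1.0 * 4186.0 * math.log(373.15 / 273.15)  # closed form: m c ln(T2/T1)
print(numeric, exact)  # both ~1306 J/K
```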

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

walto,

If arguments from authority keep backfiring on you, will you eventually learn to stop making them?

Some examples from earlier in the thread:

Hey, I believed those remarks were correct when I wrote them, and I believe they’re correct now! (Of course maybe those with not one, but TWO gnawed off feet can’t be trusted on this stuff.)

It’s dark down here.

For starters the exact microstate includes observer effects via quantum mechanics from the fellow doing the measurement.

To know the exact microstate would entail knowing everything about ourselves. I don’t think that is possible.

The exact order of cards is a macrostate. You are doing just what you accuse Granville Sewell of doing.

peace

Your “thermodynamic information” is certainly not the same as your “missing information” derived from the entropy function. What are the units of your “thermodynamic information”?

fifth,

That’s as silly as saying that I can’t know the order of a deck of cards because I “don’t know everything about myself”.

No, the exact order of cards is a microstate (but in a logical, not a thermodynamic sense). For someone who knows the exact microstate — the exact order of the cards — it is also a macrostate. For other observers, the macrostate will include more than one microstate.

Did you read my card deck comments? (Link, link, link, link)

I wonder, does dark entropy measure the dispersal of dark energy?

Mung,

Of course it isn’t.

The thermodynamic information establishes the epistemically possible microstates. The missing information — the entropy — is the *additional* information that would be required to narrow the microstates down to one.

I know the exact order of the cards AND whether or not each card is face up or face down. I know more than Damon. 🙂
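The “missing information” idea can be made concrete with the card deck from this exchange: for an observer who knows nothing about the order of a shuffled 52-card deck, the macrostate admits 52! microstates, and the missing information is log2(52!) bits. The face-up/face-down refinement multiplies the count by 2^52 (the framing is the thread’s; the arithmetic is standard):

```python
import math

# Microstates compatible with "a shuffled 52-card deck, order unknown": all orderings.
orderings = math.factorial(52)
missing_bits = math.log2(orderings)
print(round(missing_bits, 2))  # 225.58 bits of missing information

# Also not knowing whether each card is face up or face down multiplies
# the microstate count by 2^52, i.e. 52 more missing bits.
extra = math.log2(orderings * 2**52) - missing_bits
print(round(extra))  # 52
```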

walto,

That, ironically, is a demonstration of your ignorance. The missing-information view of entropy is not something I made up out of the blue.

Wikipedia:

Murray Gell-Mann:

And Ben-Naim, of course.

The arguments from authority just keep backfiring on you, walto.

But you disagree with this quote. You deny the subjective component.

According to you, the macrostate is not subjectively chosen; rather, different observers will make different observations and thus come up with different answers about the entropy. Therefore the entropy is not subjective, but merely observer-dependent.

If that’s not your argument I wish you would share with us what your argument is. What is the difference between “observer-dependent” and “subjective choice”?

How does anyone know* what macrostate the system is in? You mean it’s not a subjective choice?

Mung,

No, I disagree that *entropy* is subjective. It’s observer-dependent but not subjective.

The choice of macrostate, on the other hand, can be made on a completely subjective whim, for all I care. It’s just that once the macrostate is chosen, the entropy is objectively determined.

Mung,

The choice of how to specify the macrostate is up to the observer, subject to the limitations of their abilities and equipment.

In Xavier’s case, he specifies the *before* macrostate as “equal amounts of gas X at the same temperature in two equal-sized compartments separated by an impermeable barrier”.

In Yolanda’s case, the *before* macrostate is “equal amounts of isotope X0 in one compartment, and X1 in the other compartment, at the same temperature and separated by an impermeable barrier”.

Xavier doesn’t possess Yolanda’s isotope-discriminating equipment, so he cannot choose to use a Yolanda-like macrostate. Yolanda, however, could choose to ignore the isotopes and use a Xavier-like macrostate.

In any case, their different macrostates result in different entropy values. Damon’s macrostate results in an entropy of zero.
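For the record, those three values can be sketched numerically under the usual ideal-gas treatment. The scenario and observers are the thread’s; the 2nR ln 2 entropy-of-mixing formula is the standard textbook result I’m assuming applies here (n moles of gas in each compartment):

```python
import math

R = 8.314462618  # gas constant, J/(mol K)

def mixing_entropy_change(n_moles_each, distinguishable):
    """Entropy change when the barrier between two equal compartments is removed.

    If the observer's macrostate distinguishes the gases (Yolanda), each gas
    doubles its volume: delta-S = 2 n R ln 2. If it does not (Xavier),
    removing the barrier changes nothing macroscopic: delta-S = 0.
    """
    return 2 * n_moles_each * R * math.log(2) if distinguishable else 0.0

print(mixing_entropy_change(1.0, distinguishable=False))  # Xavier: 0.0
print(mixing_entropy_change(1.0, distinguishable=True))   # Yolanda: ~11.5 J/K
# Damon tracks the exact microstate, so W = 1 and S = k ln 1 = 0 at all times.
```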

If entropy is *not* observer-dependent, as you claim, then which of the three (if any) is correct, and on what basis?

All irrelevant.

I do believe I’ve already answered this question.

Damon is not calculating the entropy. One of the other two is simply wrong, because his/her belief about the macrostate is false.

Given that we’re talking about authorities here, perhaps you meant to say that’s all irreverent?

btw keiths, returning to another discussion we were having upthread:

1.) Do you believe that all irreversible processes are not in fact processes, or that only some irreversible processes are not in fact processes?

2.) Do you believe that all reversible processes are not in fact processes, or that only some reversible processes are not in fact processes?

3.) What do the initial and final macrostates have to do with whether or not an irreversible process is in fact a process, or whether or not a reversible process is in fact a process?

4.) Or were you just mistaken?

Would that “additional information” be “thermodynamic information” measured in “thermodynamics bits”? Don’t go all Salvador on me!

walto,

You wish.

Having dispensed with your arguments from authority, let’s return to this:

Walto,

Wow, I can’t believe you have the patience and relentlessness to dredge up some golden nuggets from way back when!

Yeppers. Like the change in entropy of an ice cube melting after being hit with 6660 Joules of *heat* energy at a *temperature* of 273 K:

6660 Joules / 273 K = 24.4 J/K of entropy
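The arithmetic checks out: for a constant-temperature phase change the Clausius integral collapses to ΔS = q/T. A sketch (the ~334 J/g latent heat of fusion is an assumed standard value, used only to estimate how big the ice cube would be):

```python
q = 6660.0  # heat absorbed by the melting ice, in Joules
T = 273.0   # melting point of ice, in Kelvin (constant during the phase change)

delta_s = q / T  # isothermal process: delta-S = q_rev / T
print(round(delta_s, 1))  # 24.4 J/K

# Rough size of the cube, assuming a latent heat of fusion of ~334 J/g:
mass_g = q / 334.0
print(round(mass_g, 1))  # ~19.9 g of ice melted
```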

Mung,

Which one, and why?

Mung,

If you want to understand my position, why not read my comments? The discussion begins with your comment here.

Information is measured in bits, Mung. Do you think your checking account information is measured in “checking account bits”?

Mike Elzinga:

Sal:

Nopers. The Clausius equation only works in certain circumstances.

It doesn’t work for the gas-mixing case we’ve been discussing, for instance. The process isn’t reversible and there is no dQ or ΔQ. If you try to use the equation, you’ll get an entropy change of zero, which is incorrect.

Salvador, as if “heat energy” and temperature are unrelated.

Referring back to these comments about the history and various definitions of entropy:

http://theskepticalzone.com/wp/in-slight-defense-of-granville-sewell-a-lehninger-larry-moran-l-boltzmann/comment-page-16/#comment-146939

http://theskepticalzone.com/wp/in-slight-defense-of-granville-sewell-a-lehninger-larry-moran-l-boltzmann/comment-page-16/#comment-146937

I will now start talking a little about the Boltzmann definition of entropy and information theory. As I said, the Clausius definition has nothing to do with information theory, and the Clausius definition is the dominant practical way of computing entropy in engineering and chemistry.

However, under certain circumstances the Boltzmann definition fortuitously agrees with the Clausius definition even though they proceed from radically different starting points. Clausius proceeds from heat, Boltzmann from microstates.

Now a word about microstates. It is easiest if we first start with a single particle. The familiar 3-dimensional Euclidean space has coordinates X, Y, Z. We can extend this space to a 6-dimensional phase space where a particle not only has positional coordinates (X, Y, Z) but also momentum coordinates (P_x, P_y, P_z), where momentum is

P_x = m V_x

P_y = m V_y

P_z = m V_z

where

m = mass

V_x = velocity in the X direction

V_y = velocity in the Y direction

V_z = velocity in the Z direction

We can thus find a gas particle bouncing around in some volume V at any point in phase space defined this way. We could, conceptually, insist that we round off all the coordinates so that there are only a finite number of “locations” of the particle in phase space. It is a sort of quantization of the phase space.

If the particle has a fixed speed, its velocity components along X, Y, Z can still jump around. So for a given speed the particle has a fixed energy, but it can have varying components along the X, Y, and Z axes provided the particle’s total velocity obeys the Pythagorean theorem:

V_total ^2 = V_x ^2 + V_y ^2 + V_z ^ 2
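That constraint is easy to illustrate numerically: pick a random direction for a particle of fixed speed and the three components always recombine to the same total (the sampling scheme below is mine, purely for illustration):

```python
import math
import random

def random_velocity_components(v_total):
    """Split a fixed speed into (V_x, V_y, V_z) along a uniformly random direction."""
    phi = random.uniform(0.0, 2.0 * math.pi)   # random azimuth
    cos_theta = random.uniform(-1.0, 1.0)      # cos(polar) uniform -> uniform direction
    sin_theta = math.sqrt(1.0 - cos_theta**2)
    vx = v_total * sin_theta * math.cos(phi)
    vy = v_total * sin_theta * math.sin(phi)
    vz = v_total * cos_theta
    return vx, vy, vz

vx, vy, vz = random_velocity_components(1000.0)  # e.g. 1000 m/s
print(math.sqrt(vx**2 + vy**2 + vz**2))  # recovers ~1000.0 regardless of direction
```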

Suppose the single particle (atom) is bouncing around by itself in a container. It can hypothetically be anywhere in the container and have a variety of velocity coordinates (or alternatively momentum coordinates) at any time. Thus its “location” in phase space at any time is defined by 6 coordinates: (X, Y, Z, P_x, P_y, P_z).

How we slice up the phase space and decide how to do the rounding, I don’t know exactly, but for 1 cubic meter at 300 K, with a helium atom, using the Sackur-Tetrode approximation, the conventional rounding resulted in the following number of microstates:

W = 9.52 x 10^31 microstates

So this single atom in a 1 cubic meter volume at 300 K can be located in a microstate that is defined by (X, Y, Z, P_x, P_y, P_z). There are only a finite number of microstates inside a finite volume for a given total velocity of that atom. At any given time the particle has coordinates (X, Y, Z, P_x, P_y, P_z). Alternatively we can say the particle is in a microstate with (X, Y, Z, P_x, P_y, P_z) coordinates.

We can plug this number of possible microstates into the famous Boltzmann equation and get the entropy in J/K:

S = kB ln (9.52 x 10^31) = 1.02 x 10^-21 J/K

Suffice it to say, to define a finite number of microstates, at some point the coordinates (X, Y, Z, P_x, P_y, P_z) can’t have infinite precision, with an infinite number of digits past the decimal point. The precision has to be finite so that we end up with a finite number of microstates. How Gibbs and Boltzmann figured this out is a testament to their genius.

In any case, we could, if we wanted, use log base 2 instead of the natural log, and also drop Boltzmann’s constant kB, to define a Shannon-like entropy:

S = log2(W) = log2(9.52 x 10^31) = 106.23 Shannon bits

Thus we can force-fit information-theory-type constructs onto entropy just by adopting a different logarithmic base and dropping kB. Big deal! 🙄
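The single-atom numbers above can be reproduced in a few lines. The post doesn’t spell out how the “conventional rounding” was done; assuming the standard convention in which Planck’s constant sets the phase-space cell size, the one-particle count is W ≈ (V/λ³)·e^(5/2), with λ the thermal de Broglie wavelength, and all three quoted numbers fall out:

```python
import math

kB = 1.380649e-23                  # Boltzmann's constant, J/K
h = 6.62607015e-34                 # Planck's constant, J s
m = 4.002602 * 1.66053906660e-27   # mass of a helium atom, kg
T = 300.0                          # temperature, K
V = 1.0                            # volume, m^3

# Thermal de Broglie wavelength: the linear size of a phase-space cell.
lam = h / math.sqrt(2.0 * math.pi * m * kB * T)

# One-particle microstate count in the Sackur-Tetrode convention (N = 1).
W = (V / lam**3) * math.exp(2.5)
S_joules = kB * math.log(W)  # Boltzmann entropy, J/K
S_bits = math.log2(W)        # same count in Shannon bits: drop kB, use log base 2

print(f"W ~ {W:.2e} microstates")  # ~9.53e+31 (the post rounds to 9.52e31)
print(f"S ~ {S_joules:.2e} J/K")   # ~1.02e-21 J/K
print(f"S ~ {S_bits:.2f} bits")    # ~106.23 bits
```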

I fail to see the need for all the hoopla and deepity woo about observer-dependent ignorance and information theory in the definition of entropy.

Keiths can’t seem to comprehend that the number of microstates is not experimentally possible to establish in many cases without the energy dispersal parameters defined by HEAT and TEMPERATURE, just as Dr. Mike pointed out.

So for all of Keiths’ and DNA_jock’s saying Lambert is wrong, they can’t even get to first base in many experimental applications without energy dispersal parameters. Keiths and DNA_jock are too much the dabblers to contend with Frank Lambert and Dr. Mike, who actually taught these subjects in university. They’re just eager to disagree with me, especially DNA_jock, who resorts to misreading and grasping at straws in an attempt to claim I don’t understand this stuff.

Keiths disagreed with Dr. Mike as well, who was a professional thermodynamicist working on extremely cold refrigerators.

I pretty much diss the Keiths, DNA_Jock school of entropy. I can’t imagine one will go very wrong following Frank Lambert and Dr. Mike.

I’ve already answered that.

Nah, what I wish is that it were possible to have a discussion with you without you turning it into a shit show when people disagree with you. You really do poison this site. Your antics don’t actually bother me anymore, but you’ve driven some good folks away.

It is or it is not subjective? Assume Salvador is just a “weird” observer. Salvador’s observation is different from yours. Which of you is wrong? And why?

Keiths provides sparring and batting practice.

This website is an extension of my thought process, not really an official publication of my views. It’s a place to get free-of-charge hostile review of some of my teaching materials.

If I didn’t have that attitude, I’d have left here long ago.

It’s possible that both Xavier and Yolanda are wrong but Xavier is more wrong because he has less information.

Damon is not even able to calculate the entropy.

The ideal perspective for this task is Damon’s, minus one piece of information.

peace

It’s Salvador’s subjective model of choice. If X, Y, and D can all be right, why can’t Sal be right?

What utter effing nonsense.

Which is why people who disagree with Sal are on Ignore. LoL!

I’ve been convinced that it’s mostly right, myself. Adami’s blog is particularly good on the subject, I think. But once the variables are no longer thermodynamic state variables, the entropy isn’t thermodynamic entropy and Damon is a reductio. But it’s not terribly important, one way or the other. Not worth fighting about really, and I can see why Moran shrugged.

I understand Lambert was very hot about this matter at one time, but I’m guessing he wouldn’t object to the Ben-Naim definition of thermodynamic entropy I posted above, which is a function of the paucity of (certain) info.

At this point, I think there’s probably just keiths, looking for somebody to fight with. He’s like the parody of Hemingway in Woody Allen’s *Midnight in Paris*.

No, it’s like saying that you can’t know the order of a shuffled deck of cards if no one has observed them.

Is the precise order of the frames in a movie a “logical” microstate?

https://www.youtube.com/watch?v=xTikNBYA0tg

peace

No, it is not. You are simply wrong. And this is further evidence that you are in fact conflating the colloquial sense of information with the Shannon measure of information. No one knows how to measure colloquial information. People can’t even agree on a definition of colloquial information.

Colloquial information does not conform to a probability distribution, and the Shannon measure of information (which does have the unit of ‘bits’) lacks any meaning apart from a probability distribution. Likewise “entropy.”

And Shannon never claimed he had defined a measure of colloquial information.

You’re a bright guy, you’ll figure this out. Don’t just disagree with me for the sake of it. 🙂

I absolutely agree with the part in bold. Entropy is a state function. How are the values of the state variables determined? How would Damon determine their values?

The hell it does. 😉

Do you mean to say that the thermodynamic information establishes the objective probability distribution? And do you mean to assert that the thermodynamic information needed to establish the objective probability distribution can be measured in ‘bits’? And the missing thermodynamic information has its own entropy. Obviously.

Scientists use things like thermometers and barometers. You’ll have to ask keiths what demons use.

walto,

Who cares? It’s a thought experiment.

Sal:

walto:

Poor Sal. You should have warned him that you were about to dump him, walto.

keiths:

walto:

No, you tried to dodge it:

walto:

keiths:

walto,

The shit comes from your chronic immaturity and refusal to acknowledge your mistakes or take responsibility for what you’ve written. See the comment immediately above for an example.

Grow up, please.

keiths, who is never wrong, has no need of humility.

fifth,

Then you’re backing yourself into the same corner that walto is (was?) in. Read on.

Sure he is. He knows the exact microstate, so W = 1. Plugging into the equation, he gets

S = kB ln W = kB ln 1 = 0.

The entropy is zero.

So by your reasoning, Yolanda is more correct than Xavier because she has more information, and Damon is as correct as it is possible to be because he knows the exact microstate. Therefore entropy is zero all the time, and it’s useless.

Mung,

When I know I’m wrong, I admit it. Would that walto could summon the courage to do likewise.

By my count you’ve been wrong *at least twice* in this thread alone, without having admitted to being wrong.

Perhaps walto doesn’t know** that he’s wrong.

As you admit to not even knowing your own name, I guess nobody should be surprised that there’s never been such an admission.

Maybe you’ve been admitting* it? Have you admitted* you were wrong about mung thinking rape is ok yet? Have I just missed that admission*?

I guess I shouldn’t tease you, though. It’s obvious how hard it is for you when nobody agrees with you about this or that. And it happens on so many threads! So I (indeed everyone!) should try to be more sympathetic when you act out.

fifth:

keiths:

fifth:

No, because Damon learns the exact microstate by observing it, and humans learn the order of a randomly shuffled deck the same way — by observing it. Either way, it has nothing to do with “knowing everything about yourself”.

You can define it as one, in which case the other microstates would be created by cutting and splicing. I have no idea why you think a video of a tornado is relevant, however.

Mung:

Make your case. If I think you’re right, I’ll say so.

I’ve tried to make my case. You need to respond to those posts.

1.) Your position on what is or is not a process in thermodynamics.

2.) Your position that all information is measured in bits.