The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem with the book’s Granville-Sewell-like description of entropy:
The randomness or disorder of the components of a chemical system is expressed as entropy,
Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.
Gag!
And from the textbook written by our very own Larry Moran:
enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.
Principles of Biochemistry, 5th Edition, Glossary
Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto
Choke!
Thankfully, Dan Styer almost comes to the rescue in his criticism of ID proponent Granville Sewell:
On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.
…
“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.
Is Entropy Disorder?
Dan Styer criticizing Granville Sewell
Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (a small shoe size suggests we’re dealing with a toddler, and therefore a non-reader), yet shoe size is not reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.
Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:
Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.
Mike Elzinga
2lot trouble
Hear, hear.
But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?
“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”
Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.
Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:
That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?
Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”
There is no basis in physical science for interpreting entropy change as involving order and disorder.
So, in slight defense of Granville Sewell and the numerous other creationists, from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus, who equate entropy with disorder: when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!
Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu
April 2014
The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..
Lehninger’s and Moran’s books aren’t on that list. Their books did, however, make the list of biochemistry textbooks judged by some Australian researchers to be decreasing in fitness:
The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham
…
Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002
…
A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate
Here is chemistry professor Frank Lambert’s informal definition of entropy:
Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.
This is the correct Ludwig Boltzmann entropy, as written by Planck:
S = k log W
where
S = entropy
k = Boltzmann’s constant
W = number of microstates
There is also the Clausius definition:
delta-S = Integral (dq_rev / T)
where
delta-S = change in entropy
dq_rev = inexact differential of the heat q exchanged reversibly
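For readers who want numbers, here is a minimal Python sketch of both definitions (the microstate count and the heat flow below are made-up illustrative inputs, not measurements):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

# Boltzmann/Planck: S = k log W (natural log)
W = 1_000_000                     # microstate count -- illustrative only
S_boltzmann = k_B * math.log(W)
print(S_boltzmann)                # ~1.91e-22 J/K

# Clausius, in the simple case of reversible heat flow at constant T,
# where the integral of dq_rev/T collapses to q_rev/T:
q_rev = 1000.0                    # J of heat absorbed reversibly -- illustrative
T = 300.0                         # K
print(q_rev / T)                  # ~3.33 J/K
```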
As Dr. Mike asked rhetorically: “Where are the order/disorder and ‘information’ in that?”
My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.
That’s all right. I’ve ignored Sal for months now. Just scroll past. Haven’t missed anything, judging from the responses to his drivel.
Joe,
It’s hard to overstate just how poor Granville’s thinking is on entropy and related topics. Here’s a comment I made at UD about the paper he presented at the Biological Information ID conference:
walto,
I’ve posed this question to you multiple times, but you’ve refused to answer:
I’ve been pushing you to tackle this question for pedagogical reasons. You won’t be able to answer it, and the reason for that failure will teach you something important about entropy.
Give it a shot, and then we can discuss it.
Here, from his Entropy Demystified, Ben-Naim gives the basic argument for his case (which keiths and mung take up here) that entropy is (and has always been) information:
I encourage those still reading this “amusing” thread to read Ben-Naim’s argument carefully and make their own judgments about its force. Then they can come to their own conclusions about this matter (which, as I’ve said, seems to me of very little scientific importance–other than maybe pedagogical).
There may be some philosophical interest to this stuff, but the issues are complicated and, to me at least, difficult.
In an attempt to demystify entropy, I will provide a Microsoft Excel spreadsheet where the user can enter values in the orange-colored boxes for:
moles gas:
volume:
temperature:
The result of the computation is the absolute entropy of helium (to a reasonable approximation).
I put in some default numbers; the absolute entropy appears in the red highlighted text.
Here is the spreadsheet:
Absolute Entropy Helium
The URL is:
creationevolutionuniversity.org/public_blogs/skepticalzone/absolute_entropy_helium.xls
Change the numbers in the orange text and see the change in entropy. The numbers in the orange text define the energy-dispersal parameters. All I did was enter the formula for the alternate form of the Sackur-Tetrode equation below.
The details are tedious, but at root this is just mapping three numbers (moles of gas, volume, temperature) to one number (entropy). The spreadsheet uses the temperature to compute the amount of energy that gets plugged into the Sackur-Tetrode equation, along with the other required quantities. If anyone wants to suffer through how the numbers are plugged in, they can study the spreadsheet. Ugh!
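For readers without Excel, here is a rough Python equivalent of what the spreadsheet computes (a sketch assuming CODATA constants and the helium-4 atomic mass; it is only as good as the ideal-gas assumptions behind Sackur-Tetrode):

```python
import math

# Physical constants (CODATA, SI units)
k_B = 1.380649e-23     # Boltzmann constant, J/K
h   = 6.62607015e-34   # Planck constant, J*s
N_A = 6.02214076e23    # Avogadro constant, 1/mol
M_HE = 6.6464731e-27   # mass of one helium-4 atom, kg

def sackur_tetrode(n_moles, volume_m3, temp_K, atom_mass_kg=M_HE):
    """Absolute entropy (J/K) of a monatomic ideal gas.

    Mirrors the spreadsheet: the same three orange-box inputs
    (moles, volume, temperature) map to one output (entropy).
    Valid only in the dilute, high-temperature regime.
    """
    N = n_moles * N_A
    term = (volume_m3 / N) * (2 * math.pi * atom_mass_kg * k_B * temp_K / h**2) ** 1.5
    return N * k_B * (math.log(term) + 2.5)

# 1 mole of helium at 298.15 K and 1 atm (volume ~24.45 L):
print(sackur_tetrode(1.0, 0.02445, 298.15))  # ~126 J/K, close to tabulated values
```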
For other materials and situations the equations are different, but I don’t see the need to invoke this ignorance definition of entropy, observer-dependent woo, and information theory.
walto,
The two points that Ben-Naim (correctly) makes in that short quote were already covered earlier in the thread, and that quote is by no means a complete argument for the “missing information” view of entropy. Readers should not base their judgments on that tiny snippet.
I’ve given at least six reasons why the “energy dispersal” view cannot be correct. Here they are again:
Each of those points is sufficient by itself to scuttle the “entropy as energy dispersal” misconception. Good luck to any dispersalist who tries to rebut all of them.
keiths,
I agree with your comments on Granville Sewell’s paper. Trying to give his argument its best shot, I find that, reading his equations, all that is there is dispersal of the matter (say, carbon). If there is something there that allows for chemical interactions, electrostatic effects, or even gravity, I’d be pleased to be corrected. So far others have reached the same conclusion — he leaves out all the interesting stuff that enables plants to grow, organisms to reproduce, etc. Then he declares that this equation shows that evolution can’t happen. Unfortunately for him, he has also shown that plants can’t grow.
Whether his “X-entropy” is an actual entropy may connect with the argument here. But as far as I can see whether or not it can be called entropy is irrelevant, in that either way there is no justification for his assertions about what can happen.
walto,
It’s of huge scientific importance. Concepts matter in science.
It’s true that a scientist or engineer who is merely plugging numbers into the equations won’t see a difference. The equations work whether you understand them or not. So for the purposes of practical calculations, it doesn’t matter whether you see entropy as a measure of disorder, energy dispersal, or missing information.
But if you actually want to understand entropy, and not just plug numbers into equations, then the concepts are crucial.
Entropy is not disorder, and Lambert was right to argue against that common misconception. His mistake was to replace one misconception with another. Entropy is not energy dispersal, either, and if you think it is, you don’t understand entropy.
Entropy is a measure of missing information — the gap between the information associated with the macrostate and the information associated with the microstate.
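To make that concrete, here is a minimal Python sketch of the Gibbs/Shannon connection (the four-microstate toy distribution is purely illustrative, not a physical system):

```python
import math

k_B = 1.380649e-23  # J/K

def gibbs_entropy(probs):
    """Gibbs entropy S = -k_B * sum(p_i * ln p_i) over microstate
    probabilities. Replace k_B with 1/ln(2) and the same formula
    returns Shannon's missing information in bits; the two differ
    only by a multiplicative constant."""
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

# Four equally likely microstates (a toy macrostate): the Gibbs formula
# reduces to Boltzmann's S = k_B * ln W with W = 4.
print(gibbs_entropy([0.25] * 4))  # ~1.91e-23 J/K
print(k_B * math.log(4))          # same value
```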
Sal,
Here are six good reasons. Can you rebut any of them?
I think you should post them again. You haven’t done so for a couple of minutes and you’re obviously quite fond of them.
walto,
Can you rebut any of them?
ETA: Or answer this question?
I can’t remember what they are. Could you post them five or six more times please?
Revisiting the question of the shuffled cards…..
As I showed, we can calculate the entropy of any amount of copper. In theory we could calculate the entropy of the paper and ink needed to make the cards.
Will that (thermodynamic) entropy figure change whether we shuffle the cards or not? NO!
One could imagine “cards” engraved with the suits and rankings (Ace, 2, 3, 4, 5 … 10, Jack, Queen, King) on copper. We construct a “deck” made of these copper cards. We know from here that if the deck has a total weight of 1555 grams, the entropy is
812.42 J/K
Doesn’t matter how we order or “shuffle” the deck, the entropy is going to be 812.42 J/K unless we change the temperature.
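Anyone who wants to check that 812.42 J/K figure can reproduce it in a few lines (a sketch assuming the tabulated standard molar entropy of copper at 298 K, about 33.2 J/(mol·K); the linked calculation may have used slightly different inputs):

```python
MOLAR_MASS_CU = 63.546   # g/mol
S_MOLAR_CU = 33.2        # J/(mol*K), tabulated standard molar entropy at 298 K

mass_g = 1555.0
moles = mass_g / MOLAR_MASS_CU   # ~24.47 mol of copper "cards"
entropy = moles * S_MOLAR_CU
print(f"{entropy:.2f} J/K")      # ~812.4 J/K, whatever the card order
```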
Sal,
That’s almost, but not quite, correct. The order of the cards will have a small but nonzero effect on the thermodynamic entropy.
See this comment.
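To put a number on “small but nonzero”: here is a back-of-the-envelope sketch (it idealizes the 52! orderings as equally probable configurational microstates):

```python
import math

k_B = 1.380649e-23  # J/K

# If the 52! possible orderings are counted among the microstates, a
# thoroughly shuffled deck carries k_B * ln(52!) of configurational entropy.
S_config = k_B * math.log(math.factorial(52))
print(f"{S_config:.2e} J/K")  # ~2.16e-21 J/K
# Nonzero -- but more than 23 orders of magnitude smaller than the
# ~812 J/K of thermal entropy in the copper itself.
```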
Only if it’s warm enough, Sal.
😉
Get below a kelvin and it goes distinctly pear-shaped. Why is that, Sal?
I also have a couple of questions about your calculation of the entropy of 1555 grams of copper = 812.42 J/K.
One of your inputs was the specific heat of copper = 0.39 J/gram/K.
Do tell:
1) Do you normally quote results to 5 sig figs, when one of your inputs has 2 sig fig precision?
2) How was that 0.39 J/g/K value derived? Experimentally?
3) Is it constant?
I would encourage you to think about the implications of your answers to 2 and 3.
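As a hint at where questions 2 and 3 lead, here is a Python sketch of the Debye model of heat capacity (the model, and the roughly 343 K Debye temperature of copper, are illustrative assumptions of mine, not inputs Sal used):

```python
import math

N_A = 6.02214076e23
k_B = 1.380649e-23
R = N_A * k_B        # 8.314 J/(mol*K)
THETA_D = 343.0      # Debye temperature of copper, K (tabulated, approximate)

def debye_heat_capacity(T, n=2000):
    """Molar heat capacity of a Debye solid -- emphatically NOT constant.
    It falls off as T^3 at low temperature and approaches the
    Dulong-Petit value 3R (~24.9 J/mol/K) only at high temperature."""
    if T <= 0:
        return 0.0
    upper = THETA_D / T
    # Midpoint-rule integration of x^4 e^x / (e^x - 1)^2 from 0 to theta_D/T,
    # rewritten with e^(-x) to avoid overflow at low temperatures.
    total, dx = 0.0, upper / n
    for i in range(1, n + 1):
        x = (i - 0.5) * dx
        total += x**4 * math.exp(-x) / (1.0 - math.exp(-x))**2 * dx
    return 9 * R * (T / THETA_D)**3 * total

print(debye_heat_capacity(298.15))  # ~24 J/(mol*K), near Dulong-Petit
print(debye_heat_capacity(10.0))    # a tiny fraction of that: the T^3 regime
```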
Moved a comment to Guano. Address the ideas, not the person and assume everyone is participating in good faith.
walto,
With Sal in full retreat, it looks like you’re the only one left defending the “entropy as energy dispersal” view.
Yet you can’t rebut my six points or answer my question.
Any readers out there who would like to step in for walto?
Here are Lambert’s answers to keiths’ SIX BELOVED POINTS:
http://entropysite.oxy.edu/boltzmann.html
Here’s a link to Sal’s guanoed comment.
I would prefer a plugin that simply hid the comment until it was clicked on or inserted a link. If you keep doing this I have no reason to look for one. 😉
walto,
How does any of that refute my six points? Be specific.
Here’s a suggestion. Quote each of my points, one by one, and underneath each of them, explain how Lambert’s quote refutes it.
Patrick,
Me too.
When I finally succeed in escaping the clutches of my employer, I’ll look into an implementation of the “moderation as a subscription service” idea that Lizzie liked.
Thanks, but here’s my suggestion. You respond to each of Lambert’s objections to the information theory you have been plumping for here and explain how you can rehabilitate your view in spite of his remarks. Then, if I’m not convinced by your attempt to rehab your view, I’ll chime in.
Be specific.
OK, if I provide the Sackur-Tetrode equation, which is for monatomic ideal gases, comments insinuating that I ever suggested Sackur-Tetrode necessarily applies to the liquid state are idiotic and trollish.
Heh.
walto,
As for Lambert’s objection, you’ve already quoted the refutation. You just didn’t realize it.
Lambert’s complaint is that the equation for thermodynamic entropy includes Boltzmann’s constant (kb), while the equation for informational entropy does not. He thinks that the informationists are therefore cheating by bringing kb into the equation:
That’s nonsense. The only reason kb even appears in the thermodynamic entropy equation is because of the choice of units. I explained that earlier in the thread:
Ben-Naim makes the same point here, in the very excerpt that you quoted, walto:
Conclusion: Lambert is wrong. The use of kb is just an accident of history, caused by the fact that the kelvin is (unnecessarily) a base unit in the SI system.
Physics obviously matters to thermodynamic entropy, but it does not get into the equations via kb. It gets in via the W in Boltzmann’s equation and via the probabilities in the Gibbs equation.
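A quick numerical sketch of that point (the microstate count below is an arbitrary illustrative number):

```python
import math

k_B = 1.380649e-23  # J/K

W = 10**20                          # microstate count -- illustrative only
S_conventional = k_B * math.log(W)  # entropy in J/K, with k_B up front
S_dimensionless = math.log(W)       # Ben-Naim's convention: no k_B at all

T_in_kelvin = 300.0
T_in_joules = k_B * T_in_kelvin     # temperature measured in energy units

# The physically meaningful product T*S is identical either way;
# k_B is just a unit conversion between the two bookkeeping schemes.
print(T_in_kelvin * S_conventional)    # J
print(T_in_joules * S_dimensionless)   # J -- the same number
```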
Sal, you seem very quick to take offense. You used Sackur-Tetrode to obtain “the absolute entropy of helium (to a reasonable approximation)”, without any qualification whatsoever. I was merely pointing out that Sackur-Tetrode goes off the rails at low temperatures; maybe I was a little unfair in referring to (very low) temperatures below one kelvin — I was enjoying myself too much getting your spreadsheet to return negative values. You are correct that, at one kelvin, 4He is no longer a gas, but 3He can remain in gas form under these conditions, so, according to your specification, Sackur-Tetrode should still apply. Does it?
Gosh, there’s an interesting thought: what happens to the specific heat capacity of 4He and 3He under these circumstances?
You never did answer my questions about specific heat capacity. Is it a constant? How do you know?
P.S. Have you read Ben-Naim’s derivation of Sackur-Tetrode? He breaks it down into four components.
I’m glad you put the Ben-Naim quote I excerpted up again. I guess you concur after all that it is key to this issue. As I said when I posted it, I encourage everyone to read it carefully and determine for themselves how much force it has.
FWIW, this is the passage that I think is central (but again, WTHDIK):
Once the identification of temperature as a measure of the average kinetic energy of the atoms had been confirmed and accepted, there was no reason to keep the old units of K. One should redefine a new absolute temperature, denoting it tentatively as T̄, defined by T̄ = kT. The new temperature T̄ would have the units of energy and there should be no need for the Boltzmann constant k. The equation for the entropy would simply be S = ln W….
walto,
No, you aren’t glad. It shows that you had already quoted a refutation of Lambert’s objection without even realizing that it amounted to a refutation.
You claimed it was Ben-Naim’s “basic argument for his case.” It isn’t, as I explained:
walto:
Both you and I quoted it. Problem is, you didn’t realize that it was a refutation of Lambert, who in your later quote falls into exactly the trap I described: thinking that the physics comes into the equation via kb. It doesn’t, and in fact kb is altogether unnecessary, as Ben-Naim and I point out.
walto,
I’ve shown you that Lambert’s objection is fallacious. Yet you claimed that it was somehow a response to all six of my points against dispersalism:
How does that single fallacious objection refute each of my six points? Be specific.
Ah, the trap!
Thermodynamics
an introduction to the physical theories of equilibrium thermostatics and irreversible thermodynamics
Herbert B. Callen
John Wiley and Sons, Inc.
1960
Postulate I. There exist particular states (called equilibrium states) of simple systems that, macroscopically, are characterized completely by the internal energy U, the volume V, and the mole numbers N1, N2, … Nr of the chemical components.
Postulate II. There exists a function (called the entropy S) of the extensive parameters of any composite system, defined for all equilibrium states and having the following property. The values assumed by the extensive parameters in the absence of an internal constraint are those that maximize the entropy over the manifold of constrained equilibrium states.
Postulate III. The entropy of a composite system is additive over the constituent subsystems. The entropy is continuous and differentiable and is a monotonically increasing function of the energy.
Postulate IV. The entropy of any system vanishes in the state for which … (that is, at the zero of temperature)
This postulate is an extension, due to Planck, of the so-called Nernst postulate or third law of thermodynamics. …
The foregoing postulates are the logical bases of our development of thermodynamics.
Thermodynamics is the study of the macroscopic consequences of myriads of atomic coordinates, which, by virtue of the statistical averaging, do not appear explicitly in a macroscopic description of a system.
Now, what are the chief consequences of the existence of the “hidden” atomic modes of motion with which thermodynamics is concerned?
– Callen (1960). p. 7
Missing information, anyone?
I’m repeating myself, yes. But Salvador claimed in his OP that there is no relationship between entropy and information. He was wrong.
The particular simple states to which thermodynamics applies are called equilibrium states.
– Callen (1960). p. 11
If Salvador or keiths is not talking about equilibrium states, I question whether they are talking about entropy.
Lambert:
Probability is an essential consideration in mixing or expansion, in the sense of a spatially broader and thus of a probably greater distribution for the motional energy of each constituent’s molecules.
Perhaps Sal can explain this.
I can’t speak for Sal, but I’m talking about equilibrium states when I speak of entropy.
However, Callen’s statement is incorrect:
Google ‘non-equilibrium thermodynamics’.
Does your argument depend on non-equilibrium thermodynamics?
Mung.
No.
Mung: Does your argument depend on non-equilibrium thermodynamics?
You mean keiths’ argument? Maybe it would be helpful if somebody posted it again so we could all see it.
It seems to me that the point of thermodynamics is that no one knows the actual microstate. If that’s not an argument for “missing information”, I don’t know what would be.
As seen above, Lambert agrees.
walto,
I didn’t mean to attribute that Lambert quote to mung. My mistake.
walto,
Again, Lambert’s mistake is to think that if k doesn’t involve energy, then the entropy in question can’t be thermodynamic entropy.
That’s simply wrong. The J/K units are an accident of history, as explained above. Had the kelvin been defined in terms of energy, an entropy in J/K would have been energy/energy, and energy would cancel, leaving a dimensionless quantity.
Physics comes into the entropy equations not via the kb constant, but via the number of microstates (the W in the Boltzmann/Planck equation) and the probabilities (the p_i in the Gibbs equation).
You can’t determine W and the p_i’s without considering physics. That’s what puts the “thermodynamic” into “thermodynamic entropy”.
Yeah, I found Lambert’s argument just weird.
walto,
Do you see now why Lambert (and Sal) are wrong?
I can’t. I’ve fallen too far down into the trap and I can’t get up!
Oh, help!
Is that just obstinacy, or do you genuinely think that Lambert’s position is defensible?