The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:
The randomness or disorder of the components of a chemical system is expressed as entropy,
Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.
Gag!
And from the textbook written by our very own Larry Moran:
enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.
Principles of Biochemistry, 5th Edition, Glossary
Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto
Choke!
Thankfully, Dan Styer almost comes to the rescue with his criticism of ID proponent Granville Sewell:
On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.
…
“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.
Is Entropy Disorder?
Dan Styer criticizing Granville Sewell
Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (i.e., small shoe size suggests we’re dealing with a toddler, therefore non-reader). Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.
Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:
Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.
Mike Elzinga
2lot trouble
Hear, hear.
But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?
“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”
Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.
Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:
That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?
Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”
There is no basis in physical science for interpreting entropy change as involving order and disorder.
So in slight defense of Granville Sewell and numerous other creationists, from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus, for equating entropy with disorder: when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!
Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu
April 2014
The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..
Lehninger and Moran’s books aren’t on that list. Their books, however, did make the list of biochem books judged by some Australians as decreasing in fitness:
The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham
…
Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002
…
A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate
Here is chemistry professor Frank Lambert’s informal definition of entropy:
Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.
This is the correct Ludwig Boltzmann entropy as written by Planck:
S = k log W
where
S = entropy
k = Boltzmann’s constant
W = number of microstates
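For concreteness, here is a quick numeric sketch of my own (not from any textbook quoted here) of what the Boltzmann formula says: entropy is just a scaled logarithm of the microstate count, so doubling W adds exactly k ln 2 to S, no matter how large W already is.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K (CODATA defined value)

def boltzmann_entropy(W):
    """S = k_B * ln(W), in J/K, for W equally probable microstates."""
    return k_B * math.log(W)

# Doubling the number of accessible microstates adds k_B * ln(2) to S,
# regardless of the starting value of W.
delta_S = boltzmann_entropy(2e20) - boltzmann_entropy(1e20)
```

Note there is no reference to any observer in this formula, only a count of microstates.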
Also there is the Clausius definition:
delta-S = Integral of (dq_rev / T)
where
delta-S = change in entropy
dq_rev = inexact differential of heat (exchanged reversibly)
T = absolute temperature
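Here is a hedged worked example of the Clausius integral, with my own illustrative numbers: heating 100 g of water from 293 K to 353 K, assuming a constant specific heat for liquid water, integrating dq/T numerically and comparing against the closed form m*c*ln(T2/T1).

```python
import math

# Assumed inputs (illustrative, not from the post): 100 g of water,
# constant specific heat c = 4.184 J/(g*K), heated from 293 K to 353 K.
m, c = 100.0, 4.184
T1, T2 = 293.0, 353.0

# Numerically integrate dS = dq/T, with dq = m*c*dT, using midpoint steps.
steps = 100_000
dT = (T2 - T1) / steps
delta_S = sum(m * c * dT / (T1 + (i + 0.5) * dT) for i in range(steps))

# Closed form for constant c: delta_S = m*c*ln(T2/T1), roughly 78 J/K here.
exact = m * c * math.log(T2 / T1)
```

Again, the entire calculation runs on heat and temperature data; no observer appears anywhere.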
As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”
My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.
Heh. Not even if it’s hot enough to melt the frozen-in entropy?
Imagine a tabletop covered with nickels and dimes, flying around and bumping into each other. If a “heads” bumps into a “tails”, they both flip. There are more microstates where the heads are evenly distributed between nickels and dimes than microstates where almost all the heads are on one type of coin. Over time, the proportion of heads will become roughly equal between nickels and dimes.
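The coin picture can be simulated directly. This is a toy sketch of the setup described above, simplified so that only nickel-dime bumps matter (a heads-tails bump between two coins of the same type flips one head to tails and one tails to heads, leaving that type’s head count unchanged):

```python
import random

random.seed(0)
N = 1000
nickels = [1] * N   # 1 = heads, 0 = tails; nickels start all heads
dimes   = [0] * N   # dimes start all tails

# Random "bumps": when a heads meets a tails, both coins flip.
for _ in range(200_000):
    i, j = random.randrange(N), random.randrange(N)
    if nickels[i] != dimes[j]:
        nickels[i] ^= 1
        dimes[j] ^= 1

# Total heads is conserved (1000), but the heads drift toward an even
# split between nickels and dimes, because far more microstates look
# that way than look like "all heads on the nickels".
nickel_heads = sum(nickels)
dime_heads = sum(dimes)
```

After enough bumps, nickel_heads and dime_heads both hover near 500, with fluctuations on the order of the square root of the coin count.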
THAT’s why “Heat cannot, of itself…”
Yes. So?
Well Gibbs doesn’t need to describe them, just count them (or more often, calculate the ratio of W1 to W2).
And very few people really care about the isothermal expansion of ideal gases.
On the other hand, good luck explaining the chelate effect or the hydrophobic effect in Clausian terms…
Sal is mostly repeating mistakes that were pointed out earlier in the thread. This one, however, appears to be a new mistake:
There’s a lot of wrongness in those two sentences.
First, we’ve been talking about mixing, not unmixing. The entropy increases during mixing but energy dispersal does not. Therefore entropy cannot be a measure of energy dispersal.
Second, the mixing process is not reversible. “Reversible” in thermodynamics is a technical term. It does not refer merely to something that can be undone; it refers to a process that can take place with no net change to the entropy of the universe, and can then be reversed, also with no net change to the entropy of the universe.
The system is isolated during mixing, meaning that there is no exchange of energy with the environment. The entropy of the system increases due to mixing, but the entropy of the environment does not. Therefore there is a net increase in the entropy of the universe, and the process is not reversible. (Successfully running it backwards would decrease the entropy of the universe, in violation of the Second Law.)
Third, if the mixing actually were a reversible process, as Sal claims, then any heat removed from the system during unmixing would have to be exactly equal to the heat added during mixing. No heat is added during mixing. Therefore, if the mixing were reversible, no heat would be removed during unmixing, contrary to Sal’s claim.
Fourth, it should be obvious that heat cannot be removed during unmixing, because that would decrease the temperature. The ending temperature would be lower than the starting temperature. You haven’t reversed a process unless you end up back at the starting line.
Fifth, merely removing heat from the system is not going to get you back to a condition in which the molecules of gas A are all on one side of the chamber and the molecules of gas B are all on the other.
Sal has no business trying to teach thermodynamics to anyone. He doesn’t even understand the basics.
We’ve mostly been talking about gases so far. Here’s a solid-state example of why entropy is not a measure of energy dispersal, from John Denker of Bell Labs:
He’s right, and Sal is wrong. Again.
Hi keiths, how do you propose that we measure the entropy of the universe? What calculation do we use to determine the entropy of the universe?
According to Ben-Naim the entropy of the universe is undefined and therefore a nonsense term.
Mung,
The absolute entropy of the universe doesn’t matter. It’s the change in entropy due to a particular process that concerns us here.
Not sure why you think that helps your claims. The change in entropy of what, the universe? What would that even mean?
In order to determine the entropy change you need to know the state of the universe, because the entropy is a state function. What state is the universe in?
Do you know the temperature, volume, and number and types of all the particles in the universe?
You really need to read more Ben-Naim. 🙂
Mung,
You need to read up on reversible processes.
From Wikipedia:
Note the word “surroundings”.
As keiths notes, physicists talk of the change in entropy of the rest of the universe all the time. Any system that exports energy increases the entropy of the rest of the universe by at least dQ/T. The fact that this heat, along with all the other heat in the universe, is flowing down temperature gradients in immeasurable quantities doesn’t affect the validity of the statement at all.
They also claim the universe gets more disordered, and that the energy of the universe gets more dispersed. Nothing prevents physicists from saying nonsensical things. In fact, that’s at least one thing this whole thread has been about.
So it ought to be no surprise that they might say more nonsensical things, especially when it comes to entropy, the second law of thermodynamics, and the arrow of time.
And what does it mean to say that the “missing information” of the universe is increasing? As we learn, we become more ignorant? That’s a humbling thought.
🙂
If the entropy is increasing in an isolated system, does that mean the entropy is also increasing in the universe? Because, you know, the isolated system is part of the universe.
Sal’s law: As Salvador gets smarter, everyone else becomes more ignorant.
: Stupid Physicist Sayings
This is the story of the bit and the universe. The universe is the biggest thing there is and the bit is the smallest possible chunk of information. The universe is made of bits. Every molecule, atom and elementary particle registers bits of information.
– Seth Lloyd (2006)
Mung,
Since you are echoing (or trying to echo) Ben-Naim here, why not provide a relevant quote from him? Then the rest of us can tell you whether we agree or disagree, and why.
Mung:
That’s goofy, Mung.
If I deposit $10,000 into my checking account, the change in balance associated with that transaction is $10,000. I can say that even if I don’t know the balance before and after the transaction.
Let B be the “before” balance, and let ΔB be the amount of the deposit. Then the “after” balance is B + ΔB.
The change in balance is the “after” balance minus the “before” balance, or
B + ΔB – B = ΔB
The Bs cancel out. The answer depends only on ΔB, not on B.
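The same cancellation, as a toy illustration in code (nothing more than the algebra above):

```python
# The change in balance depends only on the deposit, never on the
# (possibly unknown) starting balance B: (B + dB) - B = dB.
def change_in_balance(B, delta_B):
    after = B + delta_B
    return after - B

# Same answer whatever B was:
change1 = change_in_balance(0, 10_000)
change2 = change_in_balance(123_456, 10_000)
```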
Mung,
Yes, the entropy is also increasing in the universe. Can you work out how we know that?
Mung,
The same thing it means to say that the missing information of any isolated system is increasing.
Keep in mind that the Second Law is probabilistic, not absolute. In the very long run, there can be downward fluctuations in entropy. That’s what Boltzmann brains are about.
In terms of “missing information” measured in “bits” using thermodynamics? I confess I cannot. Can you?
Do you know the Pressure of the universe? Do you know the Volume of the universe? Do you know the Temperature of the universe? Do you know the total energy of the universe? Do you know the number of particles? Do you know whether the universe is at equilibrium?
I’m beginning to get a sense of what walto must have been experiencing. Yes, if there was a change in the balance of your bank account there was a change in the balance of your bank account. Well, yeah.
If you withdraw 10,000.00 from your bank account does the bank account of the universe increase or decrease or remain unchanged? If you deposit 10,000.00 into your bank account does the bank account of the universe increase or decrease or remain unchanged?
So “the entropy of the universe” can, in fact, decrease?
The universe is an isolated system?
Try to focus, Mung.
Do you see why, if I deposit $10,000 into my checking account, the change in balance associated with that transaction is $10,000? Can you see that this remains true even if I don’t know the account balance?
Once you understand this, we can move on to entropy.
A quote from him about the entropy of the universe? Sure.
In the meantime, you can tell us how to measure the entropy of the universe.
keiths:
Mung:
Yes, though it’s enormously unlikely to do so anytime soon.
Ditto for smaller isolated systems, like the containers of gas we’ve been talking about. It’s not impossible for all the molecules of an evenly distributed gas to concentrate themselves in one corner of the container — just mind-bogglingly unlikely.
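A quick back-of-envelope sketch of just how unlikely, assuming independent molecules, each equally likely to sit in either half of the container:

```python
import math

# Probability that N independent molecules all land in the left half
# of the container at once: (1/2)^N.
def prob_all_left(N):
    return 0.5 ** N

# Even 100 molecules make this absurdly improbable.
p100 = prob_all_left(100)

# For a macroscopic sample (~1e23 molecules) the probability underflows
# any float, so we report log10 of it instead: about -3e22.
log10_p_macroscopic = 1e23 * math.log10(0.5)
```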
Mung,
Why, when nothing I’m saying depends on knowing the value?
I am focused. Let’s assume your bank account is an isolated system. How do you pass your 10k across the boundary? What travels back across the boundary in the opposite direction, your balance?
Entropy is not a function of time. “Anytime soon” has nothing to do with it. But at least you are now talking about changes in entropy of the universe. How do we measure the before_entropy and the after_entropy of the universe?
If my bank account is an isolated system, then nothing passes across the boundary. By definition.
You know the definition of an isolated system, right? It’s a simple matter of applying the definition. Why can’t you figure these things out for yourself?
Mung:
That’s a keeper.
If you want to claim that “the entropy of the universe” is a meaningful concept then you ought to know the before and after values, because if you don’t, you can’t calculate the change.
How do you know 10k was deposited into your bank account? You look at the balance before the deposit and the balance after the deposit.
balance_after_deposit – balance_before_deposit = deposit_amount
It’s true.
You want a quote from Ben-Naim?
Why can’t you defend your claims? 🙂
Mung,
Do the words “before” and “after” have anything to do with time, in your opinion?
Mung:
I deposited it, doofus. The change in balance due to my deposit was $10,000, and I can say that even if I don’t know the balance.
Are you drinking tonight, by any chance?
Sure. But if you want to show that entropy is a function of time you’ll need more than that. As time changes, entropy changes as a function of time … How? You might want to revisit your arguments against energy dispersal.
Why would you even be pursuing this? Have you actually thought this out previously or do you simply think I must be wrong because … ?
So you merely think that you deposited 10k. got it.
How do you know there was a change in your balance? How do you know how much of a change to your balance there was?
Without actually checking.
Faith?
Yes, I’ve had some wine. So? I still know how to check my balance. 🙂
Chill dude. “I know my balance changed because I made a deposit” leaves a lot to be desired. Most people know by checking their balance after the transaction.
Smart people understand that B + ΔB – B = ΔB .
Perhaps we should continue this discussion after your BAC drops back down.
Hi keiths,
Have you revised your opinion yet about what constitutes a process in thermodynamics? Do you still think that if the macrostate does not change there is no process? Do you still think that the “mixing” of indistinguishable particles is not a process because there’s no change in entropy?
Practical people check their B, especially if they are depositing 10k. The flip side of this is, of course, that if I withdraw 10k, I actually count the money, I don’t rely on the balance statement.
Hi keiths,
Have you revised your opinion yet about whether Shannon information is information in the colloquial sense? Do you still think that Claude Shannon gave us a way to measure any and all information?
And perhaps you should provide some evidence that I am being irrational.
balance_after_deposit – balance_before_deposit = deposit_amount
p.s. Please explain how the balance changes with time, except when the balance does not change with time, because the balance is a function of time.
Reviewing Sal’s comments, I found another mistake of his that I haven’t yet addressed:
Sal claims, following Lambert, that entropy is a measure of energy dispersal. Where is the change in dispersal here? In both cases, all of the system’s energy is concentrated in a single helium atom. The number of possible microstates has changed — there are fewer locations for the atom to occupy — but energy dispersal remains the same.
Entropy changes, but energy dispersal doesn’t. They are not the same thing.
fifth,
keiths:
fifth:
Huh? That does not follow. At all.
Keiths and DNA_jock are proponents of framing entropy in terms of ignorance of the observer. I think that is a convoluted way of looking at the issues. I pointed out in the OP:
DNA_jock and Keiths pretend I didn’t even write those words, nor even connect Boltzmann to information theory as I did in 2015:
But many real world measurements of entropy are done with thermometers and calorimeters. That’s why DNA_jock and Keiths can’t compute the change in entropy of a 20 gram melting ice cube using their non-existent ignorance meters.
One can google “ignorance meter” and about the only substantive hit is this one. Not much good for making entropy measurements, is it? It does however highlight the folly of observer-dependent measurements in trying to make objective measures of entropy:
I thought you were analogizing your checking account to the universe–which is an isolated system. You’d have to be inside your checking account, to make this analogy work, wouldn’t you?
I think you’re missing the quibbling aspect of this dispute. Suppose you were interested in measuring the difference in the weight of an object as measured with something made by a middle-schooler with marked cardboard, string and a potato, and as measured with a fancy digital weighing instrument used at MIT. When you calculate the difference–are you quantifying a difference in weight? Yes. Is it also a measure of the difference in information? Yes.
Sal,
I think we can agree that if you were trying to determine the entropy of a system under the missing information view, then you would go looking for an “ignorance meter”. Fortunately, the worth of an idea isn’t determined by its use in the hands of the incompetent.
Someone who actually understands thermodynamics would recognize that thermometers, calorimeters, etc., give us information about a system — information that can be used to specify a macrostate — and that entropy is a measure of the gap between the information contained in the macrostate and the information needed to specify the exact microstate.
Even you seem to get that at least some of the time. For example, you’ve written:
The problem is that you can’t keep your position straight. Elsewhere you’ve expressed agreement with Elzinga’s assertion that entropy has nothing to do with information, and here you’re pretending that the missing information view depends on the existence of “ignorance meters”.
Which is it? Make up your mind, please.
(Hint: walto was right to abandon dispersalism and embrace the “missing information” view, painful though that must have been. The dispersalist position is hopeless, for reasons that have been pointed out again and again in this thread, but no one has produced even a single scenario in which the “missing information” interpretation fails.)
walto,
The checking account balance is analogous to the total entropy of the universe.
If I make a deposit of ΔB dollars, I know that the account balance B changes by ΔB dollars, even if I don’t know what B is.
Likewise, if a thermodynamic process generates an entropy change of ΔS, I know that the effect of that process on the total entropy S of the universe is ΔS, even if I don’t know the value of S.
You are probably right, but it’s probably because there is so much more to thermodynamics than the issues about information that I missed the quibble.
Let me give another angle to this as it relates to colewd’s quote of Einstein. One can interpret Boltzmann:
S = kB ln W
as a measure of our ignorance about the details of a system. I already said I don’t like this interpretation of Boltzmann. Even though the information theoretic description might be shown to be formally correct, it is a convoluted way of looking at things.
It can also be shown that it is formally correct to say entropy is simply a scaled logarithmic measure of microstates. This would be a less convoluted way of interpreting Boltzmann — where microstates are the set of coordinates for all particles in terms of (X, Y, Z, P_x, P_y, P_z). The microstate description is more rigorous and less deepity woo than the information theoretic description.
But if we say we lack information or are ignorant, I’m tempted to say “what else is new Captain Obvious!” What is important from a chemical and engineering standpoint is that the measure of our ignorance and lack of information in principle is an indicator of how energy will flow in car engines and chemical reactions.
There is observer-dependent information, and there is observer-INdependent information. When there is observer-independent information, we may as well dispense with information theory altogether.
When a chem student can calculate the change in entropy in a melting ice cube without information theory, then practically speaking, imho, information theory is a gratuitous superficial add-on. It’s not formally wrong, but of little utility.
Taking your example with thermometers. Say we have a student with an old mercury thermometer that is hard to read vs. one that is high end digital. We have the inherent belief that when there is a change of entropy in a melting ice cube, there is a real change of entropy and there is a “right” answer for the change in entropy as far as how much heat was involved in the process (going from ice to liquid).
There may be differences in information due to our instrument errors. BUT the ignorance due to our instrument errors is different than the “ignorance-in-principle” or the observer-INdependent information of the system.
What do I mean by observer-INdependent information?
I said that, hypothetically, a 20 gram ice cube melted to liquid water has a change of entropy of:
delta-S = 6660 J / 273 K = 24.39 J/K
24.39 J/K / kB = 24.39 J/K / (1.381 x 10^-23 J/K) = 1.7665 x 10^24 natural bits (nats)
1.7665 x 10^24 nats x (1 / ln 2) = 2.5485 x 10^24 Shannon bits
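For anyone who wants to check the arithmetic, here is the same calculation as a short script, using the post’s own inputs of 6660 J of fusion heat at 273 K:

```python
import math

# Inputs as given in the post: 20 g ice cube, 6660 J of heat of fusion,
# melting at 273 K; kB as quoted in the post.
q, T = 6660.0, 273.0
k_B = 1.381e-23                 # Boltzmann's constant, J/K

delta_S = q / T                 # thermodynamic entropy change, ~24.39 J/K
nats = delta_S / k_B            # dimensionless, in natural units (nats)
bits = nats / math.log(2)       # nats converted to Shannon bits
```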
Everyone, independent of their lab equipment, will agree that the amount of information in each microstate, or the amount of ignorance we have about the system, should objectively come out to the same figure for the hypothetical scenario, given that we are using idealized thermometers and calorimeters. We have defined objectively what the limits of our knowledge are, namely restricted to idealized thermometers and calorimeters.
We may have different actual instrument measurements, but we agree on what the objective value ought to be if we had accurate instruments. The ignorance and information is a matter of principle, not observer dependence.
The problem is people can conflate their subjective information with the hypothetical objective information for a given process.
I was just harping on Keiths, that his ignorance approach to entropy is almost a Captain Obvious approach. Unless the quantified ignorance is tied to how energy behaves in a system, for real world applications it’s a near useless fact. As I’ve tried to show, one cannot even calculate the bits of ignorance in the first place without energy and temperature data, such as in the melting ice cube example.
Hence, for many practical applications of entropy, information theory is not of immediate relevance. It is worth noting the Clausius statement of the 2nd law which motivated his definition of entropy (a non-rigorous paraphrase):
The slightly more rigorous statement by Clausius himself:
Clausius defined entropy according to this version of the 2nd law. It is worth noting how energy-centric and temperature-centric this statement of the 2nd law is, and it is worth noting that Clausius’s definition of entropy is energy-centric and temperature-centric as well!
The Boltzmann/Shannon formulations of entropy are nice academic exercises in quantifying lack of information or abundance of ignorance, but as colewd’s quote of Einstein suggested, without connection of these information metrics to heat and temperature:
S = kB ln W
it borders on being a not very practical theory. To the extent it can be connected to heat and temperature and thus the 2nd law (which is among the most fundamental laws), it becomes useful and insightful.
The value of Boltzmann was that he connected (approximately) the phenomenon of heat and temperature to the statistical properties of lots of billiard balls we call molecules that are approximately Newtonian in behavior. Boltzmann and Gibbs lucked out that this collective Newtonian idealization of molecules is pretty darn close to the more accurate Quantum mechanical description.
stcordova,
As you can see, there is a lot of physics and cosmology that developed based on Boltzmann’s equations. In addition to Hawking, Penrose and Susskind (string theory) developed theories based on Bekenstein’s work. Moral: we can’t just take these big overarching theories for granted, because they can mislead science.
On the one hand, Sal is telling us that the connection between entropy and information is “obvious” and “formally correct”.
On the other, he agrees with Mike Elzinga that entropy has nothing to do with information and has characterized the missing information view as “deepity woo”.
Has his account been hijacked, or is he really this confused?