In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem with the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.

entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue with his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (a small shoe size suggests we’re dealing with a toddler, and therefore a non-reader), yet shoe size is not reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So, in slight defense of Granville Sewell and numerous other creationists, from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus, who equated entropy with disorder: when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger and Moran’s books aren’t on that list. Their books, however, did make the list of biochemistry texts judged by some Australian researchers to be decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates

Also there is Clausius:

delta-S = Integral (dq_rev / T)

where
delta-S = change in entropy
dq_rev = inexact differential of heat (q) transferred reversibly
T = absolute temperature
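
To make the point concrete, here is a minimal sketch of my own (not part of the original post) that evaluates both definitions numerically in Python. The microstate count, the choice of gas, and the temperatures are made-up values used purely for illustration. Notice that neither calculation ever asks whether anything is “ordered” or “disordered”.

```python
import math

# Boltzmann/Planck form: S = k ln W
k = 1.380649e-23       # Boltzmann's constant, J/K
W = 10**20             # hypothetical number of microstates (illustrative only)
S = k * math.log(W)
print(f"S = k ln W = {S:.3e} J/K")

# Clausius form: delta-S = Integral of dq_rev / T
# Illustrative case: reversibly heating 1 mol of a monatomic ideal gas at
# constant volume from 300 K to 600 K.  There dq_rev = n * Cv * dT, so the
# integral evaluates to delta-S = n * Cv * ln(T2 / T1).
R = 8.314              # gas constant, J/(mol K)
n, Cv = 1.0, 1.5 * R   # moles and molar heat capacity (monatomic ideal gas)
T1, T2 = 300.0, 600.0  # kelvin
delta_S = n * Cv * math.log(T2 / T1)
print(f"delta-S = n Cv ln(T2/T1) = {delta_S:.2f} J/K")
```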

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. Walto, for whom I have affection, is out of his depth in this discussion.

    Fess up, Walto. You’ll feel so much better.

  2. I’ve had that feeling also. If it were keiths against all the scientifically literate members, I’d hesitate to take his side. But it’s more like Walto against people who have taken graduate level courses in the subject.

  3. petrushka:
    I’ve had that feeling also. If it were keiths against all the scientifically literate members, I’d hesitate to take his side. But it’s more like Walto against people who have taken graduate level courses in the subject.

    I hitched my wagon to Prof. Lambert’s star. But that was a trap, apparently. Several guys here disagree with him. And some may have taken a graduate level course in thermodynamics. 🙁

    I haven’t claimed to be an expert. What’s the interest or importance in convincing me?

  4. Pedant:

    Walto, for whom I have affection, is out of his depth in this discussion.

    Fess up, Walto. You’ll feel so much better.

    Pedant is right, walto. Why not give it a shot?

  5. walto,

    Sal bailed out of the discussion rather than admitting that he couldn’t defend the dispersalist position. Surely you can do better than Salvador Cordova, of all people.

    Do you think Lambert’s position is defensible? Can you refute my six points against dispersalism and answer this question?

    If so, please do so. If not, why not take Pedant’s advice and “fess up”?

  6. Keith, my concerns with the information paucity theory have little to do with your six points. I’ve mentioned those concerns several times. Could I be mistaken? Absolutely; perhaps they’re confused and easily handled. As indicated, I’m far from an expert on this issue. But I’m inclined to doubt that the repetition of remarks that have little or nothing to do with my doubts about the information claim will have much effect on me.

    My questions, about possibility spaces, degrees of freedom, the apparent inconsistency of 2LOT with a macrostate looked at from a Laplacian genius’s perspective, utility for the physical sciences, and so on, are simple, and if there were some interest and ability to answer them exhibited here, maybe I could be convinced. (Maybe Joe F. could be too; heck, maybe you could win over Lambert for all I know!) But if convincing people is your goal, rather than WINNING some argument I don’t care much about, you’ll probably have to respond to questions and concerns rather than just repeat stuff you like to say. You’ve said you don’t want to do that. And it’s true you’re under no obligation. C’est la vie. But it’s weird that you fault me for the result of your choices.

    Again, I’m not sure why anyone should want to convince me of this, any more than I understand why FMM should want my position on Molinism v. Arianism (if those are spelled right). Maybe it’s a compliment. If so, thanks. But neither is an area I’m either knowledgeable about or particularly interested in. In this case, in fact, I think the dispute is a quibble. Anyhow, I suggest you turn your sights on Joe. He’s a scientist and I’m not, so perhaps there’s more at stake there. Dunno, but if so, go argue with him.

  7. If I am to be the designated audience who is to be convinced, the way to do so is to answer one question I have: why does it matter whether this quantity or that quantity is the one to be called “entropy”?

    Does naming matter? Is there some scientific consequence, in the sense that we can make some prediction from knowing that the “entropy” is X? Or is the “entropy” something that merely has to end up being consistent with what we would predict, the prediction being made by other means?

    Clearly there is something I’m missing here.

  8. Joe Felsenstein,

    I’ll take a stab at this from my admittedly low vantage. Entropy was considered a measure of “disorder” for a long time. But it came to be noticed that nobody could actually be measuring that as if it were independent of some specific description of the state and some specific concept of “order”: probabilities being what they are.

    That conclusion seemed to make the quantity subjective. So those using it for their everyday work looked around for what (if not disorder) it could be that’s always increasing, and figured that it might be some sort of energy dispersal. THAT must be what they’d actually been measuring.

    However, as keiths and Jock argue in this thread, there are problems with treating whatever it is that physical systems gain as these macrostates get older as the dispersal of anything. (Sal disagrees. I’m not in a position to opine myself on whether the dispersal theory can withstand those criticisms.) In any case, those critics, with the support of Shannon’s information theory, have returned to something more like disorder: something that is probability based, and thus relative to some specification of knowledge of what is happening.

    That move has pissed off Lambert and others for the reasons I have quoted above. He believes that, when the term is made useful for chemists, engineers and others, there has been a cheat: the objective physical properties he’s measuring have been pumped into the ‘missing information’ calculations, making entropy a two-headed thing. If they hadn’t been pumped in, he claims, such calculations would be useless for the physical sciences even if useful for information science (and, I think, 2LOT would be false, as it seems clearly to be from Damon’s perspective).

    So, yes, it’s kind of a quibble, since both schools get the same result. But, as keiths has rightly urged, it’s not entirely without substance, and is certainly not the sort of thing that either school will give up. One side has Boltzmann’s equation, the other has working chemists. And so the fight continues.

  9. Joe Felsenstein:
    Is there a case where the one decreases but the other increases?

    I think that would happen only if you stripped the (I’ll call it) “energy dispersal aspect” from the equations the information theory relies on. I think you can see how it’s snuck in from the Ben-Naim quotes above.

  10. BTW, for those with an abiding interest in THE SIX POINTS, I think what Lambert’s general response to them would be can be sussed out from this:

    http://entropysite.oxy.edu/entropy_isnot_disorder.html

    IMPORTANT NOTE: The initial statement above (about the movement of molecules into a larger volume as solely due to their motional energy) is only valid as a description of entropy change to high school students or non-scientists when there will not be any further discussion of entropy.[…] Entropy change is certainly enabled in chemistry by the motional energy of molecules (that can be increased by bond energy change in chemical reactions) but thermodynamic entropy is only actualized if the process itself (expansion, heating, mixing) makes accessible a larger number of microstates.

    […]

    Entropy change is measured by the dispersal of energy: how much is spread out in a process, or how widely dispersed it becomes — always at a specific temperature.

    Emphasizing that entropy is about available microstates in the state (i.e., phase) space that belong to the macrostate, and not simply about the volume occupied, addresses issues with mixing. Defining entropy as energy dispersal at a fixed temperature answers issues related to the units of entropy being J/K (or dimensionless if temperature is defined to have units of J).

    This is more a matter for Sal (who seems to have ditched) than me, however. Not only am I inexpert, but, as I’ve said, my concerns are, like Joe’s, to a large extent orthogonal to the specific quantity Lambert claims scientists are measuring.

  11. petrushka: But it’s more like Walto against people who have taken graduate level courses in the subject.

    Grade school level course. I didn’t mean to mislead anyone. Sorry if I did.

  12. If someone could ask a simple question, I could probably give a simple answer. 🙂

    For one thing, the information view of entropy helps us understand that the second law does not prohibit highly improbable configurations. I don’t see how that is a benefit to creationists. A configuration may be highly unlikely, our likelihood of ever seeing that configuration may be near to nil, but that doesn’t mean it is physically impossible.

    That doesn’t mean there is no second law; it’s just that it has to be understood probabilistically. But it still has a physical basis.
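
A quick numerical illustration of “improbable but not impossible” (an editorial sketch, not part of the comment above; the coin counts are arbitrary):

```python
# Probability that N fair coins all land heads: it shrinks exponentially
# with N but never reaches zero, so the configuration is improbable,
# not impossible.
for N in (10, 100, 1000):
    print(f"N = {N:4d}: P(all heads) = {0.5 ** N:.3e}")
```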

  13. Mung: If someone could ask a simple question, I could probably give a simple answer

    OK, How can the total entropy of an isolated system always decrease over time if the isolated system (or macrostate) is specified by one who knows all the laws of physics and all aspects of the microstates involved in every macrostate? Isn’t the claim, that is, that, for Damon, the entropy of every system is zero a reductio of the information theory because it is inconsistent with 2LOT?

  14. walto: OK, How can the total entropy of an isolated system always decrease over time

    Haha. Sorry, I meant “increase”!

  15. walto,

    OK, How can the total entropy of an isolated system always decrease [increase] over time if the isolated system (or macrostate) is specified by one who knows all the laws of physics and all aspects of the microstates involved in every macrostate? Isn’t the claim, that is, that, for Damon, the entropy of every system is zero a reductio of the information theory because it is inconsistent with 2LOT?

    There’s no 2LOT violation, even for Damon, because the 2LOT only requires that the change in entropy be greater than or equal to zero for an isolated system.

    The 2LOT is very boring for Damon, but it isn’t violated.

  16. walto,

    BTW, for those with an abiding interest in THE SIX POINTS, I think what Lambert’s general response to them would be can be sussed out from this:

    <snip Lambert quote>

    Nothing in that excerpt refutes any of my six points.

    If you disagree, let’s hear some specifics.

  17. keiths,

    I thought that can only happen at a temp of absolute zero. That need not be the case for Damon on your view, right? Wouldn’t that require a reinterpretation…or a complete re-write?

  18. walto,

    I thought that can only happen at a temp of absolute zero. That need not be the case for Damon on your view, right? Wouldn’t that require a reinterpretation…or a complete re-write?

    The entropy can only be zero for us when the temperature is absolute zero, because that’s the only case in which we can know the exact microstate.

    The entropy is always zero for Damon, at any temperature, because he always knows the exact microstate.

    What’s true for everyone is that the change in entropy of an isolated system is greater than or equal to zero.
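
An illustrative sketch (mine, not keiths’; the four-microstate toy example is arbitrary): in the Gibbs form S = -k * sum(p_i ln p_i), an observer who knows the exact microstate assigns probability 1 to that microstate and 0 to every other, so the sum is zero at any temperature, while an observer who can only narrow things down to several equally likely microstates gets a positive value.

```python
import math

k = 1.380649e-23  # Boltzmann's constant, J/K

def gibbs_entropy(probs):
    """S = -k * sum(p * ln p), using the convention 0 * ln(0) = 0."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# An ordinary observer: four candidate microstates, considered equally likely.
print(gibbs_entropy([0.25, 0.25, 0.25, 0.25]))  # k * ln 4, greater than zero

# An observer like Damon who knows the exact microstate: one probability is 1.
print(gibbs_entropy([1.0, 0.0, 0.0, 0.0]))      # exactly 0.0, at any temperature
```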

  19. I’ve said my piece on your SIX POINTS. I leave it to more competent readers to agree with you or Lambert as they wish. It’s neither my fight nor my field; I just speculated on how I think he might respond and indicated why I thought such a response was apposite in my post. If his response is insufficient, so be it. You and pedant and petrushka can decide the matter just as you like.

  20. keiths: The entropy can only be zero for us when the temperature is absolute zero, because that’s the only case in which we can know the exact microstate.

    The entropy is always zero for Damon, at any temperature, because he always knows the exact microstate.

    That’s all question-begging, as I hope you can see.

  21. walto,

    I’ve said my piece on your SIX POINTS. I leave it to more competent readers to agree with you or Lambert as they wish. It’s neither my fight nor my field; I just speculated on how I think he might respond and indicated why I thought such a response was apposite in my post. If his response is insufficient, so be it. You and pedant and petrushka can decide the matter just as you like.

    That excerpt from Lambert doesn’t even address my six points, much less refute them. To rescue dispersalism, you’d need to refute all six.

    One thing I hope we agree on is that you, walto, cannot refute any of the six points. Agreed?

  22. At least half of them beg the question in favor of your theory. I certainly can’t “refute” those. Maybe not the others either, but…why should I want to? And again, why is that important to you? Who the fuck am I?

  23. keiths:

    The entropy can only be zero for us when the temperature is absolute zero, because that’s the only case in which we can know the exact microstate.

    The entropy is always zero for Damon, at any temperature, because he always knows the exact microstate.

    walto:

    That’s all question-begging, as I hope you can see.

    I think I can see where some of your confusion is coming from. Earlier, you wrote:

    One side has Boltzmann’s equation, the other has working chemists. And so the fight continues.

    That’s a misconception. Everyone agrees on the equations — disorderists, dispersalists, informationists alike — including Lambert. The Boltzmann entropy equation is no exception.

    The Boltzmann equation and the Gibbs equation are not under dispute. They tell you that Damon will always see an entropy of zero, regardless of temperature, because he knows the exact microstate.

  24. walto,

    At least half of them beg the question in favor of your theory. I certainly can’t “refute” those.

    They don’t beg the question. See above.

    Maybe not the others either, but…why should I want to?

    Because you’ve been defending the dispersalist view. My six points show that it cannot be correct.

    If you can’t refute them, why are you still a dispersalist?

  25. walto,

    Here’s what I think you’re missing:

    My six points against dispersalism are just that — six points against dispersalism.

    They do not assume the truth of the “missing information” interpretation.

  26. walto,

    And again, why is that important to you? Who the fuck am I?

    You’re just a guy on the Internet, and so am I. We’ve been having a discussion.

    For days, you’ve been defending the dispersalist view of entropy and attacking the informationist view. I’ve been doing the opposite. Others have been following our discussion.

    To bail out at this point, as Sal did, is a disservice to those readers. Let’s see if we can resolve the dispute(s) or at least pinpoint where our disagreements lie.

  27. keiths: Others have been following our discussion.

    That might even be true. I, for one, have been glancing in, wondering whether I might learn something about entropy, what it is and what its significance is.

  28. keiths, to walto:

    For days, you’ve been defending the dispersalist view of entropy and attacking the informationist view. I’ve been doing the opposite. Others have been following our discussion.

    To bail out at this point, as Sal did, is a disservice to those readers. Let’s see if we can resolve the dispute(s) or at least pinpoint where our disagreements lie.

    Alan:

    keiths: …resolve the dispute…

    Is that important? Who is right?

    Not sure how you got from my statement to yours. To resolve the dispute, we need to identify which view is correct (if either). That has the side effect of establishing who is right and who is wrong, but that is not the purpose of resolving the dispute.

  29. Joe,

    If I am to be the designated audience who is to be convinced, the way to do so is to answer one question I have: why does it matter whether this quantity or that quantity is the one to be called “entropy”?

    This is not a case of two quantities competing for the name “entropy”. Everyone agrees that thermodynamic entropy is the quantity given by the Boltzmann and Gibbs equations.

    The disagreement is over the interpretation of that single quantity.

    The disorderists claim that it measures disorder, the dispersalists claim that it measures energy dispersal, and the informationists say it’s a measure of missing information.

    Does naming matter? Is there some scientific consequence, in the sense that we can make some prediction from knowing that the “entropy” is X?

    There’s no consequence for scientific calculations, because everyone agrees on the equations. There are major consequences for scientific understanding, however.

    The “entropy is disorder” misconception underpins the creationist misuse of the 2LOT, with Granville Sewell as the poster child. And even among non-creationists, you’ll continually hear people claiming that a messy room has more entropy than a tidy one, and that the 2LOT guarantees that tidy rooms will naturally become messy if no one intervenes.

    The dispersalist view is something of an improvement over the disorderist view, but it’s still wrong. If you hew to the idea that entropy is a measure of energy dispersal, you’ll misidentify which systems do and do not increase their entropy. The gas mixing case we discussed earlier in the thread is a perfect example of this. I’ll post a separate comment addressing it again.

    I’m not aware of any cases in which the “missing information” interpretation of entropy fails. That’s why I’ve been defending it in this thread.

  30. keiths: 5. Entropy change in the gas mixing case depends on the distinguishability of particles — the fact that the observer can tell ideal gas A from ideal gas B. Yet the underlying physics does not “care” about distinguishability — the motion of the particles is the same whether or not they are distinguishable. If the motion of the particles is the same, then energy dispersal is the same.

    The entropy change depends on distinguishability, so it cannot be a measure of energy dispersal.

    6. Entropy depends on the choice of macrostate. Energy dispersal does not.

    The way energy disperses in a system is dependent on the sequence of microstates it “visits” in the phase space. That sequence depends only on the physics of the system, not on the choice of macrostate by the observer.

    In the Xavier/Yolanda example, Yolanda possesses equipment that allows her to distinguish between the two isotopes of gas X, which we called X0 and X1.

    If she chooses to use her equipment, she comes up with a different macrostate than Xavier and calculates a different entropy value. If she declines to use the equipment, she and Xavier come up with the same macrostate and get the same entropy value. Her choice has no effect on the actual physics, and thus no effect on the actual energy dispersal. Yet her choice does affect the entropy value she measures, and profoundly so.

    She is not measuring energy dispersal. She is measuring the information gap between her macrostate and the actual microstate.

    That stuff is entirely question-begging–it assumes the position you’re supposed to be arguing for. The earlier ones may not be. I leave them to Sal…or anybody else who is interested.

  31. walto,

    That stuff is entirely question-begging–it assumes the position you’re supposed to be arguing for.

    Physicists and chemists, including Lambert, agree that there is an increase in entropy when distinguishable gases are mixed. This is not controversial.

    What Lambert et al don’t realize is that distinguishability is observer-dependent. In my thought experiment, Yolanda can distinguish between the isotopes because she has the right equipment. Xavier, who lacks that equipment, cannot.

    Yolanda sees an entropy increase, while Xavier does not. They are both correct.

    As I wrote:

    5. Entropy change in the gas mixing case depends on the distinguishability of particles — the fact that the observer can tell ideal gas A from ideal gas B. Yet the underlying physics does not “care” about distinguishability — the motion of the particles is the same whether or not they are distinguishable. If the motion of the particles is the same, then energy dispersal is the same.

    The entropy change depends on distinguishability, so it cannot be a measure of energy dispersal.

    With that in mind, ponder DNA_Jock’s thought experiment from earlier in the thread (addressed to Sal):

    Let’s repeat this experiment a million times, but with balls that are dark red and light red. But in each of the experiments, we vary the hue of the balls ever so slightly.
    The entropy of mixing does not change at all, it is always 10.85 J/K, except when the balls are exactly the same color. At this one moment, the entropy of mixing suddenly jumps to ZERO.
    The question I have been asking you repeatedly, and you have (as usual) not even tried to answer, is WHY is this the case under your “spreading” definition? WHY is there this sudden plummet in the mixing entropy when we go from red balls with ever-so-slightly darker red balls to indistinguishable balls. You’ll notice that it follows automatically from keiths’s definition. Your definition seems sadly lacking in this regard.

    walto:

    The earlier ones may not be [question begging]. I leave them to Sal…or anybody else who is interested.

    Any one of my points is sufficient to refute dispersalism. If you can’t refute them, then why are you still a dispersalist?

  32. keiths: What Lambert et al don’t realize is that distinguishability is observer-dependent. In my thought experiment, Yolanda can distinguish between the isotopes because she has the right equipment. Xavier, who lacks that equipment, cannot.

    Yolanda sees an entropy increase, while Xavier does not. They are both correct.

    That conclusion is ENTIRELY QUESTION-BEGGING. (I’m hoping the capital letters will help that sink in.)

  33. walto: That conclusion is ENTIRELY QUESTION-BEGGING. (I’m hoping the capital letters will help that sink in.)

    Which question does it beg?

    1. Whether the particles are indistinguishable or not?

    2. Whether distinguishable particles result in an entropy increase?

    3. Whether indistinguishable particles do not result in an entropy increase?

    4. The reason that distinguishable particles result in an entropy increase while indistinguishable particles do not result in an entropy increase?

    5. Something else…

  34. walto,

    That conclusion is ENTIRELY QUESTION-BEGGING. (I’m hoping the capital letters will help that sink in.)

    Perhaps I should return the favor:

    PHYSICISTS AND CHEMISTS, INCLUDING LAMBERT, AGREE WITH ME ON THE NECESSITY OF TAKING DISTINGUISHABILITY INTO ACCOUNT WHEN COMPUTING THE ENTROPY CHANGE CAUSED BY THE MIXING OF TWO GASES.

    Now let’s hear your answer to DNA_Jock’s question:

    The question I have been asking you repeatedly, and you have (as usual) not even tried to answer, is WHY is this the case under your “spreading” definition? WHY is there this sudden plummet in the mixing entropy when we go from red balls with ever-so-slightly darker red balls to indistinguishable balls. You’ll notice that it follows automatically from keiths’s definition. Your definition seems sadly lacking in this regard.

  35. I don’t know what you’re talking about. I haven’t given any definition. I’ve simply pointed out that your #5 and #6 beg the question at issue. And they do.

    A lot of red herrings in your posts on this thread!

  36. So what can we agree on? 🙂

    Does everyone agree that we are in a maximum state of uncertainty when every outcome of an event (or series of events) is equally probable, as when tossing a fair coin, rolling a fair die, or trying to decide where the pea is?

    Increasing the number of coins doesn’t change the basics, but it does make it ever more unlikely that the distribution of heads and tails will deviate from something close to 50/50.

    I like concrete examples. Examples anyone?
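
Since Mung asked for concrete examples, here is one, offered as an illustrative sketch rather than part of the thread: the Shannon entropy of a single toss peaks when heads and tails are equally probable, and as the number of fair coins per trial grows, the observed fraction of heads crowds ever closer to 50%.

```python
import math
import random

def coin_entropy_bits(p):
    """Shannon entropy (in bits) of a coin with P(heads) = p."""
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

# Uncertainty is maximal at p = 0.5.
for p in (0.1, 0.3, 0.5, 0.7, 0.9):
    print(f"P(heads) = {p:.1f}: H = {coin_entropy_bits(p):.3f} bits")

# With more fair coins per trial, the heads fraction stays closer to 50/50.
random.seed(0)
for n in (10, 100, 10_000):
    fractions = [sum(random.random() < 0.5 for _ in range(n)) / n
                 for _ in range(1000)]
    print(f"{n:6d} coins: heads fraction ranged from "
          f"{min(fractions):.3f} to {max(fractions):.3f}")
```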

  37. In this book, we will reserve the term “entropy” for the quantity defined in thermodynamics. The quantity derived by Shannon will be referred to as Shannon’s measure of information (SMI). In Section 2.2 we will see how the thermodynamic entropy may be obtained as a special case of SMI. It should be stressed, however, that neither entropy nor SMI is information in its colloquial sense.

    – Ben-Naim, Arieh. Information, Entropy, Life and the Universe. p. 37.

  38. walto,

    I don’t know what you’re talking about. I haven’t given any definition.

    Come on, walto.

    You and Sal have been defending the dispersalist view — at least until Sal bailed out on you so courageously — and so of course DNA_Jock is talking about the dispersalist definition of entropy.

    You even quoted it yourself, just a short while ago:

    Entropy change is measured by the dispersal of energy: how much is spread out in a process, or how widely dispersed it becomes — always at a specific temperature.

    With that definition in mind — the one you quoted — how would you answer DNA_Jock?

    Let’s repeat this experiment a million times, but with balls that are dark red and light red. But in each of the experiments, we vary the hue of the balls ever so slightly.
    The entropy of mixing does not change at all, it is always 10.85 J/K, except when the balls are exactly the same color. At this one moment, the entropy of mixing suddenly jumps to ZERO.
    The question I have been asking you repeatedly, and you have (as usual) not even tried to answer, is WHY is this the case under your “spreading” definition? WHY is there this sudden plummet in the mixing entropy when we go from red balls with ever-so-slightly darker red balls to indistinguishable balls. You’ll notice that it follows automatically from keiths’s definition. Your definition seems sadly lacking in this regard.

  39. Joe,

    As promised, some comments regarding the gas-mixing case.

    Assume an experimental setup like the ones we talked about earlier in the thread, where there are two chambers of the same volume on the left and right. There is a removable partition between them. The left and right chambers contain identical numbers of gas molecules, and the temperatures of both chambers also match.

    We’ll be talking about two cases. In one, the gas in the left chamber can be distinguished from the gas in the right. In the other, the gases are the same and therefore indistinguishable.

    Here’s an illustration of the first case:

  40. Case 1:

    The gases are distinguishable, in this case via “color”. When the partition is removed, they will spontaneously begin to mix. “Blue” molecules will diffuse into the left chamber, and “red” molecules will diffuse into the right.

    You’ll end up with something like this:

  41. Has the entropy increased? Yes, and you can infer that as follows:

    There are vastly more ways of having a mixture of red and blue molecules on both sides (the “After” state), and vastly fewer ways of having all red molecules on the left and all blue molecules on the right (the “Before” state). “Ways” is just another word for “microstates” here, so the number of possible microstates increases dramatically from “Before” to “After”.

    The number of possible microstates is represented by W in the Boltzmann equation:

    S = kb ln W

    Since W gets dramatically bigger when going from Before to After, the entropy S also increases (but less dramatically, because it is a logarithm).
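
A toy-sized version of this counting argument, as a sketch of my own: the particle count below is tiny and arbitrary, and counting only which side of the container each molecule occupies is a crude stand-in for the full microstate count, but it is enough to show the ratio of “ways” and the resulting entropy increase via S = kb ln W.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K
N = 20              # molecules of each gas (arbitrary toy number)

# Before: all red molecules on the left, all blue on the right.
# In this crude left/right coarse-graining there is exactly one such assignment.
W_before = 1

# After: each of the 2N molecules can be on either side of the container.
W_after = 2 ** (2 * N)

delta_S = k_B * math.log(W_after / W_before)  # equals 2 * N * k_B * ln(2)
print(f"W_before = {W_before},  W_after = {W_after}")
print(f"delta-S = k_B ln(W_after/W_before) = {delta_S:.3e} J/K")
```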

  42. Case 2:

    The gases are indistinguishable; let’s say the molecules are all “blue”. When the partition is removed, they will spontaneously begin to mix. “Blue” molecules will diffuse into the left chamber, and “blue” molecules will diffuse into the right.

    You’ll end up with something that looks just like what you started with: blue molecules on both sides. Nothing has changed. The number of ways of having blue molecules on both sides (the Before state) is the same as the number of ways of having blue molecules on both sides (the After state) — obviously.

    Since the number of possible microstates is the same, the entropy remains unchanged.

    What made the difference? The fact that in Case 1, the gases were distinguishable, while in Case 2, they were not.

    Did the energy disperse differently? No, because the underlying physics doesn’t care about the “color” of the molecules.

    So there’s no difference in energy dispersal between Case 1 and Case 2, but there is a difference in entropy. They can’t be the same thing.

    Entropy is not a measure of energy dispersal.

    The dispersalist definition below is incorrect:

    Entropy change is measured by the dispersal of energy: how much is spread out in a process, or how widely dispersed it becomes — always at a specific temperature.
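
To put rough numbers on the contrast between Case 1 and Case 2, here is a small sketch of my own (the mole amounts are arbitrary) using the standard ideal-gas entropy-of-mixing formula for two equal-volume chambers at the same temperature and pressure. The entropy change is positive only when the gases are distinguishable; for the same gas on both sides it is zero, even though the molecular motion, and hence any “energy dispersal”, is identical in the two cases.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def mixing_entropy(n_left, n_right, distinguishable):
    """Ideal-gas entropy of mixing after removing the partition between two
    equal-volume chambers at the same temperature and pressure."""
    if not distinguishable:
        # Same gas on both sides: nothing changes macroscopically.
        return 0.0
    n_total = n_left + n_right
    x_left, x_right = n_left / n_total, n_right / n_total
    # delta-S = -n_total * R * (x_left ln x_left + x_right ln x_right)
    return -n_total * R * (x_left * math.log(x_left) + x_right * math.log(x_right))

# Case 1: 1 mol of "red" gas mixing with 1 mol of "blue" gas.
print(f"Case 1 (distinguishable):   delta-S = {mixing_entropy(1.0, 1.0, True):.2f} J/K")

# Case 2: 1 mol of "blue" gas on each side.
print(f"Case 2 (indistinguishable): delta-S = {mixing_entropy(1.0, 1.0, False):.2f} J/K")
```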

  43. walto,

    You keep claiming that I’m begging the question.

    Do you seriously doubt the correctness of the Boltzmann and Gibbs equations? Do you understand that dispersalists, including Lambert, accept those equations?

  44. Here’s a suggestion (and a challenge).

    Anyone interested in advancing understanding about entropy please submit a comment stating in everyday terms (as if explaining to a bright child) what entropy is.

  45. keiths: Entropy is not a measure of energy dispersal.

    You see, this is not helpful. Tell us what entropy is. Surely it is quicker than listing all the things that it isn’t.

  46. keiths: Do you seriously doubt the correctness of the Boltzmann and Gibbs equations? Do you understand that dispersalists, including Lambert, accept those equations?

    And why not just assume we don’t know and supply your answer?
