In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue in his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder, just as shoe size is correlated with reading ability (a small shoe size suggests we’re dealing with a toddler, and therefore a non-reader) without shoe size being reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger and Moran’s books aren’t on that list. Their books, however, did make the list of biochem books judged by some Australians as decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates

There is also the Clausius definition:

delta-S = Integral (dq/T)

where
delta-S = change in entropy
dq = inexact differential of q (heat, reversible)
T = temperature
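
As a rough numeric sketch of how the two formal definitions line up (standard ideal-gas results, added here purely for illustration), consider one mole of ideal gas expanding reversibly and isothermally from 1 to 2 cubic meters:

    import math

    R = 8.314            # gas constant, J/(K*mol)
    N_A = 6.02214076e23  # Avogadro's number, 1/mol
    k = R / N_A          # Boltzmann's constant, J/K

    n, T = 1.0, 300.0    # 1 mol of ideal gas; 300 K chosen for illustration
    V1, V2 = 1.0, 2.0    # volume doubles, m^3

    # Clausius route: delta-S = q_rev/T, with q_rev = n R T ln(V2/V1)
    # for a reversible isothermal expansion of an ideal gas
    q_rev = n * R * T * math.log(V2 / V1)
    dS_clausius = q_rev / T

    # Boltzmann route: S = k ln W; the number of position microstates available
    # to each molecule scales with V, so W2/W1 = (V2/V1)^N and delta-S = N k ln(V2/V1)
    N = n * N_A
    dS_boltzmann = N * k * math.log(V2 / V1)

    print(dS_clausius, dS_boltzmann)  # both ~5.76 J/K

Neither route says anything about order, disorder, or decay.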

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. stcordova: The microstates of a system in Boltzmann’s famous formula are defined by the set of positions and momenta of each particle. If the volume available to a species of gas is expanded, the number of position/momentum microstates increases because the number of possible positions increases.

    Why not just speak in terms of probabilities, like Boltzmann?

    “…it rapidly proceeds to the … most probable state.”

    Or do you also disagree with Boltzmann about this?

  2. DNA_jock:

    WHY, under your “dispersal of energy” definition, does the entropy of mixing depend on whether the gases can be distinguished?

    Figure it out yourself. I provided plenty of references if you’re willing to look.

    And it’s not MY “dispersal of energy” definition; it was provided by an emeritus professor of Chemistry, Frank Lambert, who is helping change chemistry textbooks, which is more than I can say for you, given the lack of expertise on the topic evidenced by your question.

    WHY, under your “dispersal of energy” definition, does the entropy of mixing depend on whether the gases can be distinguished?

    But maybe I’ll answer for the reader’s benefit lest they think DNA_jock has a point.

    The necessity of the gases being distinguishable for entropy of mixing to emerge will hold under any definition of entropy used by scientists.

    DNA_jock can’t get the basics correct even when he’s spoon-fed references from several links and pointed to notions like the mixing paradox that appear in many physics books.

  3. For the reader’s benefit, see this lecture from Georgia Tech, starting at 9:06 into the lecture on entropy of mixing. The lecturer explains the change of entropy in terms of dispersal:

    Watch it and weep Keiths. You can’t pin the notion on me that I hold some sort of deviant view of entropy that doesn’t appear in universities. Lambert is not some outcast, and neither are those teachers using the dispersal description of entropy.

    Keiths:

    Energy is not dispersed when you take volumes of two gases at the same temperature and allow them to mix, but entropy most certainly increases.

    Yes it is dispersed if you are willing to try to understand what is meant by dispersal of energy in the case of gases mixing. It is the dispersal of energy associated with each species individually, not the dispersal of energy on the whole mixed solution.

    Why don’t you watch the chemistry lecture and learn something rather than remain in a condition of persistent denial such that you have to rely on misreading and redefining the meaning of phrases by teachers of chemistry like Lambert just so you can save face.

    Read the following slide for yourself. The slide says: “kinetic energy is dispersed over a larger volume.” And DNA_jock robotically disagrees with me by backing you. Take that DNA_jock, hahaha!

  4. Sal:

    Read and weep…Watch it and weep Keiths…Why don’t you watch the chemistry lecture and learn something rather than remain in a condition of persistent denial… And DNA_jock robotically disagrees with me by backing you. Take that DNA_jock, hahaha!

    You’re only worsening your humiliation with rhetoric like this, Sal. Think of how it will feel when you finally realize that you’re wrong and that your taunts are all backfiring on you.

  5. Sal,

    You can’t pin the notion on me that I hold some sort of deviant view of entropy that doesn’t appear in universities.

    I haven’t claimed that. I’ve simply pointed out that you’re wrong.

    Lambert is not some outcast, and neither are those teachers using the dispersal description of entropy.

    I haven’t claimed that. I’ve simply pointed out that you and they are wrong.

    Read the following slide for yourself. The slide says: “kinetic energy is dispersed over a larger volume.” And DNA_jock robotically disagrees with me by backing you. Take that DNA_jock, hahaha!

    Because it’s simply not possible for someone from Georgia Tech to make a mistake?

  6. Sal:

    If Gas A is helium and Gas B is neon, does the entropy increase after they mix? Answer: Yes.

    Is there a change in average energy per unit volume in the system even though there is a change in entropy? Answer: No.

    keiths:

    Take a look at your second sentence. You are making my point for me.

    Sal:

    The additional entropy from mixing does not depend on the character of the gases; it only depends on the fact that the gases are different.

    DNA_Jock:

    Precisely. Thank you for making my point for me.

    Sal is good at shooting himself in the foot. We just need to get him to look down and notice the blood.

  7. Sal:

    Yes it is dispersed if you are willing to try to understand what is meant by dispersal of energy in the case of gases mixing. It is the dispersal of energy associated with each species individually, not the dispersal of energy on the whole mixed solution.

    That’s not what Lambert’s definition — which you’ve enthusiastically endorsed — says:

    Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world.

    There’s nothing in there about ‘species’, and for good reason: Lambert, unlike you, realizes that he needs a general definition, not one that applies solely to mixing gases.

    His mistake was to pick the wrong general definition.

  8. Keiths,

    Apparently you’re pretty resistant to learning. So here is the video lecture again on mixing entropy that uses the energy dispersal description.

    Here again is the relevant slide. You’ve not even bothered to try to criticize the numbers in the examples above nor their implications.

    Do you not see where it says, “kinetic energy is dispersed in a larger volume!”? It’s right there in the lower right with the picture of blue molecules and pink molecules. The kinetic energy of the pink molecules is dispersed because the pink molecules themselves are dispersed. Same for the blue molecules. Why is that so hard for you to comprehend?

  9. Sal,

    The kinetic energy of the pink molecules is dispersed because the pink molecules themselves are dispersed. Same for the blue molecules. Why is that so hard for you to comprehend?

    Judging by the timestamps, our posts crossed in the ether. I address the ‘species’ issue here.

  10. Sal,

    Do you not see where it says, “kinetic energy is dispersed in a larger volume!”?

    Well, if the slide says so, I guess it must be true. Slides on the Internet are never wrong, much less those from Georgia Tech.

    Pardon my chutzpah. Perhaps you should append “As seen on TV the Internet!” to your claims, so that I’ll know not to challenge them in the future.

  11. Sal,

    Here’s an easy way to see where you went wrong. You presented two thought experiments.

    In one, gas A is the same as gas B. When they mix, the entropy does not increase because the gases are indistinguishable.

    In the other, gas A is different from gas B. When they mix, the entropy does increase because the gases can be distinguished.

    Yet in both cases, the energy from the right side disperses into the left side, and vice-versa.

    Energy dispersal is happening in both cases. Entropy is increasing in only one. Therefore, entropy is not the same as energy dispersal.

    Lambert’s definition is incorrect. You backed the wrong horse.

  12. Just to hammer the point home, energy dispersal is always happening in your thought experiments. At any time t, note which molecules are in the right half of the box and which are in the left. As t increases, the molecules in the right side of the box will diffuse into the left side, and vice-versa.

    By the energy dispersal definition, entropy should always be increasing in such a system. It does not.

    What’s crucial is the distinguishability (or lack thereof) of the molecules, not the presence or absence of energy dispersal. Entropy is a measure of our lack of detailed knowledge of a system, not of the dispersal of energy.

  13. Just to triple-hammer the point home:

    To God, or a Laplacean demon, or anyone who knows the exact microstate of a system, the entropy is exactly zero.

    Why? Because for such an observer, W, the number of possible microstates, is one, and the logarithm of one is zero. The Boltzmann entropy expression k ln W therefore gives a result of zero.

    The energy dispersal, or lack thereof, is the same. Entropy is a measure of our lack of knowledge of a system’s exact microstate. It is not a measure of energy dispersal.
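
    As a quick numeric sketch of that point (W values picked arbitrarily for illustration):

        import math

        k_B = 1.380649e-23  # Boltzmann's constant, J/K

        # An observer who knows the exact microstate has W = 1; an observer who only
        # knows the macrostate may be left with an enormous number of compatible microstates.
        for W in (1, 10**20):
            print(W, k_B * math.log(W))  # S = k ln W; exactly 0.0 when W = 1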

  14. Keiths seems to be under the mistaken impression that “dispersal of energy” wasn’t a description that emerged from formal treatment of statistical mechanics and thermodynamics. He thus presumes it is a misconception.

    Here is a book on Statistical Thermodynamics by MC Gupta.
    Statistical Thermodynamics, MC Gupta

    Thus the concept of entropy is vital for thermodynamics. It is a Greek word and stands for Trope (Greek word meaning change) and En is suffixed to identify it with energy. It is a function which measures how the dispersal of energy occurs when a system changes from one state to another.

    Statistical Thermodynamics
    page 62

    The idea traces its roots to Lord Kelvin, who wrote:
    “On a Universal Tendency in Nature to the Dissipation of Mechanical Energy.”

    Dissipation? As in (ahem) dispersal? 🙂

    And from American Chemical Society (ACS), Lambert’s article:

    http://pubs.acs.org/doi/abs/10.1021/ed079p1241?journalCode=jceda8

    Qualitatively, entropy is simple. What it is, why it is useful in understanding the behavior of macro systems or of molecular systems is easy to state: Entropy increase from a macro viewpoint is a measure of the dispersal of energy from localized to spread out at a temperature T. The conventional q in qrev/T is the energy dispersed to or from a substance or a system. On a molecular basis, entropy increase means that a system changes from having fewer accessible microstates to having a larger number of accessible microstates. Fundamentally based on statistical and quantum mechanics, this approach is superior to the non-fundamental “disorder” as a descriptor of entropy change. The foregoing in no way denies the subtlety or the difficulty presented by entropy in thermodynamics—to first-year students or to professionals. However, as an aid to beginners in their quantitative study of thermodynamics, the qualitative conclusions in this article give students the advantage of a clear bird’s-eye view of why entropy increases in a wide variety of basic cases: a substance going from 0 K to T, phase change, gas expansion, mixing of ideal gases or liquids, colligative effects, and the Gibbs equation.
    ….

    “The ‘entropy of mixing’ might better be called the ‘entropy of dilution’” (7a). … Ideal gases or liquids mix spontaneously in their combined greater volumes because then their original energy is more dispersed in the new situation of more dense microstates. Their entropy increases.

    –Frank Lambert

    So the energy dispersal description applies to entropy of mixing, contrary to Keiths’ misreading of Lambert as evidenced below:

    Keiths:

    Energy is not dispersed when you take volumes of two gases at the same temperature and allow them to mix

    But that contradicts Lambert’s interpretation of his own words about energy dispersal:

    Ideal gases or liquids mix spontaneously in their combined greater volumes because then their original energy is more dispersed

    Keiths is dogmatic that his mis-interpretation of Lambert’s words should take precedence over Lambert’s interpretation of Lambert’s own words. He won’t back down until I accept Keiths’ mis-interpretation of Lambert over Lambert’s interpretation of Lambert. Too funny!

  15. Sal,

    If Lambert’s definition is correct…

    Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world.

    …then why does it give the wrong answer in the following scenario?

    Just to hammer the point home, energy dispersal is always happening in your thought experiments. At any time t, note which molecules are in the right half of the box and which are in the left. As t increases, the molecules in the right side of the box will diffuse into the left side, and vice-versa.

    By [your modification of] the energy dispersal definition, entropy should always be increasing in such a system. It does not.

    (Note the correction in square brackets. By Lambert’s definition, entropy shouldn’t increase when the gases mix, because the energy distribution remains the same before and after mixing. By your modified definition — which you haven’t acknowledged as modified — entropy should always be increasing, because the molecules in the right half of the box are always diffusing into the left half, and vice-versa. Both definitions are therefore incorrect.)

    Hint: I’m asking you for an actual counterargument. I’ve shown why your definitions are wrong: they give the wrong answers. What is your counterargument?

  16. Here is another energy spreading interpretation of mixing entropy from a Springer publication:

    http://www.cpp.edu/~hsleff/entlangint.pdf

    Mixing of Gases

    Consider two gases, each with N molecules, initially in separate but equal volumes V separated by a central partition, as shown in Fig. 4(a). The entire system has temperature T. If the partition is removed, the gases spontaneously mix together, there is no temperature change, and the entropy change is

    S = 2NkB ln 2 for distinguishable gases,

    0 for identical gases.

    For distinguishable molecules, process I yields the standard “entropy of mixing”,
    ….
    Here is an energy spreading interpretation. For distinguishable gases, energy spreads from volume V to 2V by each species in process I, accounting for S_I = 2NkB ln 2. Note that upon removal of the partition, the energy spectrum of each species becomes compressed because of the volume increase and, most important, spreads through the entire container. That is, energy states of “black” particles exist in the right side as well as the left, with a corresponding statement for “white” particles.

    Harvey S. Leff

    Gee, that sounds like what I’ve been saying all along. Leff goes on to cover the case where the mixing occurs with no change in volume.

    But Keiths insists I must use his misinterpretation of the energy dispersal description versus the interpretations of practitioners of the energy dispersal description, such as Harvey S. Leff, an emeritus professor of physics at California State Polytechnic University, Pomona.
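
    As a numeric sketch of Leff's result (assuming, for illustration, one mole of gas on each side of the partition):

        import math

        k_B = 1.380649e-23   # Boltzmann's constant, J/K
        N_A = 6.02214076e23  # molecules per mole

        N = N_A  # one mole of molecules in each initial volume V

        dS_distinguishable = 2 * N * k_B * math.log(2)  # ~11.5 J/K
        dS_identical = 0.0                              # removing the partition changes nothing

        print(dS_distinguishable, dS_identical)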

  17. stcordova: an emeritus professor of physics at California State Polytechnic University, Pomona.

    Being a professor does not make you immune from error, Sal. So comments like this are really just appeals to authority, which may or may not be justified.

  18. Sal, quoting Harvey Leff:

    S = 2NkB ln 2 for distinguishable gases,

    0 for identical gases.

    Let that sink in, Sal. Do you think the laws of physics “look” to see whether the gases are distinguishable to an observer before deciding whether and how the energy should disperse?

    The idea is ludicrous. The gas behavior, and any consequent energy dispersal (or lack thereof), is the same whether the molecules in Leff’s experiment are “black” or “white”. The only difference is that the observer can look to see where the “black” molecules are versus the “white” molecules, thus distinguishing between microstates that would be indistinguishable if the molecules were all the same “color.”

    Entropy is a measure of missing knowledge, not of energy dispersal.

  19. Here’s a thought experiment that proves the point.

    Imagine there are two observers, Yolanda and Xavier. Yolanda has the necessary equipment to distinguish between two isotopes of gas X, which we’ll call X0 and X1. Xavier lacks this equipment.

    Someone sets up a gas A/B experiment of the kind we’ve been discussing. The A chamber contains nothing but isotope X0, and the B chamber contains nothing but isotope X1. The partition is removed and the gases mix. How does the entropy change?

    Xavier, who can’t distinguish between isotopes X0 and X1, thinks the entropy doesn’t change. As far as he’s concerned, X was evenly distributed before mixing, and it’s evenly distributed after. He can’t distinguish between the before/after microstate ensembles.

    Yolanda, on the other hand, sees a massive change in entropy. There’s a huge difference in the before and after microstate ensembles: before the partition is removed, all of the X0 molecules are on one side, and all of the X1 molecules are on the other. After mixing, both isotopes are evenly distributed. This huge difference means that Yolanda sees a large entropy increase.

    Xavier sees no entropy increase. Yolanda sees a large entropy increase. Who is right?

    The answer is both.

    Yolanda possesses knowledge about the system that Xavier lacks, so the entropy increase is different for her than it is for him.

    Entropy is a measure of missing knowledge, not of energy dispersal.
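
    To put rough numbers on the thought experiment (a sketch assuming one mole of each isotope and ideal-gas behavior):

        import math

        R = 8.314  # gas constant, J/(K*mol)
        n = 1.0    # moles of each isotope

        # Yolanda can tell X0 from X1, so for her each isotope expands from V into 2V:
        dS_yolanda = 2 * n * R * math.log(2)  # ~11.5 J/K

        # Xavier cannot distinguish the isotopes, so for him the macrostate
        # ("gas X uniformly filling the container") is unchanged by removing the partition:
        dS_xavier = 0.0

        print(dS_yolanda, dS_xavier)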

  20. I love the way Sal triumphantly cites articles that he doesn’t really understand. He frequently then cites his citing of some article (or his use of a word!!) as evidence that he understands the subject.
    Sal, I suspect that Leff and Lambert probably understand that the dispersal metaphor depends on an assumption about the universe we live in, which assumption may or may not be true. That’s their get-out-of-jail card. Would you care to describe, in your own words, that get-out-of-jail card?
    As the Leff article you quote notes (did you read to the end?)

    The scope of this article is highly limited, being directed primarily at developing the spreading metaphor as an interpretive tool. It is by no means clear whether the spreading concept can be developed more fully mathematically, for example, for systems not in thermodynamic equilibrium. Were that possible, one might hope for a way to address non-equilibrium entropy. Similarly it is not clear if the spreading concept can be usefully extended to non-extensive systems characterized by long-range gravitational and/or electric forces.

    keiths’s definition is far more robust.

  21. keiths’s definition is far more robust.

    LOL! The most robust definition was provided in the OP by me:

    S = k ln W (Boltzmann/ Planck)

    and

    delta-S = Integral (dq/T) (Clausius)

    where
    S = entropy
    k = Boltzmann’s constant (or some other constant depending on units used)
    W = number of energy or position/momentum microstates
    dq = inexact differential change in heat (reversible)
    T = temperature

    So where is Keiths’ definition? He quotes someone else. I’d hardly say that is more robust than the two widely accepted versions quoted in the OP.

    At issue is the pedagogical description used to convey the meaning of those two formal definitions (by Boltzmann/Planck and Clausius).

    But your attempts to put me down aren’t about discussing effective ways Chemistry and Physics teachers can teach the meaning of the Boltzmann and Clausius equations. It’s about disagreeing with me at all costs.

    I provided several University level references, samples of university lectures and teaching materials, pedagogical articles that were published by the American Chemical Society.

    But noooooo… it’s not about the ideas of Lambert, Evans, Gupta, and Leff; it’s the fact that Sal likes those ideas.

    This has to be opposed at all costs. Sal has to be lectured and told he doesn’t understand, that he’s clueless. Doesn’t matter that Sal studied graduate level statistical mechanics and thermodynamics and Shannon’s theorems of information theory. Sal probably has more depth of knowledge than Keiths and DNA_jock on these matters, but that doesn’t matter. Sal has to pay Keiths and DNA_jock obeisance. Sal has to learn his physics and information theory from Keiths and DNA_Jock.

    I backed up my claims with citations from a paper from the American Chemical Society, a textbook on Statistical Thermodynamics, a paper by Leff, a chemistry lecture video from Georgia Tech, and last but not least a paper with similar notions by Lord Kelvin himself.

    In contrast, DNA_Jock and Keiths have only referenced themselves and their misunderstandings. Too funny.

    The gospel according to Keiths:

    Keiths:

    Energy is not dispersed when you take volumes of two gases at the same temperature and allow them to mix

    Epic Fail!

    Look again at the slide from a Chemistry lecture. On the lower left it says, “kinetic energy is concentrated in one compartment or the other!” before mixing and on the lower right “kinetic energy is dispersed over a larger volume” after mixing.

    This makes sense when one tracks the energy of each distinguishable species as I showed in my example calculations. But you guys just ignore stuff which is plain as day if one is willing to consider words and phrases more charitably rather than being determined to misread and misunderstand.

  22. From this chemistry textbook:
    Chemistry and Chemical Reactivity
    by John C. Kotz, Paul M. Treichel, John Townsend

    The relevant pages are freely available here

    When the gas expands to fill a larger container, the average energy of the sample and energy for the particles in a given energy range are constant. However quantum mechnaics shows (for now, you will have to take our word for it) that as a consequence of having a larger volume in which the molecules can move in the expanded staet, there is an increase in the number of microstates and that those microstates are even more closely spaced than before (Figure 19.7). The result of this greater density of microstates is that the number of microstates available to the gas particles increases when the gas expands. Gas expansion, a dispersal of matter leads to a dispersal of energy over a larger number of more closely spaced microstates and thus to an increase in entropy.

    The logic applied to the expansion of a gas into a vacuum can be used to rationalize the mixing of two gases, the mixing of two liquids, or the dissolution of a solid in a liquid (Figure 19.8). For example, if flasks containing O2 and N2 are connected (in an experimental setup like that in figure 19.2), the two gases diffuse together, eventually leading to a mixture in which O2 and N2 molecules are evenly distributed throughout the total volume. A mixture of O2 and N2 will never separate into samples of each component of its own accord. The gases spontaneously move toward a situation in which each gas and its energy are maximally dispersed. The energy of the system is dispersed over a larger number of microstates, and the entropy of the system increases.

    In other words, epic fail for DNA_jock buying into Keiths’ misreadings.
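
    The quantum-mechanical claim that the textbook asks readers to take on faith can be sketched with the standard particle-in-a-box energy levels (added here for illustration):

    E_n = n^2 h^2 / (8 m L^2)

    where
    n = quantum number (1, 2, 3, …)
    h = Planck's constant
    m = mass of the molecule
    L = length of the box

    The spacing between adjacent levels shrinks as 1/L^2 when the box is enlarged, so at a fixed total energy there are more, and more closely spaced, accessible translational states in the larger volume, which is the "greater density of microstates" the textbook describes.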

  23. stcordova: Doesn’t matter that Sal studied graduate level statistical mechanics and thermodynamics and Shannon’s theorems of information theory.

    Sad, but true, evidently.
    He still feels the need to cite introductory chemistry texts which say

    However quantum mechnaics shows (for now, you will have to take our word for it) that as a consequence of having a larger volume in which the molecules can move in the expanded staet[sic], …

    😮

    because he cannot explain anything in his own words.
    WHY (under your definitions) does it matter whether the gases are distinguishable or not? All you have got is a post-hoc rule to rescue your definition. Explain WHY Lambert et al. are willing to rely on this get-out-of-jail card, what assumptions are involved, and I’ll grant that you learnt something from your “graduate level” classes. OTOH, you’d also be a lot closer to realizing why keiths’s is right, so you might want to declare victory and leave.

  24. Keiths:

    Energy is not dispersed when you take volumes of two gases at the same temperature and allow them to mix, but entropy most certainly increases.

    and

    DNA_Jock:

    Keiths is correct

    Contrast these statements with what is written in
    Chemistry and Chemical Reactivity by John C. Kotz, Paul M. Treichel, John Townsend.

    The gases spontaneously move toward a situation in which each gas and its energy are maximally dispersed. The energy of the system is dispersed over a larger number of microstates, and the entropy of the system increases.

    Translation: epic fail for DNA_Jock.

    Hey DNA_Jock, since you represent yourself as such an authority, consider the example of 1 mole of monoatomic ideal gas in 1 cubic meter at 300 K, like in the top of the diagram below.

    Tell the readers the internal energy in Joules.

    Now let the monatomic gas be isothermally expanded to 2 cubic meters by opening the valve such as in the bottom of the diagram below.

    Is the energy more dispersed as a result of the isothermal expansion? Hahaha!

    That describes what happens to each species of substance when there is mixing, as John C. Kotz, Paul M. Treichel, and John Townsend said:

    The logic applied to the expansion of a gas into a vacuum can be used to rationalize the mixing of two gases, the mixing of two liquids,

    Keep posting comments DNA_jock, I’m enjoying this. Your skills as a critical thinker and scientist are a bit rusty. Here’s your opportunity for some remedial training.

  25. Sal,

    Let’s review:

    1. You endorsed Lambert’s definition of entropy as a measure of energy dispersal. That was wrong, because the energy distribution does not change in the mixing cases I described.

    2. Realizing your error, though not admitting it, you amended the definition to take account of species:

    It is the dispersal of energy associated with each species individually, not the dispersal of energy on the whole mixed solution.

    Unfortunately, your amended definition is also wrong, because it only applies to mixing cases. Entropy increases are not limited to mixing cases, obviously. You can have an entropy increase without “dispersal of species.”

    Lambert was smart enough to avoid that mistake, which is why you don’t see “species” mentioned in his definition of entropy.

    Also, your definition fails in the isotope-mixing case I described here.

    Your two definitions have failed, but you haven’t come up with a single case in which entropy cannot be seen as a measure of missing knowledge regarding a system’s microstate.

    Apart from the protection of your fragile ego, what reason can you offer for sticking to definitions that are known to be bad rather than accepting one that works?

  26. Sal:

    This has to be opposed at all costs. Sal has to be lectured and told he doesn’t understand, that he’s clueless. Doesn’t matter that Sal studied graduate level statistical mechanics and thermodynamics and Shannon’s theorems of information theory. Sal probably has more depth of knowledge than Keiths and DNA_jock on these matters, but that doesn’t matter. Sal has to pay Keiths and DNA_jock obeisance. Sal has to learn his physics and information theory from Keiths and DNA_Jock.

    LOL. Listen to yourself, Sal. Your fragile ego is interfering with the learning process, as it so often does.

    If your “graduate-level” “depth of knowledge” is so formidable, then you should easily be able to put me and DNA_Jock to shame.

    Get to it. Tell us, in your own deep, graduate-level words, what’s wrong with the arguments I’ve made here, here, and here.

    No more dodging. No more “But, I found a slide on the Internet that says so!” business.

    Let’s hear a counterargument. Preferably deep and graduate-level, as befits a man of your erudition.

  27. stcordova: Now let the monatomic gas be isothermally expanded to 2 cubic meters by opening the valve such as in the bottom of the diagram below.

    Is the energy more dispersed as a result of the isothermal expansion? Hahaha!

    That’s a fun example. But tell me, Sal, is the energy in the rest of the universe less dispersed as a result of this isothermal expansion? How can you tell?

    And in the case that we were actually discussing, WHY does the distinguishability of the gas in the other compartment affect the answer? [Cue Sal’s response: “Because these really reputable people told me it does” LOL]

  28. stcordova: Watch it and weep Keiths. You can’t pin the notion on me that I hold some sort of deviant view of entropy that doesn’t appear in universities.

    Right! Entropy is a measure of disorder (or is it a measure of order – I can never keep that straight), and that is taught in universities everywhere.

    Take that keiths! (and DNA-jock)

  29. Entropy Change Upon Mixing

    A few questions:

    1. The entropy change, upon mixing, is greatest when Xa = Xb = 0.5.

    Is that the equilibrium state?

    2. Between Xa = 0.0 and Xa = 0.5 deltaS increases logarithmically with Xa.

    Is it also the case that:

    Between Xb = 0.0 and Xb = 0.5 deltaS increases logarithmically with Xb?

    And last, how is this not describable using Shannon’s Measure of Information (SMI)? Is the probability distribution unknown?
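
    For reference, here is a small numeric sketch of the quantity being asked about (assuming ideal mixing and one mole of gas in total), which also shows the Shannon connection:

        import math

        R = 8.314  # gas constant, J/(K*mol)

        def dS_mix(x_a, n_total=1.0):
            """Ideal entropy of mixing: delta-S = -n R (x_a ln x_a + x_b ln x_b)."""
            x_b = 1.0 - x_a
            terms = sum(x * math.log(x) for x in (x_a, x_b) if x > 0)
            return -n_total * R * terms

        for x_a in (0.1, 0.25, 0.5, 0.75, 0.9):
            print(x_a, round(dS_mix(x_a), 3))

        # The curve is symmetric in x_a and x_b and peaks at x_a = x_b = 0.5, where
        # dS_mix = n R ln 2 ~ 5.76 J/K. Dividing by n R gives exactly the Shannon
        # measure -sum(p ln p) for the two-outcome distribution (x_a, x_b).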

  30. Interesting. Each side has presented a case, and I am insufficiently expert on entropy to judge.

    Keiths and Dna_jock: are you also saying that a messy desk has more entropy than a neatly arranged desk?

  31. keiths: Lambert, unlike you, realizes that he needs a general definition, not one that applies solely to mixing gases.

    Bingo. There are numerous definitions of entropy.

    Does the meaning of entropy change depending on the circumstance? What, then, of the Second Law of Thermodynamics?

    There is in fact a way to unify the meanings of entropy.

  32. keiths: Pardon my chutzpah. Perhaps you should append “As seen on TV the Internet!” to your claims, so that I’ll know not to challenge them in the future.

    Pardon my chutzpah. Perhaps you should append “As asserted by keiths!” to your claims, so that I’ll know not to challenge them in the future. 😉

  33. Joe,

    Keiths and Dna_jock: are you also saying that a messy desk has more entropy than a neatly arranged desk?

    No, we’re rejecting both the ‘entropy as disorder’ and ‘entropy as energy dispersal’ interpretations in favor of the ‘entropy as missing knowledge’ interpretation.

    That has the interesting consequence of making entropy an observer-dependent quantity in some cases, such as the isotope-mixing case I described here.

  34. stcordova: Keiths seems to be under the mistaken impression that “dispersal of energy” wasn’t a description that emerged from formal treatment of statistical mechanics and thermodynamics.

    keiths is directly challenging the claim that Salvador made in the OP [endorsing “Dr. Mike”] that entropy and information are unrelated.

    People like kairosfocus and myself have tried to explain why Sal is wrong, but consider the sources. 😉

    Therefore, Salvador must be right.

  35. I was originally taught that entropy was a measure of disorder and that it was an objective quantity (i.e. the same for all observers).

    The first of those was easier to unlearn than the second.

  36. So I take a new deck of cards, which of course is in suit and number order, and shuffle it, without being able to see the faces afterwards. At that point does it have higher entropy than it did?

  37. keiths: Yolanda possesses knowledge about the system that Xavier lacks, so the entropy increase is different for her than it is for him.

    🙂

    But surely there are different physical laws and physical forces in play here!

    Sal quoting Leff:

    “Here is an energy spreading interpretation.”

    There are in fact different interpretations of entropy. Sal is right about that.

  38. Poor Sal, beleaguered on all sides. Even by other theists. Even by other IDists. And after he’d come so far in his understanding of entropy. Entropy is not disorder. YAY!

    S = k ln W (Boltzmann/Planck)

    and

    delta-S = Integral ( dq /T) (Clausius)

    Which of these tell us what entropy is, the meaning of entropy?

  39. stcordova: Doesn’t matter that Sal studied graduate level statistical mechanics and thermodynamics and Shannon’s theorems of information theory.

    It matters. It matters that you claim to have studied at the graduate level in all these and you still can’t see [or deny] the connection between statistical mechanics, thermodynamics, and Shannon’s theorems of information theory when it comes to entropy. (I guess Sal is no von Neumann though.)

    If you understand all these, please demonstrate your understanding of the information theory approach to entropy in statistical mechanics and thermodynamics.

    Want some sources?

  40. Sal has me on ignore, but that’s ok, because much of that traces back to my disagreements with him about entropy over at UD. So I am loving this.

  41. Joe Felsenstein: So I take a new deck of cards, which of course is in suit and number order, and shuffle it, without being able to see the faces afterwards. At that point does it have higher entropy than it did?

    The entropy is not in the deck. 🙂

    Take any given ordering of the deck. How many yes/no questions must you ask to find a specific card?
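
    Putting a number on the yes/no-question framing (a quick illustration):

        import math

        # Halving the candidate positions with each question, locating one specific
        # card in an unknown 52-card ordering takes about log2(52) yes/no questions.
        print(math.log2(52))  # ~5.7, so 6 questions suffice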

  42. Now for DNA_jock’s remedial lesson.

    There is such a thing as a molecule. For a species of gas that we approximate as ideal, we can treat each molecule of the species as the same (apart from slight differences due to atomic isotopes).

    Each molecule has an associated kinetic energy.

    For monoatomic gases, the kinetic energy of each molecule is nicely approximated as follows:

    KE = 1/2 m v^2

    where
    m = mass of the molecule
    v = translational velocity in space

    Now if we have a mole of these molecules, there are 6.022140857(74)×10^23 molecules.

    Each of these molecules has an associated kinetic energy.

    There is an actual and also an average kinetic energy for each molecule in this mole of monoatomic gas. The convention is to express the average kinetic energy for each molecule this way:

    KE_average = 1/2 m v_average^2 = 3/2 k T

    where
    k = Boltzmann’s constant
    T = temperature in Kelvin

    For a mole of monoatomic ideal gas we multiply the number of molecules by the average kinetic energy of each monoatomic molecule to get the total kinetic energy.

    KE_total = N_A (3/2) k T = (3/2) N_A k T

    where
    N_A is Avogadro’s number

    It should be noted

    N_A k = R
    where R is the gas constant

    So the total kinetic energy for a mole of monoatomic ideal gas is:

    KE_total = N_A (3/2) k T = (3/2) N_A k T = (3/2) R T

    we could generalize this for monoatomic gases to:

    KE_total = (3/2) n R T

    where
    n = number of moles of gas.

    The formula is a little different for diatomic gases and other classes of ideal gases.

    If the potential energy in the system is of no experimental consequence, we can neglect it (as will usually be the case), and the internal energy of the mole of gas will be the total amount of kinetic energy residing in the gas. We designate the internal energy as U. Thus we can approximate

    U ~= KE_total

    For an ideal monoatomic gas:

    U = 3/2 n RT

    Independent of which interpretive, qualitative definition of entropy one is using (Lambert’s, Sewell’s, whomever), the above holds.

    The internal energy for 1 mole of monoatomic ideal gas at 300K is

    (3/2) (1 mol) (8.314 J/K/mol) (300 K) = 3741.3 J

    Surprisingly, this amount of kinetic energy is independent of volume or pressure!

    We can thus put all these particles, and the kinetic energies residing in them, into a container such as the one depicted below.

    Let us call this species the “pink molecule” species, represented by pink spheres in the diagram below.

    The particles (and the energy that resides in them) are first concentrated in the left chamber of the system (top of the diagram) before the valve is opened. Let the right chamber be a vacuum before the valve is opened.

    After the valve is opened, the particles (and the energy that resides in them) are dispersed to fill both chambers. This illustrates how both the particles and the energy residing in them are dispersed over a larger volume even though the internal energy of 3741.3 J is the same before and after the opening of the valve.

    Apparently Keiths and DNA_jock can’t come to terms with the fact that the pink molecules (and the energy associated with them) become dispersed after the valve is opened. Hahaha!

    Lambert, Evans, Gupta, Leff, Kotz, Treichel, Townsend, and just about anyone with a modest understanding of these things relates that this energy dispersal will apply in the case of mixing of two species.

    The logic applied to the expansion of a gas into a vacuum can be used to rationalize the mixing of two gases, the mixing of two liquids,

    — Kotz, Treichel, Townsend

    Just about everyone can see this is energy dispersal independently of how they qualitatively describe entropy. The only guys who refuse to see it are Keiths and DNA_jock.
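
    Here are the numbers from the pink-molecule example, together with the standard ideal-gas entropy change for the same expansion (added for illustration):

        import math

        R = 8.314          # gas constant, J/(K*mol)
        n, T = 1.0, 300.0  # 1 mol of monoatomic ideal gas at 300 K
        V1, V2 = 1.0, 2.0  # cubic meters, before and after the valve is opened

        U = 1.5 * n * R * T             # ~3741.3 J, unchanged by the expansion
        dS = n * R * math.log(V2 / V1)  # ~5.76 J/K increase when the volume doubles

        print(round(U, 1), round(dS, 2))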

  43. In light of my previous comment with the pink molecules, instead of a vacuum in the right chamber (1 cubic meter volume) we can put blue molecules at the same temperature.

    After the barrier between the two gases is opened, the pink molecules (and the energy in them) spread out to fill both chambers, and likewise the blue molecules (and the energy in them) spread out to fill both chambers.

    Even though the pink and blue molecules may elastically collide with each other, there is no net change in the total internal energy of all the pink molecules or of all the blue molecules respectively. This is a consequence of Joule’s law:

    The internal energy of a fixed mass of an ideal gas depends only on its temperature (not pressure or volume).

    If the gases started out at the same temperature, then even if the pink and blue molecules collide, their respective internal energies remain the same, and effectively we have the spreading of the energy residing in the pink molecules and the spreading of the energy residing in the blue molecules, as depicted in this lecture slide, which Keiths and DNA_jock seem completely unable to comprehend and which refutes Keiths’ misinterpretation, expressed as follows:

    Keiths:

    Energy is not dispersed when you take volumes of two gases at the same temperature and allow them to mix, but entropy most certainly increases.

    As I demonstrated, that’s a strawman misinterpretation of what it means to disperse energy in the case of mixing of two gases. Keiths is wrong, and DNA_jock keeps drinking Keiths’ Kool-Aid. HAHAHA!

  44. Joe Felsenstein:
    Here is Lambert’s position on the shuffled cards / messy desk issue. (I’m not taking a position.)

    That was really interesting. I’m wondering to what extent the argument in this thread has resulted from the subtle shift in the meaning of the term that Lambert has outlined.

  45. stcordova: Just about everyone can see this is energy dispersal independently of how they qualitatively describe entropy. The only guys who refuse to see it are Keiths and DNA_jock.

    What does the probability distribution look like, Sal?

    If you know the probability distribution, then the Shannon measure of information can be applied, because it can be applied to any probability distribution. keiths and DNA_Jock are right. All that remains is how you can be convinced.

  46. Joe,

    So I take a new deck of cards, which of course is in suit and number order, and shuffle it, without being able to see the faces afterwards. At that point does it have higher entropy than it did?

    It depends on the kind of entropy you’re talking about.

    If you’re talking about thermodynamic entropy, as we have been so far in the thread, then I would say that there is only a slight difference between before and after in terms of entropy — assuming that the cards are mechanically unaffected by the shuffling and that their temperature remains unchanged. With those stipulations, each of the cards individually has the same thermodynamic entropy as before, but the deck as a whole has slightly higher entropy because you know less than you did before about its arrangement. That is, you know less about the microstate of the whole deck than you did before, even though your (lack of) knowledge of the individual cards’ microstates remains the same.

    On the other hand, you can define an entropy in which you ignore the thermodynamic details and instead just concentrate on the “logical” picture. In that case, the microstate of the deck is just the exact sequence in which the cards are ordered at any given time, and the entropy is just the logarithm of the number of microstates that are compatible with what you know about the arrangement of the deck.

    Under that definition of entropy, the deck has zero entropy at first, because you know the microstate exactly: the deck is in suit and number order, and there are no other possible microstates. With only one possible microstate, the entropy is zero, because the logarithm of one is zero.

    After a thorough random shuffle, you’ve lost all the information you had about the ordering of the deck. Now, as far as you are concerned, any of the 52 factorial orderings could be the actual ordering, so the entropy is log(52!).
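
    For concreteness, the numbers for a thoroughly shuffled deck (a quick sketch):

        import math

        W = math.factorial(52)  # number of possible orderings of the deck
        print(math.log(W))      # ~156.4 in natural-log units
        print(math.log2(W))     # ~225.6 bits of missing information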
