In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.

entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue with his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder, just as shoe size is correlated with reading ability (a small shoe size usually means a toddler, and hence a non-reader) without being the same thing as reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or one of the founders of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So, in slight defense of Granville Sewell and the numerous other creationists (from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus) who equate entropy with disorder: when we ask where they got those distorted notions of entropy, we need look no further than many biochemists and textbook authors from reputable universities, and, unfortunately, one of the founders of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger’s and Moran’s books aren’t on that list. Their books did, however, make the list of biochemistry books judged by some Australian researchers to be decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy, as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates

Also, there is the Clausius definition:

delta-S = Integral (dq/T)

where
delta-S = change in entropy
dq = inexact differential of q (heat)

As Dr. Mike asked rhetorically: “Where are the order/disorder and ‘information’ in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.
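
To see the two definitions working together, here is a minimal Python sketch (my own illustration, with assumed round-number inputs, not taken from any of the sources above). The Clausius route gives delta-S = C ln(T_final/T_initial) for heating an object of constant heat capacity, and the Boltzmann relation then tells us the factor by which the number of microstates grows: ln(W_final/W_initial) = delta-S / k. Energy dispersal in, microstate count out; no mention of disorder anywhere.

import math

# Physical constant and assumed inputs (illustrative values only)
k_B = 1.380649e-23   # Boltzmann's constant, J/K
C   = 385.0          # heat capacity of ~1 kg of copper, J/K (approx. 0.385 J/(g*K) * 1000 g)
T_i = 298.0          # initial temperature, K
T_f = 373.0          # final temperature, K

# Clausius: delta-S = integral of dq/T = integral of C dT / T = C ln(T_f/T_i)
delta_S = C * math.log(T_f / T_i)

# Boltzmann (as written by Planck): S = k ln W, so
# delta-S = k ln(W_f/W_i), hence ln(W_f/W_i) = delta-S / k
ln_W_ratio = delta_S / k_B

print(f"delta-S = {delta_S:.1f} J/K")
print(f"ln(W_final/W_initial) = {ln_W_ratio:.3e}")  # ~6e24: an astronomically larger microstate count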

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. keiths:
    It’s amusing to watch walto and Sal digging their hole deeper while relying on bogus arguments from authority.

    Right.

    Another pathetic non-response from the master of the ad hom. A guy with a view inconsistent with every chemistry textbook ever written but not a single reason for it (except, I guess, that I’m grumpy).

    I suggest you continue to be amused. It’s the only positive you’ve got going for you on this thread.

  2. stcordova: If a hot brick and a cold brick are in contact, the heat from the hot brick spreads into the cold brick. This spreading of heat (energy) describes the increase in entropy. Simple!

    Personally, I think it is the cold (lack of energy) that is spreading from the cold brick to the hot brick. But then, I’ve always been a “glass half full” kinda guy.

  3. Mung: Personally, I think it is the cold (lack of energy) that is spreading from the cold brick to the hot brick. But then, I’ve always been a “glass half full” kinda guy.

    Well, according to keiths, everybody’s right.

  4. Elizabeth never did manage to get Shannon information right. So appeals to Elizabeth don’t do much for me.

  5. It’s probably no secret by now that I rely heavily on Arieh Ben-Naim. I’ve tried for years to get Salvador to have a look at what Ben-Naim has to say about information and entropy.

    SMI – Shannon’s Measure of Information

    (1) Information theory is not part of thermodynamics. The laws of thermodynamics do not govern information.

    (2) SMI is not information, but a specific measure of a specific kind of information.

    (3) SMI is a very general concept; it is certainly not entropy.

    (4) Entropy is a particular case of SMI. It is defined on a particular set of probability distributions of a thermodynamic system at equilibrium (i.e., the distributions that maximize the relevant SMI). In this sense entropy is a very special kind of measure of information defined for a very special set of distribution functions.

    – Ben-Naim, Arieh. Information, Entropy, Life and the Universe. p. 271.

  6. walto: As shown above, Elizabeth had no more patience for keiths’ ‘ignorance view’ than do Elzinga, Lambert, or the authors of all the chemistry textbooks.

    You must do calculations based on the observed data.

    Suppose instead of colors we had left and right handed isomers. And only one person had access to a method for distinguishing them?

  7. walto writes, regarding the brick example:

    I guess that’s just MIT joining the University of Wisconsin in agreeing that entropy is a quantifiable non-observer-relative quantity. But what the hell do they know? After all, keiths has got at least one Calvinist on his side!

    It’s still observer-dependent. The macrostates are specified in terms of macroscopic variables like the temperatures of the two bricks. Nothing more is known about the microstates.

    To an observer who knew more than that, the macrostate would be “tighter”; that is, it would correspond to a smaller ensemble of epistemically possible microstates. For such an observer, the entropy would be lower.

    For Damon the Demon, who knows the exact microstate, the entropy is zero.

    Now, in most situations a human observer doesn’t know anything more than the value of the macroscopic thermodynamic variables like temperature, so that observer’s macrostate will be specified in those terms. Other human observers will do the same.

    They get the same answer — the same entropy value — not because that answer is the single correct one. It isn’t. They get the same value because they use the same macrostate.

    Entropy is a function of the macrostate. Different observers can use different valid macrostates, so the entropies they measure can differ.

    In practice (in the brick example, for instance) different observers will calculate the same entropy because they use the same macrostate.
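
One way to put numbers on the macrostate-dependence claim is the isotope scenario mentioned elsewhere in this thread (the Xavier and Yolanda example). The following is a minimal Python sketch, not something any commenter posted: it assumes two ideal-gas samples of different isotopes at the same temperature and pressure, separated by a partition that is then removed, and it uses the standard ideal entropy-of-mixing formula. An observer whose macrostate distinguishes the isotopes books a mixing entropy; an observer whose macrostate does not distinguish them books none.

import math

R = 8.314            # gas constant, J/(mol*K)
n = 1.0              # assumed: 1 mole of gas in total
x1, x2 = 0.5, 0.5    # assumed mole fractions of the two isotopes

# Yolanda's macrostate distinguishes the isotopes, so removing the partition
# counts as mixing and adds the ideal entropy of mixing:
dS_yolanda = -n * R * (x1 * math.log(x1) + x2 * math.log(x2))  # ~5.76 J/K

# Xavier's macrostate does not distinguish the isotopes, so for him the same
# process is just "gas expanding into gas of the same kind": no mixing term.
dS_xavier = 0.0

print(f"Yolanda's entropy change: {dS_yolanda:.2f} J/K")
print(f"Xavier's entropy change:  {dS_xavier:.2f} J/K")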

  8. petrushka: You must do calculations based on the observed data.

    Yes, and you must be careful not to confuse what you have been able to ‘take in’ with the whole story of what’s out there. We are limited observers.

  9. keiths: To an observer who knew more than that, the macrostate would be “tighter”

    As I’ve said, you can repeat this another hundred times if you like. What you CAN’T seem to do is give a reason for thinking it’s true. On any normal view, this different estimate means it would APPEAR tighter–not that it IS. The entropy is whatever it is, regardless of how it appears to this or that observer.

    keiths: For Damon the Demon, who knows the exact microstate, the entropy is zero.

    Again, this is the reductio of your position. Your Laplacian genius can’t tell the entropy of any state from any other. It makes the entire concept useless. Alive or dead, everything has the same entropy if we’re smart enough. In reality, beliefs aren’t things, they are ABOUT things. The things are there (or not) anyhow.

    keiths: Now, in most situations a human observer doesn’t know anything more than the value of the macroscopic thermodynamic variables like temperature, so that observer’s macrostate will be specified in those terms. Other human observers will do the same.

    They get the same answer — the same entropy value — not because that answer is the single correct one. It isn’t. They get the same value because they use the same macrostate.

    The thing is, they DON’T always get the same value–as in your own Yolanda and Xavier example and in my examples where different equipment is used for measuring temperature or mass. You make everyone correct even when they disagree. It’s silly, really.

  10. keiths:

    When Granville talks about skyscrapers arising on once-barren planets and houses being assembled by tornadoes, he’s talking about macrostates, not microstates.

    fifth:

    Is there some logical reason why your concept is limited to the micro versus macro?

    It’s the other way around. Entropy is a function of the macrostate, not the microstate.

    When I said that Granville is talking about macrostates, I wasn’t criticizing him for that (though he makes plenty of other errors). I was pointing out that your characterization of him was wrong:

    3) Granville Sewell might very well be correct If he can show that he is talking about an actual and not a theoretical micro state

  11. Following up on the energy spreading metaphor….

    Imagine energy is like butter; spreading energy is like spreading butter. The more butter spread on a piece of bread, the more entropy (figuratively speaking).

    Here is an even simpler example of spreading heat energy (on bread, so to speak). Spread some extra heat energy on a bunch of copper coins and watch the entropy go up! The following is the sort of calculation that would be within the abilities of a typical undergrad in chemistry, physics, or engineering.

    adapted from:

    2LOT and ID entropy calculations (editorial corrections welcome)

    PROBLEM:
    Suppose we have 500 copper pennies at 298K (about room temperature) and we spread some heat on the pennies to raise their temperature to 373K. What is the change in entropy of the pennies?

    SOLUTION:
    Mass of a copper penny: 3.11 grams

    Specific heat of copper: 0.39 J/(gram·K)

    Heat capacity C of 500 copper pennies is:

    C = 0.39 J/(gram·K) * 500 pennies * 3.11 grams/penny = 606 J/K

    T_initial = 298 K
    T_final = 373 K

    To calculate the change in entropy I used the formulas from:
    http://web.mit.edu/16.unified/www/FALL/thermodynamics/notes/node41.html

    Change in entropy:

    delta-S_thermodynamic = C ln(T_final/T_initial) = 606 J/K * ln(373/298) = 136.13 J/K

    SIMPLE!

    Spread some heat, watch the temperature change, and, using standard tables of heat capacity, calculate the entropy with a formula worked out by physicists. Just plug the numbers into the right places in the right equation.

    Now, isn’t that a lot easier than saying, “let’s calculate the increase of ignorance in our minds as temperature is increased in order to calculate entropy”?
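
A quick numerical check of the penny calculation above, as a minimal Python sketch using the same assumed values for penny mass and specific heat:

import math

# Assumed values from the comment above
specific_heat = 0.39       # J/(g*K), copper
mass_per_penny = 3.11      # g
n_pennies = 500
T_initial, T_final = 298.0, 373.0   # K

# Total heat capacity of the pennies
C = specific_heat * mass_per_penny * n_pennies       # ~606 J/K

# Clausius, with dq = C dT:  delta-S = integral of C dT / T = C ln(T_final/T_initial)
delta_S = C * math.log(T_final / T_initial)

print(f"C = {C:.0f} J/K")
print(f"delta-S = {delta_S:.1f} J/K")   # ~136 J/K, matching the figure above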

  12. OK, as I’m bored with this and have reached the point of repetition myself, I will sum up. keiths thinks the concept of entropy ought to be redefined as an observer-relative property. On that view each person’s measurement of entropy will be correct, so long as no positive mistake has been made. I call this a redefinition, because it is clearly not how entropy is defined in chemistry texts, university websites, Lambert’s papers, etc.

    Now I have no particular problem with the definition of “k-Entropy” as a quantification of ignorance. I don’t know what use it would have, and keiths has provided none, but if there is indeed a use for it, there is no harm in having the term available–as long as it’s clear that that’s not what anybody else means by “entropy.”

    keiths likes to do this: he has his own meanings of “know,” “good,” and who knows what all else. If he wants to put together a keiths language, fine. Is it better? More useful in some way? I think that remains to be seen. But in any case, he should use more asterisks so people will always understand what the hell he’s talking about and equivocations won’t multiply.

  13. Ok Keiths,

    For the benefit of the readers, show the increase in entropy of 500 copper pennies from 298K to 373K by showing the increase in ignorance of the observer.

  14. stcordova,

    Hi Sal
    Do you agree with this? If so, is Boltzmann’s equation a measure of information?

    An additional source of confusion to anyone outside of chemistry or physics is due to brilliant but thoughtless mathematicians. They decided that it would be amusing to name a very important new mathematical function in communications “entropy” because “no one knows what [thermodynamic] entropy really is, so in a debate you will always have the advantage”. (That quote is of John von Neumann speaking to Claude Shannon; Sci. Am. 1971, 225, 180.) For the past half-century this has turned out to be a cruel practical joke imposed on generations of scientists and non-scientists because many authors have completely mixed up information “entropy” and thermodynamic entropy. They are not the same! From the 1860s until now, in physics and chemistry (the two sciences originating and most extensively using the concept) entropy has applied only to situations involving energy flow that can be measured as “heat” change, as is indicated by the two-word description, thermodynamic (“heat action or flow”) entropy. Only thermodynamic entropy will be dealt with in this article.

  15. colewd:

    For the past half-century this has turned out to be a cruel practical joke imposed on generations of scientists and non-scientists because many authors have completely mixed up information “entropy” and thermodynamic entropy. They are not the same!

    Yes I agree with this. I showed the differences in the following discussion with numbers worked out in a system of 500 fair coins. Short overview:

    If we let each copper coin represent a Shannon bit, the number of Shannon-entropy heads/tails bits in a 500-fair-coin system is 500 bits, but the number of thermodynamic bits is 8.636 x 10^25 — an insanely large number.

    2LOT and ID entropy calculations (editorial corrections welcome)

    If so, is Boltzmann’s equation a measure of information?

    Depends on what one means by information! This is like asking how much information is in a brick or a block of copper. If we melted the copper pennies in the above link into a block of copper, the number of thermodynamic bits would still be 8.636 x 10^25 bits. But an electrical engineer or computer scientist would probably balk at the suggestion that this number is useful for building an information-processing system like a disk drive or computer RAM.

    8.636 x 10^25 bits is a lot of bits, but not in a form that is of much use to the information technology industry; otherwise we’d be using blocks of copper for mass information storage.

    The thermodynamic bits here are merely a way of counting the thermodynamic microstates of the system. The definition of a microstate takes a day to teach; it’s not trivial. Lambert is as good as anyone at trying to explain it:
    http://entropysite.oxy.edu/microstate/

    If one can avoid computing the number of microstates to compute entropy, that would be desirable. The famous formula is

    S = kB ln W

    where

    S = entropy
    kB = Boltzmann’s constant
    W = number of microstates

    In my penny example:

    W = exp ( 8.636 x 10^25 x ln(2) )

    which is a huge number. I arrived at this number by cheating and taking short cuts, not working from first principles. How I derived it is here:

    http://creationevolutionuniversity.com/forum/viewtopic.php?f=4&t=72
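
For readers wondering how an entropy in J/K gets converted into “bits” at all: divide by kB ln 2. Here is a minimal Python sketch of that conversion; it assumes a standard molar entropy for copper of about 33.2 J/(mol·K) and a molar mass of about 63.5 g/mol, so treat the result as an order-of-magnitude check rather than an exact reproduction of the 8.636 x 10^25 figure quoted above.

import math

k_B = 1.380649e-23                     # J/K
J_PER_K_PER_BIT = k_B * math.log(2)    # one bit of entropy, expressed in J/K

# Assumed inputs: 500 pennies of 3.11 g each, copper molar mass ~63.5 g/mol,
# standard molar entropy of copper ~33.2 J/(mol*K) near room temperature.
mass = 500 * 3.11                      # g
moles = mass / 63.5
S_absolute = moles * 33.2              # J/K

print(f"S = {S_absolute:.0f} J/K")
print(f"S = {S_absolute / J_PER_K_PER_BIT:.3e} bits")  # on the order of 10^25 to 10^26 bits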

  16. The metaphor for “energy dispersal” is a nice qualitative way to conceptualize entropy.

    If we have an object and throw more heat at it, raising its temperature, we’ve increased its entropy. The tedious part is selecting the right formula off the shelf to get the right numbers, but those are details. The intuition is right. We pump (disperse, spread) more energy into the object, therefore more entropy — like spreading more butter on bread.

    We could also physically spread apart the particles in a gas at the same temperature. This is another notion of dispersal. This increases entropy. The trick is selecting the right formula to use to calculate the entropy. But those are the details left for a semester of agony in a thermodynamics and statistical mechanics class.

    But the basic intuitions are there if one uses the dispersal model: more heat or more spread out heat (at the same temperature) increases entropy. The rest are details.

  17. It may be helpful to some readers to see why there is so much hoopla surrounding entropy, especially in relation to the question of biology and biochemistry.

    There is often the question of whether a chemical reaction will happen spontaneously. Obviously, this has a lot of relevance to the Origin of Life question. Here is a tutorial diagram that shows how calculations of entropy can be used to help settle the question of which chemical reactions will happen spontaneously, and hence how feasible certain Origin of Life scenarios are:
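
The diagram itself is not reproduced here, but the relation it illustrates is the familiar Gibbs equation, delta-G = delta-H - T delta-S, with a reaction spontaneous when delta-G < 0. A minimal Python sketch with made-up reaction values, purely to illustrate the bookkeeping:

# Hypothetical reaction values, chosen only to illustrate the sign logic
delta_H = 10_000.0    # J/mol (endothermic)
delta_S = 50.0        # J/(mol*K) (entropy increases)

for T in (150.0, 298.0, 400.0):              # K
    delta_G = delta_H - T * delta_S          # Gibbs free energy change
    verdict = "spontaneous" if delta_G < 0 else "non-spontaneous"
    print(f"T = {T:.0f} K: delta-G = {delta_G:+.0f} J/mol -> {verdict}")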

  18. stcordova,

    And do those relationships hold with an observer-relative concept of entropy? I’d think that could be the case only if at least one of the other terms in that equation is observer-relative. But which one(s)? Gibbs free energy? Temperature? Who knows?

    Again, the epistemic version of entropy seems to show itself as basically useless for science.

  19. And I think that “thermodynamic bits” is something Sal manufactured, along with his calculation of “thermodynamic bits.” I bet that Larry Moran and Mike Elzinga would laugh at the idea of “thermodynamic bits.”

  20. A macrostate is the thermodynamic state of any system that is exactly characterized by measurement of the system’s properties such as P, V, T, H and number of moles of each constituent.

    http://entropysite.oxy.edu/microstate/

    In physics, a partition function describes the statistical properties of a system in thermodynamic equilibrium. Partition functions are functions of the thermodynamic state variables, such as the temperature and volume. Most of the aggregate thermodynamic variables of the system, such as the total energy, free energy, entropy, and pressure, can be expressed in terms of the partition function or its derivatives.

    https://en.wikipedia.org/wiki/Partition_function_(statistical_mechanics)
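
As an illustration of that last sentence, here is a minimal toy example (my own, not from either source quoted above): the entropy of a single two-level system computed directly from its partition function via S = k ln Z + <E>/T, where the energy gap is an assumed value.

import math

k_B = 1.380649e-23        # J/K
eps = 1.0e-21             # assumed energy gap of a two-level system, J

def entropy_two_level(T):
    """Entropy of a single two-level system from its partition function."""
    z = 1.0 + math.exp(-eps / (k_B * T))           # partition function Z
    e_avg = eps * math.exp(-eps / (k_B * T)) / z   # average energy <E>
    return k_B * math.log(z) + e_avg / T           # S = k ln Z + <E>/T

for T in (10.0, 100.0, 1000.0):                    # K
    print(f"T = {T:6.1f} K: S = {entropy_two_level(T):.3e} J/K")
# At low T the entropy approaches zero; at high T it approaches k ln 2.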

  21. stcordova: But the basic intuitions are there if one uses the dispersal model: more heat or more spread out heat (at the same temperature) increases entropy. The rest are details.

    Could someone ask Salvador if he thinks heat and energy are the same thing?

  22. Walto,

    And do those relationships hold with an observer-relative concept of entropy?

    That’s a good Keith’s question. 🙂 Heck if I know or care.

    If a chemistry professor asks me to compute the entropy change during a chemical reaction, I’m sure as heck not going to give an answer in terms of epistemic observer dependent entropy in units of Keiths ignorance.

    I tried to google “observer dependent entropy chemistry” and got maybe one hit in favor, a few hits against. Oh well.

    In contrast, I got plenty of google hits with “entropy energy spreading” to papers and essays by chemistry and physics professors.

  23. In addition to the exercise proposed by keiths for Salvador to solve, here’s another:

    Calculate the change in entropy in the expansion of an ideal gas from volume V to 2V.

    1. When the initial temperature of the gas is 20 degrees C.
    2. When the initial temperature is 100 degrees C.
    3. When the initial temperature is 500 degrees C.

    Prediction. Sal won’t.

  24. Ben-Naim:

    Clausius, among many others, noticed that many spontaneous processes always proceed in one direction. Examples are:

    1. The expansion of a gas, from V to 2V upon the removal of a partition between the two chambers.
    2. The mixing of two gases.
    3. Heat transfer from a hot to a cold body.

    Any of the processes … may be viewed as a change from one well-defined state to another well-defined state. In the first two there is no heat transfer involved, yet one can devise a process which will carry the system from the initial to the final state, and use Clausius’ formula, dS = dQ/T, to calculate the change in the entropy of the system.

    Mung: It’s not about the heat, Sal.

  25. Mung:
    In addition to the exercise proposed by keiths for Salvador to solve, here’s another:

    Calculate the change in entropy in the expansion of an ideal gas from volume V to 2V.

    1. When the initial temperature of the gas is 20 degrees C.
    2. When the initial temperature is 100 degrees C.
    3. When the initial temperature is 500 degrees C.

    Prediction. Sal won’t.

    Mung:
    Ben-Naim:

    Clausius, among many others, noticed that many spontaneous processes always proceed in one direction. Examples are:

    1. The expansion of a gas, from V to 2V upon the removal of a partition between the two chambers.
    2. The mixing of two gases.
    3. Heat transfer from a hot to a cold body.

    Any of the processes … may be viewed as a change from one well-defined state to another well-defined state. In the first two there is no heat transfer involved, yet one can devise a process which will carry the system from the initial to the final state, and use Clausius’ formula, dS = dQ/T, to calculate the change in the entropy of the system.

    Mung: It’s not about the heat, Sal.

    Just putting these up for Sal to see.

  26. Mung:

    And I think that “thermodynamic bits” is something Sal manufactured, along with his calculation of “thermodynamic bits.”

    “Thermodynamic bits” isn’t a standard term, but Sal is correct that thermodynamic entropy can be expressed in bits. The fact that it’s usually expressed in units of joules per kelvin (J/K) is an accident of history, due to the definition of the kelvin as a base unit. Had the kelvin been defined in terms of energy, joules in the denominator would have cancelled out joules in the numerator and the clunky J/K notation would be unnecessary.

    Here’s the hilarious part. By pointing out that thermodynamic entropy can be expressed in bits, Sal is torpedoing his own position and supporting mine.

    My claim is that entropy is a measure of missing information. Sal claims it is a measure of energy dispersal. What are the units of information? Bits. What are the units of energy dispersal? Joules per cubic meter.

    Sal’s units are wrong, and mine are right — according to Sal himself.

    An excellent foot-shot by the master himself. Good work, Sal. 🙂

  27. keiths:

    For Damon the Demon, who knows the exact microstate, the entropy is zero.

    walto:

    Again, this is the reductio of your position. Your Laplacian genius can’t tell the entropy of any state from any other. It makes the entire concept useless.

    It’s exactly the opposite. It shows that your position is absurd and that it renders entropy comparisons useless.

    Here’s why. According to you, Xavier and Yolanda both calculate the wrong entropy value, but Yolanda’s value is closer to being correct because she has more information than Xavier — she knows about the isotope distribution, while he doesn’t. By that reasoning, Damon’s answer is better than both Xavier’s and Yolanda’s, because he has more information than either of them. In fact, his information is perfect — he knows the exact microstate. No observer could do better. Therefore, by your logic, Damon’s entropy value — zero — is the correct value.

    Your reasoning leads to the (faulty) conclusion that entropy comparisons are useless, because the correct value is always zero. In my scheme, entropy comparisons are useful because unlike Damon, we never have perfect information about the microstate. Entropy values are nonzero for us.

    That was a pretty good foot-shot, walto, but not as good as Sal’s.

    What’s going on? Are you and Sal competing to see who can do worse at thermodynamics? 🙂

  28. Just putting these up for Sal to see.

    Since Walto posted this, in deference to Walto I’ll reply; otherwise, since Mung is on my ignore list, I wouldn’t have trifled.

    The answer is N kB ln(2) for all temperatures in this isothermal, adiabatic (free) expansion. It is obtained using the Clausius definition of entropy change, not the Keiths ignorance procedure.

    Starting with Clausius definition of entropy:

    delta S = Integral (dq/T) reversible

    where
    dq = differential amount of energy to restore the system back to initial conditions
    T = temperature of the process

    When the gas is isothermally expanded to 2V from V, let

    q = the amount of work that needs to be done to compress the gas volume from 2V back to V. Using the work energy theorem,

    q = Integral (PdV) = Integral (dq)

    thus

    dq = PdV

    thus the Clausius integral can be stated as:

    Delta-S = Integral ( PdV/T) evaluated from V to 2V

    where P is pressure, V is volume, and dV is the differential change in volume

    Using the ideal gas law PV = nRT = NkBT,

    P/T = NkB/V

    thus we can further restate the Clausius integral as

    Delta-S = Integral ( NkB dV / V) evaluated from V to 2V

    thus
    Delta-S = N kB ln(2)

    Mung didn’t specify the number of molecules or moles of the ideal gas, so N is used as a placeholder for that missing detail. Mung can fill it in himself.

    If we are dealing with 1 mole of ideal gas, let N be Avogadro’s number of molecules, and the answer is R ln(2), about 5.76 J/K. For other quantities of gas, just scale the one-mole value accordingly.

    So Mung can now show the readers how expanding gas increases Mung’s ignorance, then Mung can quantify his ignorance in bits (or whatever measure), and then convert his bits of ignorance into the right answer expressed in Joules/Kelvin.

    Mung should show his work. But Mung is just trolling. He can’t start from the Clausius integral as I did, since the Clausius integral involves a change in energy (dq), whereas Mung says we should frame the entropy change in terms of his ignorance, not in terms of the energy required to return the expanded system to its initial condition.

    So Mung, time for you to show the readers your ignorance. Keiths can do the same (really more of the same in Keiths’ case).

    there is no heat transfer involved

    But dq refers to energy that is hypothetically needed to restore the system to initial conditions, not necessarily heat transfer. You’re freaking clueless Mung.
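
A numerical check of the result above, done both ways: the Clausius/ideal-gas route used in the comment, and the microstate-counting route, which for a doubling of volume must give the same number. A minimal Python sketch, assuming 1 mole of ideal gas:

import math

R   = 8.314462            # gas constant, J/(mol*K)
N_A = 6.02214076e23       # Avogadro's number, 1/mol
k_B = R / N_A             # Boltzmann's constant, J/K
n   = 1.0                 # assumed: 1 mole of ideal gas

# Route 1: Clausius integral along a reversible isothermal doubling of volume,
# delta-S = n R ln(V_final/V_initial)
dS_clausius = n * R * math.log(2)

# Route 2: microstate counting. Each of the N molecules can now be in either
# half of the doubled volume, so W grows by a factor of 2^N and
# delta-S = k_B ln(2^N) = N k_B ln 2.
N = n * N_A
dS_counting = N * k_B * math.log(2)

print(f"Clausius route: {dS_clausius:.2f} J/K")   # ~5.76 J/K
print(f"Counting route: {dS_counting:.2f} J/K")   # same number
print(f"In bits: {dS_counting / (k_B * math.log(2)):.3e}")  # equals N, ~6.02e23 bits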

  29. Sal,

    You walked right into Mung’s trap. (Mung, did that question come from one of Ben-Naim’s books?)

    You claim, along with Lambert and “Dr. Mike”, that entropy is a measure of energy dispersal. You calculated the entropy change for the V to 2V expansion at three different temperatures and got the same answer for all three. That’s correct.

    What you failed to notice is that the energy dispersal differs among all three.

    Energy varies with temperature, so the gas has the least energy at 20 degrees, more at 100 degrees, and the most at 500 degrees. Because the expansion is isothermal, the temperature (and thus the energy) remains unchanged in each case.

    Since different amounts of energy are being spread over the same new volume, energy dispersal also differs.

    The change in entropy is the same, but the energy dispersal differs. Therefore entropy cannot be a measure of energy dispersal.

    Your aim is uncannily good — but only when you’re pointing the gun at your foot.
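
The disagreement is easy to put into numbers. A minimal Python sketch, assuming 1 mole of monatomic ideal gas initially occupying 0.025 m^3: the entropy change of the isothermal V to 2V expansion is the same at every temperature, while the change in energy per unit volume is not.

import math

R = 8.314462          # J/(mol*K)
n = 1.0               # assumed: 1 mole of monatomic ideal gas
V = 0.025             # assumed initial volume, m^3

for T_C in (20.0, 100.0, 500.0):
    T = T_C + 273.15                      # K
    U = 1.5 * n * R * T                   # internal energy of a monatomic ideal gas, J
    dS = n * R * math.log(2)              # isothermal V -> 2V expansion
    d_density = U / (2 * V) - U / V       # change in energy per unit volume, J/m^3
    print(f"{T_C:5.0f} C: delta-S = {dS:.2f} J/K, "
          f"change in U/V = {d_density:+.0f} J/m^3")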

  30. stcordova: Delta-S = N kB ln(2)

    [snip]

    So Mung can now show the readers how expanding gas increases Mung’s ignorance, then Mung can quantify his ignorance in bits (or whatever measure), and then convert his bits of ignorance into the right answer expressed in Joules/Kelvin.

    Let’s see. Each of the N molecules could be in either the left-hand side or the right-hand side. So our ignorance about the true microstate has increased by a factor of 2^N.
    That’s N bits
    = N ln(2) nats
    = N kB ln(2) J/K.
    Oh. That was really easy. Remember Sal, W is the number of possible microstates that can produce the given macrostate…

  31. keiths: Mung, did that question come from one of Ben-Naim’s books?

    Yes. In one of his books (Entropy and the Second Law: Interpretation and Misss-Interpretations) he specifically addresses the “spreading” metaphor.

  32. keiths: For Damon the Demon, who knows the exact microstate, the entropy is zero.

    walto:

    Again, this is the reductio of your position. Your Laplacian genius can’t tell the entropy of any state from any other. It makes the entire concept useless.

    It’s exactly the opposite. It shows that your position is absurd and that it renders entropy comparisons useless.

    Here’s why. According to you, Xavier and Yolanda both calculate the wrong entropy value, but Yolanda’s value is closer to being correct because she has more information than Xavier — she knows about the isotope distribution, while he doesn’t. By that reasoning, Damon’s answer is better than both Xavier’s and Yolanda’s, because he has more information than either of them. In fact, his information is perfect — he knows the exact microstate. No observer could do better. Therefore, by your logic, Damon’s entropy value — zero — is the correct value.

    I hope you realize that that’s actually question-begging again. Anyone who doesn’t agree that entropy is a measure of ignorance will also not agree that Damon’s answer (i.e., the correct answer) is zero.

    It doesn’t follow from the fact that Yolanda’s answer is lower than Xavier’s that the correct answer is zero. Duh.

    Are you really this stupid?

  33. Let’s see. Each of the N molecules could either be in the left hand side, or the right hand side. So our ignorance about the true microstate has increased by 2^N.

    But Mung’s personal ignorance can increase by more than a factor of 2^N, since he can be more ignorant than that. In fact, if he’s on the other side of the world, he may not even know the gas is expanding or contracting. You want to teach thermodynamics that way, go ahead. But I’m not tying thermodynamic quantities to people’s level of knowledge, especially Mung’s.

    Keiths:
    What you failed to notice is that the energy dispersal differs among all three.

    What makes you think I wouldn’t notice? With respect to the question, absolute dispersal is not needed to compute change in dispersal.

    At each temperature the absolute dispersal is different, but the change in dispersal is the same independent of temperature of each process. Magical isn’t it? And this is reflected in the answer given.

    Remember Sal

    More condescending BS from DNA_jock. Pretending as if you can’t find the conversion factor (kB) in my writings over a year ago, like uh:

    2LOT and ID entropy calculations (editorial corrections welcome)

    1 nat = 1.381 x 10^-23 J/K

    Now you and Keiths do your ignorance procedure for the case of 500 copper pennies going from 298K to 373K without resorting to the Clausius definition first (as I did) and then doing a conversion back to bits, nits, or nats. Same for Keiths. Same for Mung.

    How are you now going to account for increased ignorance when internal energy increases, since you pooh-pooh accounting for the dispersal of energy? Not so straightforward as volume expansion, eh?

    You’re going to have to account for energy and the dispersal of energy around microstates at some point. I suppose you can avoid the formal definition of energy by indirectly using position and momentum (use this relation) then you’ll have to relate momentum changes back to changes in temperature, but most literature about temperature is with respect to energy not momentum. Ouch!

    And don’t cheat like I did using the Clausius integral (since I don’t start from the ignorance method like Keiths), which invokes energy changes like dq, as in:

    Integral (dq/T)

    You have to start with ignorance and end with ignorance and not reference energy, since you pooh-pooh the idea of how energy is spread around. Instead, show how your ignorance is spread around.

    I can use energy in my definition of entropy since I’m in the energy dispersal school of thought, and that makes the analysis easier.

    All Keiths has is his ignorance to work with, not energy. And that makes the analysis, and hence the pedagogical metaphor, much harder to work with.

    I suppose Keiths could just try to work with temperature, but that’s going to be hard since temperature scales are defined using concepts of energy, not ignorance.

    Go ahead, Keiths: try showing the change of entropy when there is an increase in internal energy in 500 copper pennies going from 298K to 373K. You can probably do it in principle, but it won’t be as tidy as the entropy of expansion. There’s a reason certain conventions have been adopted in the procedures of practical science.

    I chose the way of energy, you choose the way of ignorance. To each his own.

  34. I’m content with Yolanda’s answer, barring any improvements in equipment. It’s clearly better than both Xavier’s and zero.

    As I said, I don’t mind if you want to re-define (or better newly define) “k-entropy” as a measure of ignorance, but surely you don’t think you’re talking about the same thing Lambert is. You could just ask him.

    Do you honestly have no idea that you’re talking about something else?

  35. keiths:

    What you failed to notice is that the energy dispersal differs among all three.

    Sal:

    What makes you think I wouldn’t notice?

    If you did notice, that makes your error even more ridiculous.

    At each temperature the absolute dispersal is different, but the change in dispersal is the same independent of temperature of each process.

    No, the change in dispersal is different for each.

    Do the math, Sal.

    ETA: And show your work.

  36. Mung:

    Salvador’s lost it. =p

    So has walto.

    Sal tells us he’s “a mediocre student of science and engineering at best”.

    Walto describes himself as “scientifically ignorant”.

    Neither one hesitates to lecture the rest of us on thermodynamics.

  37. walto,

    I’m content with Yolanda’s answer, barring any improvements in equipment. It’s clearly better than both Xavier’s and zero.

    You told us that Yolanda’s answer is wrong.

    What is the right answer? You have all the information you need to do an entropy calculation.

  38. Note to Salvador. There is a worse sort of ignorance than any I might be displaying in this thread.

  39. Question for Sal and walto:

    If…

    a) thermodynamic entropy is a measure of the dispersal of energy, as you say,

    …and…

    b) it is not a measure of missing information about the microstate, as you also say,

    …then…

    c) why is it that entropy cannot be expressed in units of energy dispersal (joules per cubic meter),

    d) but can be expressed in bits?

  40. stcordova,

    We could also physically spread apart the particles in a gas at the same temperature. This is another notion of dispersal. This increases entropy. The trick is selecting the right formula to use to calculate the entropy. But those are the details left for a semester of agony in a thermodynamics and statistical mechanics class.

    This seems to be a good way to think about it. The Boltzmann equation seems impractical to me, and perhaps the cause of what has misled science. A practical step is to match the math with the experimental evidence. Another choice is to hopelessly try to imagine possible microstates that, according to your numbers, add up to thousands of orders of magnitude more possibilities than the number of atoms in our universe.

    Let Keiths show the readers the superiority of using ignorance over energy in teaching thermodynamics.

    Here is a simple question:

    Show change of entropy when 500 copper pennies go from 298K to 373K.

    But there is a problem. Temperature is described with reference to energy, not ignorance!

    See here:
    https://en.wikipedia.org/wiki/Temperature#Definition_of_the_Kelvin_scale

    So right off the bat, Keiths has to start off talking about energy, his starting point can’t be ignorance. He’s Stuck and Out of Luck (SOL).

    In any case, do your best Keiths. Explain for your prospective students how the entropy change going from 298K to 373K can be described in terms of your increase in ignorance with minimal recourse to changes (dispersal) of energy.

    I did the calculations last year here at TSZ because I can appeal to the changes in energy since I use accepted, time-tested approaches to calculating entropy which are at their root an analysis of dispersal of energy.

    You however want to impose your methods of ignorance on me. No dice Keiths.

  42. keiths,

    Well, if I were trying to convince walto of the information approach I’d probably try to show that you get the same results using SMI that you get using any of the other ways of determining the entropy, not that different people get different results, lol.

    my .02c

    I’m a novice when it comes to thermodynamics, I admit it. So I have to stay humble.

  43. DNA_Jock:

    Each of the N molecules could either be in the left hand side,

    Why? Because they are more DISPERSED than before? Hahaha!

  44. keiths:
    Mung:

    So has walto.

    Sal tells us he’s “a mediocre student of science and engineering at best”.

    Walto describes himself as “scientifically ignorant”.

    Neither one hesitates to lecture the rest of us on thermodynamics.

    In my own case that’s because your position has nothing to do with thermodynamics. It’s a proposal to change a definition used in thermodynamics to give it an epistemic meaning. It’s just bad, confused philosophy.

    If you could defend this position I’d give you a pass. But so far, nothing.

  45. Mung,

    Well, if I were trying to convince walto of the information approach I’d probably try to show that you get the same results using SMI that you get using any of the other ways of determining the entropy, not that different people get different results, lol.

    He’s disputing observer dependence, so of course I have to show that different people can get different (but correct) results.

  46. stcordova:
    Let Keiths show the readers the superiority of using ignorance over energy in teaching thermodynamics.

    Here is a simple question:

    But there is a problem. Temperature is described with reference to energy, not ignorance!

    See here:
    https://en.wikipedia.org/wiki/Temperature#Definition_of_the_Kelvin_scale

    So right off the bat, Keiths has to start off talking about energy, his starting point can’t be ignorance. He’s Stuck and Out of Luck (SOL).

    In any case, do your best Keiths. Explain for your prospective students how the entropy change going from 298K to 373K can be described in terms of your increase in ignorance with minimal recourse to changes (dispersal) of energy.

    I did the calculations last year here at TSZ because I can appeal to the changes in energy since I use accepted, time-tested approaches to calculating entropy which are at their root an analysis of dispersal of energy.

    You however want to impose your methods of ignorance on me. No dice Keiths.

    Yes. He wants to change a useful definition to an apparently useless one. Why? None of us has any idea.

    In addition, he thinks he’s making some sort of point about thermodynamics by recommending a definition change. Obviously, that’s just a confusion on his part.
