In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully Dan Styer almost comes to the rescue as he criticized ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (a small shoe size usually means a toddler, and therefore a non-reader), yet shoe size is not reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So, in slight defense of Granville Sewell and the numerous other creationists (from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus) who equate entropy with disorder: when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger’s and Moran’s books aren’t on that list. Their books did, however, make a list of biochem textbooks judged by some Australian researchers to be decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates

Also there is Clausius:

delta-S = Integral (dq_rev / T)

where
delta-S = change in entropy
dq_rev = inexact differential of the reversibly exchanged heat q
T = absolute temperature
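
For readers who want to plug in numbers, here is a minimal Python sketch of both formulas. The input values are purely illustrative; only the constants and the formulas themselves are standard.

from math import log

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """S = k ln W: entropy in J/K for W equally probable microstates."""
    return k_B * log(W)

def clausius_delta_S(Q_rev, T):
    """For a reversible isothermal process the Clausius integral collapses to Q_rev / T (J/K)."""
    return Q_rev / T

# Illustrative numbers only (not tied to any particular system):
print(boltzmann_entropy(1.0e30))      # ~9.5e-22 J/K
print(clausius_delta_S(1728.9, 300))  # ~5.76 J/K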

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. keiths: Damon learns the exact microstate by observing it

    And his knowledge before his observation of the microstate? And his observation of the macrostate? And his knowledge before his observation of the macrostate?

    And we know his observations are always correct because you stipulate that his observations are always correct? And you know this how?

    And Damon observes the microstate at all times, and the microstate is the same at all times? So who the hell needs statistical thermodynamics or statistical mechanics?

    The microstate is always changing. Does Damon know this and just ignore it when he determines the exact microstate? And Damon’s knowledge before his observation of the microstate?

    And Damon’s knowledge of the macrostate, where does that come from? Does it come from observing the state of the macrostate variables? How does Damon know that the microstate he observed is the same as the macrostate he observed?

    Apart from your declaration that it is so?

  2. keiths, do you agree that the microstate is changing, even at equilibrium?

    If so, what Damon observes to be the microstate does not depend on his knowledge of the macrostate, and what Damon observes to be the macrostate does not depend on his knowledge of the microstate?

  3. Mung,

    1.) Your position on what is or is not a process in thermodynamics.

    What’s the error?

    2.) Your position that all information is measured in bits.

    Read the exchange again, Mung:

    Gell-Mann:

    When it is known only that a system is in a given macrostate, the entropy of the macrostate measures the degree of ignorance about which microstate the system is in, by counting the number of bits of additional information needed to specify it, with all the microstates in the macrostate treated as equally probable.

    Mung:

    Would that “additional information” be “thermodynamic information” measured in “thermodynamics bits”? Don’t go all Salvador on me!

    [Trust me, Gell-Mann is no Salvador.]

    keiths:

    Information is measured in bits, Mung. Do you think your checking account information is measured in “checking account bits”?

    Mung:

    No, it is not. You are simply wrong. And this is further evidence that you are in fact conflating the colloquial sense of information with the Shannon measure of information. No one knows how to measure colloquial information. People can’t even agree on a definition of colloquial information.

    Gell-Mann isn’t talking about “colloquial information”. He’s talking about “the number of bits of additional information needed to specify it [the microstate]”.

    “Bits of additional information.” Why, that sounds like something that could be measured in bits, doesn’t it?

  4. Mung,

    You haven’t answered this question:

    Mung:

    Damon is not calculating the entropy. One of the other two is simply wrong, because his/her belief about the macrostate is false.

    keiths:

    Which one, and why?

  5. Walto hasn’t answered it either.

    If the two of you are correct about entropy being observer-independent, you should be able to tell us who (if any) of Xavier, Yolanda, and Damon gets the correct value of entropy, and why.

    If the answer is “none of them”, you should be able to tell us how one could go about determining the correct answer.

  6. keiths:

    The Clausius equation only works in certain circumstances.

    It doesn’t work for the gas-mixing case we’ve been discussing, for instance. The process isn’t reversible and there is no dQ or ΔQ. If you try to use the equation, you’ll get an entropy change of zero, which is incorrect.

    Mung:

    It’s Salvador’s subjective model of choice. If X, Y, and D can all be right, why can’t Sal be right?

    Equations aren’t macrostates. Do you actually believe that if entropy is observer-dependent, it means that no one can get an incorrect answer? Not even if they apply an equation incorrectly?

    Ditto for this:

    It is or it is not subjective? Assume Salvador is just a “weird” observer. Salvador’s observation is different from yours. Which of you is wrong? And why?

  7. Mung,

    Yes, the microstate is constantly changing, and yes, Damon knows it at each instant. How? He exercises his demonic superpowers. How does that work? Who knows, and who cares? It’s a thought experiment, and no one is claiming that Damon is real.

    The question is, if an observer like Damon existed who knew the exact microstate at all times, what would the entropy be? The answer, of course, is zero.

    If only one microstate at a time is epistemically possible for Damon, then W is always 1 for him and kb ln W is equal to zero. There is no missing information between his macrostate and the exact microstate.

  8. Sal,

    You have the teachability of a rock. I’ve already addressed every point you raised in your comment.

    In any case, we could, if we wanted, use the log base 2 instead of natural log, and we can also drop Boltzmann’s constant kB and define a Shannon-like entropy:

    S = log2(W) = log2(9.52 x 10^31) = 106.23 Shannon bits

    Thus we can force-fit information theory type constructs on entropy just by adopting a different logarithmic measure and dropping kB. Big deal!

    It isn’t a force-fit. Entropy is naturally measured in units of information. The force-fitting comes when you bring kb into the equation, and that only happens because people want to get the J/K units that are standard in thermodynamics.

    As explained earlier in the thread, the J/K units are an artifact of history, due to the fact that the kelvin was defined before atomic theory was established. Otherwise temperature would be defined in terms of energy and the energy terms in kb would cancel, leaving a dimensionless constant.
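
    A minimal sketch of the unit conversion being discussed here: dividing a thermodynamic entropy by kB ln 2 strips the J/K and leaves bits, and log2 of the microstate count gives the same number. The W value is the single-helium-atom figure quoted above.

    from math import log

    k_B = 1.380649e-23  # Boltzmann's constant, J/K

    def entropy_to_bits(S):
        """Convert a thermodynamic entropy S (J/K) into bits: S / (k_B * ln 2)."""
        return S / (k_B * log(2))

    W = 9.52e31                 # microstate count quoted above for one helium atom
    S = k_B * log(W)            # ~1.02e-21 J/K
    print(log(W, 2))            # ~106.2 bits, straight from the microstate count
    print(entropy_to_bits(S))   # the same ~106.2 bits, recovered from the J/K value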

    I fail to see why all the hoopla and deepity woo observer-dependent ignorance and information theory in the definition of entropy.

    I agree that you don’t get it. You’re rejecting something that you don’t understand.

    Amusingly, Harvey Leff, whom you cited as an authority earlier in the thread, gets it:

    A very important insight that emerges from Jaynes’ work is that thermodynamic entropy can be interpreted as a measure of missing information, or equivalently, uncertainty. This is an interpretation that has a solid basis in mathematics, unlike the poor, but too commonly used disorder metaphor.

    Sal:

    Keiths can’t seem to comprehend, the number of microstates is not experimentally possible to establish in many cases without the energy dispersal parameters defined by HEAT and TEMPERATURE just as Dr. Mike pointed out.

    And you can’t seem to comprehend that of course heat and temperature are relevant when doing thermodynamic calculations, but that doesn’t make entropy a measure of energy dispersal. I’ve given six reasons why. You haven’t refuted a single one of them.

    Keiths and DNA_jock are too much of dabblers to contend with Frank Lambert and Dr. Mike who actually taught these subjects in University. They’re just eager to disagree with me, especially DNA_jock who resorts to misreading and grabbing at straws in attempt to claim I don’t understand this stuff.

    Keiths disagreed with Dr. Mike as well, who was a professional thermodynamicist working on extremely cold refrigerators.

    Do you recognize how dumb that argument is, Sal? Lambert and “Dr. Mike” taught thermo, and “Dr. Mike” worked on extremely cold refrigerators, therefore neither can be mistaken?

    I pretty much diss the Keiths, DNA_Jock school of entropy. I can’t imagine one will go very wrong following Frank Lambert and Dr. Mike.

    I’ve shown why you do go wrong following Lambert and Elzinga, and you haven’t refuted a single one of my six points against dispersalism. Get busy, Sal.

  9. walto,

    At this point, I think there’s probably just keiths, looking for somebody to fight with.

    I don’t have to look very hard. You’ve disagreed with me on each of these nine points:

    walto,

    You asked for a summary:

    I think it might be helpful for you to summarize what you think our differences are.

    I gave you one:

    You’ve disagreed with me on every single one of the following points:

    1. Entropy is not a measure of energy dispersal.
    2. Entropy is a measure of missing information regarding the microstate.
    3. Entropy is observer-dependent.
    4. Entropy is not a function of the system alone.
    5. Entropy is a function of the macrostate.
    6. The entropy is zero for an observer who knows the exact microstate.
    7. The second law is not violated for such an observer.
    8. Stipulating an observer-independent entropy forces it to be zero for everyone, rendering it useless for entropy comparisons.
    9. The ‘missing information’ interpretation of entropy works in all cases.

    I’m sure I’d find more if I were to reread the thread.

    You seem to have reversed your position on #1. As far as I can tell, you now agree that a) entropy is not a measure of energy dispersal, and b) that Lambert’s definition is therefore incorrect. (I’ve asked you directly, but you won’t give a direct answer.)

    What about the others? Do you still disagree with #2 through #9, or have you changed your mind on those, also?

    We can now add a tenth:

    10. Any information that helps us narrow down the exact microstate is “admissible” when calculating entropy.

    After all, if you gain more information about the microstate, then less of it is missing. Less missing information means less entropy.

  10. keiths: If the answer is “none of them”, you should be able to tell us how one could go about determining the correct answer.

    The correct answer is the one that comes from the observer with one piece of information less than Damon.

    All other answers are correct to the extent that they correspond to the correct one

    peace

  11. keiths: No, because Damon learns the exact microstate by observing it, and humans learn the order of a randomly shuffled deck the same way — by observing it. Either way, it has nothing to do with “knowing everything about yourself”.

    A deck of cards in which you know the order is associated with you by observation. In a sense It’s “your” deck. To know the order of this particular deck among all possible ones we need to know who you are.

    keiths: You can define it as one, in which case the other microstates would be created by cutting and splicing. I have no idea why you think a video of a tornado is relevant, however.

    Because according to this perspective we are entitled to say that structures don’t move from less organization to more organization and Sewell has a point. contrary to what you asserted earlier.

    peace

  12. Mung: keiths, do you agree that the microstate is changing, even at equilibrium?

    I like this question. What sort of equilibrium?

    Damon has been specified only as a guy who “knows everything.” Well, then, for all we know, every microstate may always be changing from Damon’s perspective, even if the temperature is at zero Kelvin. Maybe microstates undergo forever undetectable-by-human changes (say, half the electrons start singing very softly in Mandarin for occasional three-second intervals) at zero degrees. {The intervals are irregular but the duration is always three seconds from wherever Damon is perched! And the words, translated from Mandarin to Damonic to English are usually, but not always, “You’re the one that I want.”}

    There’s an interesting entailment there!

    One other thing worth noting: This is a false quadiotomy regarding thermodynamic entropy:

    1. Xavier is right when he calculates it.
    2. Yolanda is right when she calculates it.
    3. Damon is right using all the information in the universe about everything and assuming whatever he doesn’t know about every microstate is zero.
    4. We can say precisely what the correct one is.

    We get closer as our equipment gets better, but, because of the limitations of measurement abilities, we can never get it precisely “right.”

    That’s roughly what FMM is saying but his “One missing piece” business doesn’t make any sense to me.

  13. stcordova:
    [catastrophic <gggg> description of Gibbs snipped]
    So for all of Keiths and DNA_jock saying Lambert is wrong, they can’t even get on first base for many experimental applications without energy dispersal parameters. Keiths and DNA_jock are too much of dabblers to contend with Frank Lambert and Dr. Mike who actually taught these subjects in University. They’re just eager to disagree with me, especially DNA_jock who resorts to misreading and grabbing at straws in attempt to claim I don’t understand this stuff.

    Keiths disagreed with Dr. Mike as well, who was a professional thermodynamicist working on extremely cold refrigerators.

    I pretty much diss the Keiths, DNA_Jock school of entropy. I can’t imagine one will go very wrong following Frank Lambert and Dr. Mike.

    Sal, I never said Lambert or Elzinga were wrong. I pointed out that their view relies on a particular get-out-of-jail free card, which is needed to answer the red vs ever-so-slightly-darker-red gas mixing paradox.
    You, still, show no sign whatsoever of understanding their position, and the assumption they rely on.
    I find it somewhat awkward that when you move past basic High School (Clausius) concepts of entropy, you start making conceptual howlers.
    Where does the specific heat capacity of copper come from?
    Your temperature = f(added energy) plot shows negative temperatures; please explain in your own words how that arises…

  14. walto,

    I just wanted to make it clear that I think Damon is CAPABLE of calculating the correct thermodynamic entropy. It’s just that he won’t if he uses unrestricted Shannon information. With the extra, irrelevant knowledge he instead calculates a Shannon entropy (always zero), but it won’t be thermodynamics, and it’s a useless quantity.

  15. Adding to the thoughts where I related Boltzmann to information theory and “ignorance theory” here:

    http://theskepticalzone.com/wp/in-slight-defense-of-granville-sewell-a-lehninger-larry-moran-l-boltzmann/comment-page-17/#comment-147073

    I earlier provided an Excel spreadsheet where one can use the Sackur-Tetrode approximation to calculate the absolute entropy and the absolute number of microstates for an ideal monatomic gas like helium. Here is the spreadsheet again.

    http://creationevolutionuniversity.org/public_blogs/skepticalzone/absolute_entropy_helium.xls

    If you put

    1 / (6.0221409 x 10^23)

    in the cell for number of moles, then the spreadsheet will calculate the entropy for a single particle.

    One will note that if the volume is increased, the entropy (and hence number of microstates) increases. If one shrinks the volume, the entropy (and hence number of microstates) decreases.

    And also, if the temperature is increased, the entropy (and hence number of microstates) increases. If one lowers the temperature, the entropy decreases.

    So increasing the temperature, and hence velocity, and hence kinetic energy of the particle, results in the number of microstates increasing and hence entropy increasing.

    So now I’ve provided an example of how change in temperature and volume change the number of microstates in a system.
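
    For anyone who wants to check these claims without the spreadsheet, here is a minimal Python sketch of the Sackur-Tetrode formula for a single helium atom. It assumes an ideal monatomic gas; the constants are standard CODATA values, and the helium mass is the only system-specific input.

    from math import log, pi, exp

    k_B = 1.380649e-23                 # Boltzmann's constant, J/K
    h = 6.62607015e-34                 # Planck's constant, J*s
    m_He = 4.002602 * 1.66053907e-27   # mass of one helium atom, kg

    def sackur_tetrode(N, V, T):
        """Absolute entropy (J/K) of N ideal monatomic gas atoms in volume V (m^3) at T (K)."""
        n_Q = (2 * pi * m_He * k_B * T / h**2) ** 1.5   # quantum concentration, 1/(thermal wavelength)^3
        return N * k_B * (log((V / N) * n_Q) + 2.5)

    S1 = sackur_tetrode(1, 1.0, 300.0)
    print(S1)                     # ~1.017e-21 J/K for one atom in 1 m^3 at 300 K
    print(exp(S1 / k_B))          # ~9.5e31 microstates
    print(S1 / (k_B * log(2)))    # ~106 bits to specify one of them

    # Entropy (and the microstate count) rises with volume and with temperature:
    print(sackur_tetrode(1, 2.0, 300.0) - S1)   # +k_B*ln(2) for doubling the volume
    print(sackur_tetrode(1, 1.0, 600.0) - S1)   # an even larger increase for doubling T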

    So it is easy to state the change in entropy by saying things like, “increases in temperature and/or increases in volume increase the number of microstates, therefore increase entropy.”

    It is rather convoluted, although not necessarily inaccurate, to say our ignorance of the system increases as volume and temperature increase, therefore entropy increases. Or to say that the information required to specify each microstate increases as volume and temperature increase; but this is hardly succinct or clear, and it borders on creating unnecessary word salads. If one wishes to sound profound about trivial things, this is one way to go about it. Personally, I don’t think it is that helpful.

    When we say “ignorance” we mean to say, we are not sure specifically which microstate the particle is in. When there are more microstates, we become even more uncertain which microstate the particle is in, hence we can say we increased “ignorance” of the system by increasing temperature or volume.

    But that’s an awfully convoluted way of saying things! Another convoluted way of saying things is to describe the “ignorance” in terms of information theory. When we have a lot of ignorance (or microstates) we can say that we need X bits of information to exactly specify the microstate the particle is in. But that is yet again another convoluted way of saying things. If one wishes to sound profound by making much ado about nothing, invoking information theory just to count microstates is a way to do business. I don’t like doing business this way; maybe Kairos Focus at UD likes doing business that way, but I don’t.

    We can compute the entropy of a single helium atom in 1 cubic meter at 300K (as I calculated earlier). If we described entropy in terms of ignorance, we would say the entropy of a single helium atom at 300 K in 1 cubic meter of volume amounts to:

    106.23 bits of ignorance

    or equivalently each microstate has 106.23 bits of information needed to sufficiently specify it.

    But this again is a convoluted way of just saying: “the particle exists in 1 of the 9.52 x 10^31 possible microstates.” No need to get into a deepity woo, nebulous description of entropy.

    In principle, I suppose an observer can insist on a new method of counting microstates that goes beyond heat, volume and temperature.

    The number of microstates is observer-dependent only in the sense of how the microstates are defined.

    If much of the industry defines entropy in terms of heat and temperature, then as far as that definition of entropy is concerned, entropy isn’t ultimately observer-dependent; it is environment-dependent. As long as people agree on a convention, the parties using that convention are expected to compute the same amount of entropy.

    If they wish to define entropy by a different set of observables, that is simply a change in conventions; it’s not as if there is some purely subjective quality to the physical phenomenon.

    If people choose to see photos in black and white, they agree on a convention in the way they perceive the world through those pictures. It doesn’t mean the world inherently lacks color. That’s all there is to it as far as observer dependent definitions of entropy. As long as the conventions by all parties are known, they should get objective results.

  16. stcordova,

    Again, it seems you and Albert agree 🙂

    In 1904, and thereafter, German physicist Albert Einstein began to repeatedly criticize the formula S = k log W; in 1910, Einstein had the following to say on the matter:

    “The equation S = k log W + const appears without an elementary theory—or however one wants to say it—devoid of any meaning from a phenomenological point of view.”
    — Albert Einstein (1910), popular 2007+ re-quote version

  17. colewd,

    More on Einstein’s view.

    “Usually W is set equal to the number of ways (complexions) in which a state, which is incompletely defined in the sense of a molecular theory (i.e. coarse grained), can be realized. To compute W one needs a complete theory (something like a complete molecular-mechanical theory) of the system. For that reason it appears to be doubtful whether Boltzmann’s principle alone, i.e. without a complete molecular-mechanical theory (Elementary theory) has any real meaning. The equation S = k log W + const. appears [therefore] without an Elementary theory—or however one wants to say it—devoid of any meaning from a phenomenological point of view.”
    — Albert Einstein (1910), Ezechiel Cohen 2005 abbreviated translation

    “Usually W is put equal to the number of complexions…. In order to calculate W, one needs a complete (molecular-mechanical) theory of the system under consideration. Therefore it is dubious whether the Boltzmann principle has any meaning without a complete molecular-mechanical theory or some other theory which describes the elementary processes. The formula seems without content, from a phenomenological point of view, without giving in addition such an Elementartheorie.”
    — Albert Einstein (1910), Abraham Pais 1982 abbreviated translation

  18. walto: We get closer as our equipment gets better, but, because of the limitations of measurement abilities, we can never get it precisely “right.”

    That’s roughly what FMM is saying but his “One missing piece” business doesn’t make any sense to me.

    Let me engage in some thought-experiment speculation to help me wrap my head around all this. I’m not making claims at this point, just exploring.

    Let’s say Damon reveals the exact microstate of an isolated system to another person, “Legolas”, at times T0 and again at T1.

    If Legolas were to calculate the increase in entropy between T0 and T1 using what Damon told him, he would come up with a quantity of zero.

    Now suppose Legolas wanted to calculate the increase between T1 and T2

    Legolas doesn’t know the microstate at T2. He can make an excellent guess based on the knowledge he has of the evolving system, but there will always be some level of ambiguity, possibly from random fluctuation, that Legolas will experience as an increase in entropy.

    This is true for any time after T2 as well.

    Damon can only share with Legolas a finite amount of information and the shared knowledge of the microstate is only perfectly valid for a limited time.

    I think I would argue that, since Damon is not really doing thermodynamics, Legolas has the best possible measurement of entropic change at T2. Every other measurement would be correct to the extent it corresponded to that of Legolas.

    Legolas’s answer would be the standard as long as Damon continues to fill him in.

    peace

  19. fifth,

    Let’s say Damon reveals the exact microstate of an isolated system to another person, “Legolas”, at times T0 and again at T1.

    If Legolas were to calculate the increase in entropy between T0 and T1 using what Damon told him, he would come up with a quantity of zero.

    Yes, assuming equilibrium at t0 and t1. The number of epistemically possible microstates would be one at both times, so the logarithm would be zero, making the entropy zero. Zero minus zero is zero, so ΔS would be zero as well.

    Now suppose Legolas wanted to calculate the increase between T2 and T3

    Legolas doesn’t know the microstate at T3.

    That depends on Legolas’s computational abilities and whether the system is deterministic. If it is, and if Legolas is up to the task, then he knows the exact microstate at both t2 and t3 (by simulating forward from t1). In fact, he wouldn’t even need Damon to tell him the microstate at t1. All future microstates would be implicit in the description Damon gave him about the microstate at t0, and entropy would always be zero for Legolas.

    If the system weren’t deterministic, then Legolas’s uncertainty about the microstate would increase over time, and his assessment of the entropy would follow suit.

    He can make an excellent guess based on the knowledge he has of the evolving system, but there will always be some level of ambiguity, possibly from random fluctuation, that Legolas will experience as an increase in entropy.

    Yes, assuming a non-deterministic system. Legolas’s uncertainty could also increase due to inadequate computational resources.

    I think I would argue that since Damon is not really doing thermodynamics…

    I think it’s silly to argue that Damon isn’t doing thermodynamics. He’s calculating the thermodynamic entropy, after all. The fact that the value is zero is not disqualifying. Someone calculating an entropy at absolute zero will also get a value of zero, but that doesn’t mean that s/he isn’t doing thermodynamics.

    …that Legolas has the best possible measurement of entropic change at T3. Every other measurement would be correct to the extent it corresponded to that of Legolas.

    That doesn’t make sense. If more information gives better, more accurate entropy values, as you are arguing, then the observer with the greatest possible information about the microstate — Damon — has the best answer for the entropy value: zero.

    Second, your thought experiment doesn’t place any restrictions on t0-t3 (beyond the implicit assumption that the names reflect the order). Suppose t2 and t3 are millions of years later than t1, and that the system is non-deterministic. Would you seriously argue that Legolas “has the best possible measurement of entropic change at T3”? Obviously not.

  20. colewd,

    Why are you so obsessed with what Einstein thought about Boltzmann’s equation more than a hundred years ago?

  21. keiths:

    If the two of you are correct about entropy being observer-independent, you should be able to tell us who (if any) of Xavier, Yolanda, and Damon gets the correct value of entropy, and why.

    If the answer is “none of them”, you should be able to tell us how one could go about determining the correct answer.

    fifth:

    The correct answer is the one that comes from the observer with one piece of information less than Damon.

    Exactly how big is a “piece” of information in that context, and how do you justify your answer? Is it the same for all systems in all states? If so, do you realize that it renders entropy just as useless as it would be if it were zero all the time?

  22. fifth:

    To know the exact microstate would entail knowing everything about ourselves. I don’t think that is possible.

    keiths:

    That’s as silly as saying that I can’t know the order of a deck of cards because I “don’t know everything about myself”.

    fifth:

    No, It’s like saying that you can’t know the order of a shuffled deck of cards if no one has observed them.

    keiths:

    No, because Damon learns the exact microstate by observing it, and humans learn the order of a randomly shuffled deck the same way — by observing it. Either way, it has nothing to do with “knowing everything about yourself”.

    fifth:

    A deck of cards in which you know the order is associated with you by observation. In a sense It’s “your” deck. To know the order of this particular deck among all possible ones we need to know who you are.

    No, we don’t, and we certainly don’t need to know “everything about me”, as you claimed.

    Come on, fifth.

  23. keiths: He’s calculating the thermodynamic entropy, after all.

    Excellent question beg!

    Gell-Mann–Yes!
    Einstein–No!

    As you’re voting, here’s a quote from E.T. Jaynes (1965):

    There is no end to this search for the ultimate “true” entropy until we have reached the point where we control the location of each atom independently. But just at that point the notion of entropy collapses, and we are no longer talking thermodynamics.

  24. I will say, though, that I think it’s too strong to say that the whole “notion of entropy collapses”. Shannon entropy is still operative in that limiting case. But the notion of thermodynamic entropy certainly goes bye-bye.

  25. walto,

    It’s odd that you’d cite Jaynes on this, since his quote undermines the argument you’ve been making.

    More later. In the meantime, think about it.

  26. keiths,

    Because nothing has changed to legitimize Boltzmann’s misleading equation. Until you can find a way to model and measure W, Einstein’s comments are still valid.

  27. Walto:

    As you’re voting, here’s a quote from E.T. Jaynes (1965):

    There is no end to this search for the ultimate “true” entropy until we have reached the point where we control the location of each atom independently. But just at that point the notion of entropy collapses, and we are no longer talking thermodynamics.

    Just a comment regarding the Jaynes quote.

    It is remarkable that the Clausius notion of entropy, defined purely in terms of energy and temperature, is related to the Boltzmann notion of entropy in terms of microstates. This is a counterintuitive relationship, and it is possible that Clausius himself didn’t realize this at the time.

    As a student learning these things, I was always confused to see the apparently conflicting definitions of entropy (Clausius and Boltzmann), and it was only much later that I saw they were equivalent (for the most part, anyway, for practical applications).

    Going back to the single particle in a container:
    http://theskepticalzone.com/wp/in-slight-defense-of-granville-sewell-a-lehninger-larry-moran-l-boltzmann/comment-page-17/#comment-147073

    Suppose we had the single helium particle in 2 cubic meters of volume at 300K. Using the Sackur-Tetrode approximation, the absolute entropy (using the Excel spreadsheet I provided) is:

    entropy at 2 cubic meters = 1.026186 x 10^-21 J/K

    entropy at 1 cubic meter = 1.016616 x 10^-21 J/K

    The change in entropy when we compress the volume the particle occupies at the same temperature from 2 cubic meters to 1 cubic meter is:

    entropy change (a decrease) during compression = 9.5699 x 10^-24 J/K

    That means if we do this compression at 300K, we have to expend ON AVERAGE:

    9.5699 x 10^-24 J/K x 300K = 2.87098 x 10^-21 J

    of energy to confine the particle to a smaller volume during an isothermal (same temperature) process.

    The moral of the story, it takes energy to reduce uncertainty.

    Energy must ON AVERAGE be expended to reduce the number of Boltzmann/Gibbs microstates isothermally. By reducing the number of microstates we reduce the uncertainty in the details of the particle in terms of the phase space coordinates (X, Y, Z, P_x, P_y, P_z). Reducing uncertainty is increasing the information we have about the system.

    But as I said, I don’t like using uncertainty, ignorance, information theory to describe entropy. It is a convoluted description of what entropy is. That is why I focused on numerical examples to help de-mystify what entropy is.

    Now, one might ask why it takes energy to reduce entropy in an isothermal context for this single helium atom getting compressed into a smaller volume. Suppose we do this simply by compressing a piston from 2 cubic meters down to 1 cubic meter. As the piston moves, it will have a velocity associated with it, and as it collides with the helium atom it will add to the helium atom’s velocity ON AVERAGE, hence increasing the energy of the helium atom; that is the energy that is expended to get the helium into 1 cubic meter.

    If we scale up the above example to 1 mole of helium atoms, the change in entropy is the familiar 5.76 J/K from previous examples.

    That means at 300K when we compress 1 mole of helium from 2 cubic meters to 1 cubic meter, it will take

    5.76 J/K x 300K = 1728.8 J

    of work/energy to compress the helium isothermally. Also, that is how much heat must be dumped out during the compression process (also 1728.8 J of heat).
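
    A minimal sketch of the arithmetic above, assuming ideal-gas behavior: halving the volume at constant T costs k ln 2 of entropy per particle, or R ln 2 per mole.

    from math import log

    k_B = 1.380649e-23   # Boltzmann's constant, J/K
    R = 8.314462618      # gas constant, J/(mol*K)
    T = 300.0            # K

    dS_atom = k_B * log(2)   # entropy drop per atom when the volume is halved
    dS_mole = R * log(2)     # entropy drop per mole

    print(dS_atom, T * dS_atom)   # ~9.57e-24 J/K and ~2.87e-21 J of work per atom
    print(dS_mole, T * dS_mole)   # ~5.76 J/K and ~1729 J of work (and heat dumped) per mole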

    So to reduce the amount of entropy in a system, we dump off heat. Dumping off heat reduces uncertainty (ignorance) about the details of the system, but this is a convoluted way of looking at things, imho.

    One usually can’t start with a description of our ignorance of a system; we have to start off with things like heat and temperature and then compute the ignorance or some description of information. Hence, for most practical applications, one doesn’t frame entropy with information theory, since it takes data on temperature (measured by thermometers) and heat (measured by calorimeters, or indirectly by scales, barometers, whatever) to compute the amount of ignorance or information in the system in the first place. We are thus stuck with taking inventory of energy dispersal, and that’s why Keiths can’t calculate entropy using his purely information-theoretic approach without data on the energy of the system in the first place!

    So I think this information-theory definition of entropy is over-hyped, much ado about nothing that only has the veneer of something profound but is basically empty, and for practical applications in engineering and chemistry it’s a non-starter. Entropy, for most practical applications, is the pumping of heat in and out of a system at a given temperature. Even the mixing entropy can be framed in terms of heat pumped in and out of a system at a given temperature to REVERSE or undo the mixing.

    Mixing of two gases may not involve the addition of heat or expenditure of work, but un-mixing will, hence mixing entropy can be framed in terms of energy dispersal from that perspective.

    PS

    Keiths is on my ignore list. I will however address Keiths comments if you quote them and request I respond. At this point I find direct exchanges with him not worth an investment in time.

  28. We’re approaching 900 comments. Much of the fight involves this comment by Keiths

    http://theskepticalzone.com/wp/in-slight-defense-of-granville-sewell-a-lehninger-larry-moran-l-boltzmann/comment-page-1/#comment-144975

    Keiths:
    The energy dispersal doesn’t change during the mixing process. Energy is evenly distributed throughout both volumes both before and after mixing.

    But Q-reversible is the energy required to unmix the system! There will be heat dispersed into the environment to reverse the process.

    It’s not about the homogeneous distribution of energy before and after mixing. And as pointed out, this is an erroneous description if one gas is monoatomic and the other is diatomic! Keiths is so wrong on this, and thus DNA_jock and Mung are just as wrong for agreeing with Keiths!

    Keiths, Mung, and DNA_jock have been refuted via numerous angles, and here is yet another one.

    delta-S = Integral (dQ-reversible / T)

    Q-reversible during the isothermal expansion of a mole of gas from 1 cubic meter to 2 has the same magnitude as Q-reversible in the case of one mole of ideal gas being compressed from 2 cubic meters back to 1 cubic meter.

    Hence the word “reversible” is appended to Q. The magnitude of Q-reversible for 1 mole at 300K when expanding from 1 to 2 cubic meters, or compressing from 2 to 1 cubic meter isothermally is:

    1728.94 J

    See:
    http://web.mit.edu/16.unified/www/FALL/thermodynamics/notes/node33.html

    The entropy change then during a free expansion at 300K is just taking Q-reversible and dividing by temperature:

    delta-S = Q-reversible/T = 1728.94 J/ 300K = 5.76 J/K

    For 2 ideal gases mixing, this is like each of the 2 separate gases expanding from 1 cubic meter into the combined 2 cubic meters (or, reversing the mixing, compressing each gas from 2 cubic meters back down to 1).

    The mixing entropy is thus:

    (5.76 x 2) = 11.52 J/K

    as would be calculated using other approaches.
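
    A minimal sketch of that calculation, assuming one mole of each ideal gas and the 300K figure used above:

    from math import log

    R = 8.314462618   # gas constant, J/(mol*K)
    T = 300.0         # K
    n = 1.0           # moles of each gas

    Q_rev = n * R * T * log(2)   # reversible isothermal heat for doubling one gas's volume, ~1728.9 J
    dS_one_gas = Q_rev / T       # ~5.76 J/K

    dS_mixing = 2 * dS_one_gas   # two gases, ~11.5 J/K
    print(Q_rev, dS_one_gas, dS_mixing)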

    Worse for Keiths, he needs the energy dispersal parameters to compute the number of bits in his observer-dependent information theory approach.

    Entropy as “dispersal of energy” is a misconception. Let it go, Sal.

    No. And I’ve defended Lambert’s viewpoint. In contrast, you can’t even get to first base with your “entropy is a measure of ignorance” approach without utilizing energy dispersal parameters, like say for a melting ice cube. Too funny.

    All Keiths has to defend his claims are equivocations of the meaning of “energy dispersal”. He totally mischaracterizes the intended meaning of energy dispersal, and can’t even get the facts right.

    Energy is evenly distributed throughout both volumes both before and after mixing.

    That is not true in general, like the case where one gas is monoatomic and the other diatomic! That’s a clueless comment that hasn’t been retracted to my knowledge because I’ve gotten tired of reading what Keiths is writing.

  29. Oh, I see why colewd keeps bringing up Einstein. He actually thought Einstein was supporting Sal’s position.

    colewd, to Sal:

    Again, it seems you and Albert agree 🙂

    Um, no, colewd.

  30. How entropy can be used as a practical analysis tool of steam or other type engines can be seen from the familiar example of isothermal expansion of an ideal gas.

    As shown above, if we let the gas expand isothermally, the entropy change of 5.76 J/K during the expansion can be used to calculate the amount of “potential energy” lost if we allow free expansion instead of using the gas to push a piston. The work the gas could do on a piston during an isothermal expansion is:

    entropy change x temperature =

    delta-S x T =

    5.76 J/K x 300K = 1728.94 J of work

    By letting the gas expand freely, we lose the ability to do that much work.

    Hence the entropy change can be used to calculate the lost opportunity to do work. This shows the practical application of the concept of entropy, and as pointed out, in many cases the practical application can be realized without any reference to Keiths information theory but by simply using temperature and heat data.

    When someone says entropy is a measure of our ignorance of a system, I have to respond by asking “how do you measure ignorance, do you have an ignorance meter? I think thermometers and other instruments are better for the job than non-existent ignorance meters.”

  31. Sal,

    Keiths is on my ignore list. I will however address Keiths comments if you quote them and request I respond. At this point I find direct exchanges with him not worth an investment in time.

    Lol. Someone should quote my six points against dispersalism for Sal to answer. I think they were what scared him into putting me (or pretending to put me) on ignore in the first place.

    Here they are again, for convenience.

    From earlier in the thread, six reasons why the dispersalists are wrong:

    1. Entropy has the wrong units. Dispersalists and informationists agree that the units of thermodynamic entropy are joules per kelvin (in the SI system) and bits or nats in any system of units where temperature is defined in terms of energy per particle. It’s just that the dispersalists fail to notice that those are the wrong units for expressing energy dispersal.

    2. You cannot figure out the change in energy dispersal from the change in entropy alone. If entropy were a measure of energy dispersal, you’d be able to do that.

    3. The exact same ΔS (change in entropy) value can correspond to different ΔD (change in dispersal) values. They aren’t the same thing. Entropy is not a measure of energy dispersal.

    4. Entropy can change when there is no change in energy dispersal at all. We’ve talked about a simple mixing case where this happens. If entropy changes in a case where energy dispersal does not change, they aren’t the same thing.

    5. Entropy change in the gas mixing case depends on the distinguishability of particles — the fact that the observer can tell ideal gas A from ideal gas B. Yet the underlying physics does not “care” about distinguishabilty — the motion of the particles is the same whether or not they are distinguishable. If the motion of the particles is the same, then energy dispersal is the same.

    The entropy change depends on distinguishability, so it cannot be a measure of energy dispersal.

    6. Entropy depends on the choice of macrostate. Energy dispersal does not.

    The way energy disperses in a system is dependent on the sequence of microstates it “visits” in the phase space. That sequence depends only on the physics of the system, not on the choice of macrostate by the observer.

    In the Xavier/Yolanda example, Yolanda possesses equipment that allows her to distinguish between the two isotopes of gas X, which we called X0 and X1.

    If she chooses to use her equipment, she comes up with a different macrostate than Xavier and calculates a different entropy value. If she declines to use the equipment, she and Xavier come up with the same macrostate and get the same entropy value. Her choice has no effect on the actual physics, and thus no effect on the actual energy dispersal. Yet her choice does affect the entropy value she measures, and profoundly so.

    She is not measuring energy dispersal. She is measuring the information gap between her macrostate and the actual microstate.

  32. Sal:

    When someone says entropy is a measure of our ignorance of a system, I have to respond by asking “how do you measure ignorance, do you have an ignorance meter? I think thermometers and other instruments are better for the job than non-existent ignorance meters.”

    This idiotic statement is from a self-appointed “teacher” of entropy to fellow creationists and other IDers.

    Walto, since you’ve come around to accepting the missing information interpretation of entropy, how about answering Sal’s question? I’ll offer assistance if needed.

  33. Hi Sal,
    Entropy is a state function.
    Take two identical two-chambered vessels, with barely distinguishable gases on each side.
    For one vessel, remove the partition and allow the gases in the two chambers to mix thoroughly.
    We all agree that the entropy has increased by ~ N kB ln(2).
    Now, pump ALL of the energy out of both vessels.
    Now the “energy dispersal” in the two vessels is IDENTICAL.
    But the entropy still differs by N kB ln(2).
    Oh dear.

  34. colewd,

    Thank you for the kind words.

    I’m basically ignoring 3 of the commenters (Mung, Keiths, DNA_jock) because they got boring and repetitive and didn’t offer much insight. They can’t even compute their information theoretic claims for college chemistry exams without energy dispersal information, and even when they have it staring them in the face, they equivocate the usage of the data.

    But if you have any specific question, even issues that they raise, I can attempt an answer. But unless you or Walto want me to address something they raise, they will be ignored. I find it pointless to argue with these self-appointed experts of entropy who don’t even bother trying to do calculations and who persistently disagree with professional thermodynamicists like Dr. Mike who used to participate here. Added to that, Keiths made an elementary error with this statement for which I have not seen any retraction:

    Energy is evenly distributed throughout both volumes both before and after mixing.

    Only in a specific case, but not true in general. In other words, uninformed baloney. That will not hold for the mixing of diatomic and monoatomic gases since the energy will be different before mixing in such cases. DNA_jock ignorantly said Keiths is correct. I really don’t have much desire to trifle with such self-appointed experts who can’t grasp such basic things and just spew and bloviate for hundreds of comments.

    Thank you for the Einstein quote.

    The Clausius version of entropy is the most common denominator and does not require information theory to apply. It is the easiest to measure and apply, imho; even though it may not be as comprehensive a theory, it works from an empirical standpoint. The Clausius version can often work even without the assumption of the existence of atoms.

    Sorry you have to see a less diplomatic side of me, but I get tired of dealing with certain individuals at TSZ.

  35. Keiths from one of his first comments in this thread:

    Energy is evenly distributed throughout both volumes both before and after mixing.

    As pointed out, Keiths made a terrible blunder. This will not hold if one gas is monoatomic and another is diatomic because the diatomic gas has more internal energy per mole.

    Incidentally, the differing internal energies are indicated by the fact that diatomic and polyatomic gases have higher specific heats than monoatomic gases.

    There is an important subtlety to note.

    Even though diatomic and monoatomic gases have different internal energies, to the extent they still approximate ideal gases which obey:

    PV = nRT

    the Q-reversible numbers will be identical for monoatomic and diatomic gases in the case of the isothermal free expansion that is used to construct the mixing entropy. So in the case of mixing entropy, there is no need to account for the gases being monoatomic or diatomic or polyatomic as long as they are sufficiently well approximated by the ideal gas law.

    Nevertheless this statement by Keiths is wrong, and it shows a lack of understanding. The fact that DNA_jock said “Keiths is correct” also shows a lack of understanding on DNA_jock’s part.

    Energy is evenly distributed throughout both volumes both before and after mixing.

    Baloney for mixing monoatomic and diatomic gases since they have different internal energies before mixing.

    See:
    http://hyperphysics.phy-astr.gsu.edu/hbase/kinetic/shegas.html

    Selected Specific Heats

    The models of constant-volume specific heat based on equipartition of energy and including rotational degrees of freedom as well as translational are able to explain specific heats for diatomic molecules. The departure from this model in the case of polyatomic molecules indicates vibrational involvement.
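
    A minimal sketch of the equipartition point being made here (ignoring vibrational modes; helium and nitrogen are just example gases):

    R = 8.314462618   # gas constant, J/(mol*K)
    T = 300.0         # K

    # Equipartition: (f/2)*R*T of internal energy per mole, with f = active degrees of freedom
    U_monatomic = (3 / 2) * R * T   # e.g. helium, ~3742 J/mol
    U_diatomic = (5 / 2) * R * T    # e.g. nitrogen, ~6236 J/mol

    print(U_monatomic, U_diatomic)
    # Same T, same P, same molar volume under the ideal gas law, yet different internal
    # energy per unit volume before mixing.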

  36. Sal has claimed that the reason he posts here is to hone his pedagogical skills for the teaching of young creationists.
    Does anyone find it strange then, that when he is faced with an argument he doesn’t know how to respond to, his reaction is to stick his fingers in his ears and chant “I can’t hear you!”? One would have thought that that isn’t…
    Oh.
    Nevermind.
    I’ll get me coat.

  37. DNA_Jock:
    Hi Sal,
    Entropy is a state function.
    Take two identical two-chambered vessels, with barely distinguishable gases on each side.
    For one vessel, remove the partition and allow the gases in the two chambers to mix thoroughly.
    We all agree that the entropy has increased by ~ N kB ln(2).
    Now, pump ALL of the energy out of both vessels.
    Now the “energy dispersal” in the two vessels is IDENTICAL.
    But the entropy still differs by N kB ln(2).

    Sal, would you care to respond to this? Also to Jock’s last post regarding (IIRC) copper? (If you don’t have it and can’t get to it, l can find it and repost it.)

  38. DNA_Jock:

    Sal has claimed that the reason he posts here is to hone his pedagogical skills for the teaching of young creationists.
    Does anyone find it strange then, that when he is faced with an argument he doesn’t know how to respond to, his reaction is to stick his fingers in his ears and chant “I can’t hear you!”?

    I suspect he’s just pretending to stick his fingers in his ears.

    He can’t figure this stuff out on his own, so he comes to TSZ and gets taught by people who understand it far better than he does. If he pretends to have us on ignore, he can learn from us without admitting it or acknowledging his errors.

    It’s pretty lame. Even colewd ought to be able to see through it.

    Hey colewd,

    How about posting this comment of mine for Sal to answer?

  39. Meanwhile, Sal is latching onto diatomic gases, hoping they will rescue him from his predicament.

    They can’t.

    If entropy were a measure of energy dispersal, then wherever entropy increased, energy dispersal would also necessarily increase, whether the gases involved were monatomic or diatomic.

    It doesn’t, and even Sal can see it:

    If Gas A is helium and Gas B is neon, does the entropy increase after they mix? Answer: Yes.

    Is there a change in average energy per unit volume in the system even though there is a change in entropy? Answer: No.

  40. Walto:

    Sal, would you care to respond to this? Also to Jock’s last post regarding (IIRC) copper? (If you don’t have it and can’t get to it, l can find it and repost it.)

    Regarding the specific heat of copper, a good enough value can be derived by a college science experiment to that effect. It may be instructive to see the procedures.
    http://srjcstaff.santarosa.edu/~lwillia2/41/41copperlab.pdf

    The laboratory apparatus:

    Computer, LoggerPro software, Vernier LabPro, temperature probe, calorimeter with stirrer, electronic scale, Copper shots, hot plate, aluminum boiler and sample container, and digital thermometer. Ice bucket

    Ok, that said. Regarding how the absolute entropy of copper was estimated, I actually don’t have exact data, but I can offer a guess.

    We can in principle cool the copper to really really low temperatures and then heat it up. Using the Clausius integral, we can then affix an entropy figure at every temperature as we heat it up. It’s not that hard. As the wiki entry said

    However, this presupposes that the material forms a ‘perfect crystal’ without any frozen in entropy (defects, dislocations), which is never completely true because crystals always grow at a finite temperature. However this residual entropy is often quite negligible.

    https://en.wikipedia.org/wiki/Standard_molar_entropy

    At some point, there won’t be much gained by getting that much closer to absolute zero, so at some point an educated estimate will probably suffice.

    As far as chemical applications go, we can achieve a good-enough standard figure.
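
    Here is a minimal sketch of that heat-it-up-and-integrate procedure (Python); the Debye model with a Debye temperature of roughly 343 K stands in for real heat-capacity data, so the answer only lands in the rough neighborhood of the tabulated standard molar entropy of copper, about 33 J/(mol K):

```python
import math

R = 8.314          # J/(mol*K)
THETA_D = 343.0    # approximate Debye temperature of copper, K (assumed)

def cv_debye(T, n_grid=400):
    """Debye molar heat capacity C_V(T), by midpoint-rule quadrature."""
    if T <= 0.0:
        return 0.0
    x_max = THETA_D / T
    dx = x_max / n_grid
    total = 0.0
    for i in range(n_grid):
        x = (i + 0.5) * dx
        total += x**4 * math.exp(x) / (math.exp(x) - 1.0) ** 2 * dx
    return 9.0 * R * (T / THETA_D) ** 3 * total

def entropy_from_zero(T_final, T_start=1.0, steps=600):
    """Clausius integral S(T) ~ sum of C_V(T)/T dT from near 0 K up to T_final."""
    dT = (T_final - T_start) / steps
    S = 0.0
    for i in range(steps):
        T = T_start + (i + 0.5) * dT
        S += cv_debye(T) / T * dT
    return S

print(entropy_from_zero(298.15))   # ~30 J/(mol*K): the lattice part of copper's entropy
```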

    Sooner or later DNA_jock will find questions to which I have no answer, and I take offense at being led down rabbit trails just to show that I don't know something; that's easy to do because I know so very little. At issue are Keiths' statements regarding energy dispersal or spreading.

    Prior to Boltzmann, indeed even prior to Clausius, specific heats could be determined experimentally. Published specific heats for solids are valid over certain temperature ranges.

    For copper I got the figure of .39 from:
    http://www.engineeringtoolbox.com/specific-heat-metals-d_152.html

    Copper: 0.39 kJ/(kg·K)

    which is equal to 0.39 J/(g·K).

    DNA_jock asked whether this is empirically determined. The answer is that it can be, and rather easily, as shown above.

    With thermometers we can estimate the amount of energy a heat source generates. Shown below is one such device, from 1783:

    https://en.wikipedia.org/wiki/Calorimeter

    Now all we have to do is measure how much heat is added to a quantity of copper, note how much the temperature changes as we add it, and we can compute a specific heat. It doesn't take impenetrable math to carry out the experiment. In the time of Clausius people were able to compute specific heats of various substances (especially water), and they were able to compute entropy without all the sophistication of Boltzmann, Gibbs, and, later, Shannon.

    I ignored DNA_jock because I found the line of questioning about specific heats annoying, trivial, and irrelevant. If he objects to the numbers I put forward, he can provide his own and show us how he calculates the change in entropy of a 1555-gram copper block going from 298 K to 373 K, or whatever, and show me up rather than try to derail the conversation. I resent the rabbit trail.
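
    For the record, that calculation is simple enough. A minimal sketch (Python), assuming the specific heat stays at the constant 0.39 J/(g·K) over the whole 298–373 K range:

```python
import math

m = 1555.0              # grams of copper
c = 0.39                # J/(g*K), assumed constant over the range
T1, T2 = 298.0, 373.0   # K

# Heat absorbed during the warming (for reference): q = m * c * (T2 - T1)
q = m * c * (T2 - T1)                 # ~45,500 J

# Clausius entropy change for reversible heating with constant c:
# dS = dQ_rev / T = m * c * dT / T, so delta_S = m * c * ln(T2 / T1)
delta_S = m * c * math.log(T2 / T1)
print(round(q), round(delta_S, 1))    # ~45484 J and ~136.1 J/K
```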

    Take two identical two-chambered vessels, with barely distinguishable gases on each side.
    For one vessel, remove the partition and allow the gases in the two chambers to mix thoroughly.
    We all agree that the entropy has increased by ~ N kB ln(2).
    Now, pump ALL of the energy out of both vessels.
    Now the “energy dispersal” in the two vessels is IDENTICAL.
    But the entropy still differs by N kB ln(2).

    Another rabbit trail and more confusion. First off, it is 2 N kB ln(2) for two gases; he can't even get his formulas straight!

    How does DNA_jock define barely distinguishable? Unless we're talking gases that are already mixed (like air), that's kind of a joke. Is hydrogen barely distinguishable from CO2? How about DNA_jock gives an example of barely distinguishable? Are we talking simply about a lack of experimental apparatus, or about distinguishability in principle (which is then pretty much meaningless)?

    The line of questioning, imho, is designed to trip up and confuse the issue.

    Let me know if I can elaborate more on what I said in my response.

    DNA_jock and Keiths can now defend this silly claim:

    Energy is evenly distributed throughout both volumes both before and after mixing.

    I’ve shown that is not always true. And Keiths' notion of energy dispersal is an equivocation and a redefinition of the energy dispersal that is embedded in the Clausius definition of entropy. Q-reversible is the energy dispersal. I also provided alternative definitions of energy dispersal earlier that give the same results.

    But I've not seen Keiths admit his mistake, nor DNA_jock either. I didn't harp on it until now because I wanted to provide more comments to hopefully educate the readers, such as calculating the entropy change of a melting ice cube, which Keiths and DNA_jock can't seem to do, since they can't describe the uncertainty and ignorance without first using the energy-dispersal (a.k.a. Q-reversible) parameters!

    FWIW,

    A 20-gram ice cube melting has an entropy change of:

    6660 J / 273 K = 24.39 J/K

    Dividing by kB: 24.39 J/K / (1.381 x 10^-23 J/K) = 1.7665 x 10^24 nats, or

    1.7665 x 10^24 x (1/ln(2) Shannon bits per nat) = 2.5485 x 10^24 Shannon bits

    So the change in entropy of a melting ice cube is an increase of “ignorance” or a decrease in “information” by 2.5485 x 10^24 bits.
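
    The same arithmetic as a minimal Python sketch (the 6660 J comes from the latent-heat figure used above, roughly 333 J/g times 20 g):

```python
import math

q_rev = 6660.0      # J of heat absorbed by the 20 g ice cube at its melting point
T = 273.0           # K
kB = 1.381e-23      # Boltzmann constant, J/K

dS = q_rev / T                      # ~24.39 J/K, the Clausius entropy change
dS_nats = dS / kB                   # ~1.7665e24 nats (dimensionless)
dS_bits = dS_nats / math.log(2)     # ~2.5485e24 Shannon bits
print(dS, dS_nats, dS_bits)
```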

    But this is just rescaling numbers and calling the result "bits"! There is nothing profound here; it only looks profound to the uninitiated. It's a convoluted way of looking at entropy.

    Furthermore, Keiths can't even get to that number of information bits without using the energy-dispersal parameter (a Q-reversible of 6660 joules). This deepity-woo, information-theoretic approach doesn't add much insight for most engineering and chemistry applications; it just adds confusion and a gratuitous injection of information theory and mystical woo.

    If Keiths can calculate the entropy change of a melting ice cube without reference to Q-reversible (the energy dispersed into the ice), let him. Let him show how he derives his ignorance computation without reference to energy dispersal. I was able to compute his ignorance and provide the number of Shannon bits because I computed the entropy first using energy dispersal. He can't seem to do the same; all he has are claims of ignorance, for which he must appeal to non-existent ignorance meters.

    I on the other hand can appeal to data from thermometers and calorimeters.

  41. Salvador asks how one can measure “information.” Ask Claude Shannon, Sal.

    Salvador asks how one can measure “missing information.” Ask Claude Shannon, Sal.

    I toss a fair coin and you don’t know whether it landed heads up or tails up. Can we quantify the amount of “missing information”? Yes, we can.

    Toss three coins. Can we quantify the amount of “missing information”? Yes, we can.

    Each coin: p(H) = p(T) = 1/2
    Possible outcomes: HHH, HHT, HTH, HTT, THH, THT, TTH, TTT
    Missing information: log2(8) = 3 bits

    By the way, I’d like to know what this calculation has to do with the number of macrostates. Anyone?
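
    To make the three-coin count explicit, a minimal sketch (Python):

```python
import math
from itertools import product

# All equally likely outcomes of tossing three fair coins.
outcomes = list(product("HT", repeat=3))    # ('H','H','H'), ('H','H','T'), ..., ('T','T','T')
p = 1.0 / len(outcomes)                     # 1/8 for each sequence

# Shannon missing information: H = -sum over outcomes of p * log2(p).
H = -sum(p * math.log2(p) for _ in outcomes)
print(len(outcomes), H)                     # 8 outcomes, 3.0 bits
```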

  42. …this presupposes that the material forms a ‘perfect crystal’ without any frozen in entropy (defects, dislocations)…

    LoL. Frozen in entropy. I love it!

  43. Walto, thanks for the help getting Sal to unplug his ears, if only partially and temporarily.
    So Sal realizes that the specific heats he uses in all his Clausius entropy calculations are experimentally derived. One step closer to realizing that Clausius, like Carnot before him, is pure phenomenology. He needs Boltzmann/Gibbs to actually explain what’s going on. As Rutherford would put it, he’s indulging in “Stamp Collecting”!
    Sal continues with the GSWs to the foot.

    How does DNA_jock define barely distinguishable? Unless we're talking gases that are already mixed (like air), that's kind of a joke. Is hydrogen barely distinguishable from CO2? How about DNA_jock gives an example of barely distinguishable? Are we talking simply about a lack of experimental apparatus, or about distinguishability in principle (which is then pretty much meaningless)?

    Helium and Neon, or Oxygen and Nitrogen.
    Defining ‘barely distinguishable’ is a problem for Sal, as he still needs to explain WHY the entropy of mixing is always N kB ln(2) [where N is the total number of molecules, twit], except when the gases are indistinguishable, when it magically drops to zero. (As I have noted earlier, Lambert and Elzinga are fully aware of why this is the case, but Sal has yet to find the get-out-of-jail card… until then, there is no reason to believe he understands anything about TD.)
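
    A minimal sketch of the numbers being argued over (Python); the zero-for-identical-gases case is simply asserted here, exactly as in the comment, since the "why" is the point in dispute:

```python
import math

kB = 1.381e-23   # Boltzmann constant, J/K

def mixing_entropy(N, distinguishable=True):
    """Entropy of mixing two equal volumes holding equal amounts of gas, N molecules total.

    Distinguishable ideal gases (He/Ne, O2/N2, ...): delta_S = N * kB * ln 2.
    Identical gases: removing the partition changes nothing macroscopic, delta_S = 0.
    """
    return N * kB * math.log(2) if distinguishable else 0.0

N = 2 * 6.022e23   # one mole on each side
print(mixing_entropy(N, True))    # ~11.5 J/K, the same for any pair of distinct gases
print(mixing_entropy(N, False))   # 0.0, the discontinuous drop for identical gases
```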

    The line of questioning, imho, is designed to trip up and confuse the issue.

    No, it is designed to trip up and confuse Sal. Working gangbusters, as evidenced by his refusal to explain how two containers devoid of energy can have different entropies.

    DNA_jock and Keiths can now defend this silly claim:

    Energy is evenly distributed throughout both volumes both before and after mixing.

    I’ve shown that is not always true.

    I have no desire to defend a claim I never made. Sal is absolutely correct: if we have Neon in one half and Nitrogen in the other half, then there is more energy on the Nitrogen side, whereas with Neon on both sides, the energy is evenly distributed before mixing. Clearly, the dispersal of energy is different in these two cases.
    In both cases, after mixing, the energy is evenly distributed. Therefore, per Sal-entropy, the entropy of mixing must differ between these two cases. And yet it does not.
    Oh dear, another GSW to the foot for Sal-entropy.
    [Edited to correct a typo: Neon and Helium in the second case, obviously; with Neon and Neon, delta E drops to zero…]

  44. DNA_Jock,

    Walto, thanks for the help getting Sal to unplug his ears, if only partially and temporarily.
    So Sal realizes that the specific heats he uses in all his Clausius entropy calculations are experimentally derived. One step closer to realizing that Clausius, like Carnot before him, is pure phenomenology. He needs Boltzmann/Gibbs to actually explain what’s going on.

    How does Boltzmann/Gibbs better explain what's going on?

  45. colewd,

    The more different ways you have to distribute the energy, the more likely that you will find yourself in such a macrostate. It’s merely the law of large numbers. Toss 500 coins: all 2^500 possible sequences are equally unlikely, but there are many more ways to get between 235 and 265 Heads than there are to get between 0 and 30…
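
    The counting behind that, as a minimal sketch (Python, using math.comb for the binomial coefficients):

```python
import math

n = 500
total = 2 ** n   # every individual sequence of heads/tails has probability 1/total

# Sequences with between 235 and 265 heads (inclusive), versus between 0 and 30 heads.
ways_middle = sum(math.comb(n, k) for k in range(235, 266))
ways_tail = sum(math.comb(n, k) for k in range(0, 31))

print(ways_middle / total)   # ~0.83: the fat "macrostate" you essentially always land in
print(ways_tail / total)     # roughly 1e-102: a band of outcomes you will never see
print(f"ratio ~ 10^{math.log10(ways_middle / ways_tail):.0f}")
```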

  46. DNA_Jock,

    The more different ways you have to distribute the energy, the more likely that you will find yourself in such a macrostate. It’s merely the law of large numbers. Toss 500 coins: all 2^500 possible sequences are equally unlikely, but there are many more ways to get between 235 and 265 Heads than there are to get between 0 and 30…

    What does this tell us practically? You can get a probability distribution over the possible states of a gas, but what do you do with that number?

  47. It explains WHY “Heat cannot, of itself, pass from one body to a hotter body”

    Sure, for many applications, Clausius’s equations are easier to use. Just as Newton’s equations are easier to use than Einstein’s. Just be sure that you understand the underlying physics sufficiently to know when the approximation becomes inadequate.
    As I alluded to earlier, Newtonian mechanics was sufficiently good for the Apollo missions, but not for GPS.

  48. DNA_Jock,

    It explains WHY “Heat cannot, of itself, pass from one body to a hotter body”

    How does it explain this?

    Einstein's GR equations did predict how a particle of light (a photon) would follow a curved path in the presence of a large mass (the sun). The equations were also experimentally validated.

    I am still struggling with what problem is solved by describing all the possible microstates of a gas.
