In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M. Lehninger Principles of Biochemistry (p. 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue with his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (small shoe size suggests we’re dealing with a toddler, and therefore a non-reader), yet shoe size is not reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger and Moran’s books aren’t on that list. Their books, however, did make the list of biochem books judged by some Australians as decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates
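
Just to make the formula concrete, here is a minimal Python sketch with a made-up microstate count (a toy illustration, not a measurement):

    import math

    k_B = 1.380649e-23          # Boltzmann's constant, J/K
    W = 2 ** 100                # hypothetical count of equally probable microstates

    S = k_B * math.log(W)       # S = k ln W (the "log" in Planck's formula is the natural log)
    print(S)                    # about 9.6e-22 J/K -- thermodynamically negligible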

Also there is Clausius:

delta-S = Integral (dq_rev / T)

where
delta-S = change in entropy
dq_rev = inexact differential of heat exchanged along a reversible path
T = absolute temperature
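
And here is a minimal, idealized Python sketch of the Clausius form in use: reversibly heating 1 mol of liquid water from room temperature to boiling, assuming a constant heat capacity of about 75.3 J/(mol*K). Notice that nothing in the calculation refers to order, disorder, or information.

    import math

    n  = 1.0                 # mol of water (assumed)
    Cp = 75.3                # J/(mol*K), molar heat capacity of liquid water (assumed constant)
    T1, T2 = 298.0, 373.0    # K, room temperature to boiling

    # Clausius: delta-S = Integral(dq_rev / T), with dq_rev = n * Cp * dT at constant pressure
    steps = 100000
    dT = (T2 - T1) / steps
    dS_numeric = sum(n * Cp * dT / (T1 + (i + 0.5) * dT) for i in range(steps))

    # Closed form when Cp is constant: delta-S = n * Cp * ln(T2 / T1)
    dS_closed = n * Cp * math.log(T2 / T1)

    print(dS_numeric, dS_closed)    # both about 16.9 J/K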

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. Mung,

    In modern thermodynamics one summarizes the Second Law with the statement that there exists a state function, denoted as S and referred to as entropy, which in any spontaneous process occurring in an isolated system always increases. A state function means that when the thermodynamic state is defined – say, by giving the temperature, pressure and composition – the entropy of the system is also defined.

    Are temperature, pressure, and composition included in the Boltzmann equation? If so, how are they determined? If not, is the Boltzmann equation a correct definition of entropy?

  2. colewd: A state function means that when the thermodynamic state is defined – say, by giving the temperature, pressure and composition – the entropy of the system is also defined.

    That’s key.

    1. Is thermodynamic entropy a subset-of a (Shannon-style) information paucity?–Yes
    2. But is every Shannon-style information paucity a calculation of thermodynamic entropy?–No

    Seems largely a quibble to me, but at least one person on this thread seems to resist 1, and at least one other person on this thread clearly resists 2. Why? Who the hell knows.

  3. That’s key.

    1. Is thermodynamic entropy a subset-of a (Shannon-style) information paucity?–Yes
    2. But is every Shannon-style information paucity a calculation of thermodynamic entropy?–No

    Agree.

    Seems largely a quibble to me, but at least one person on this thread seems to resist 1, and at least one other person on this thread clearly resists 2. Why? Who the hell knows.

    Hope I wasn’t mistaken for any of the referenced individuals. I agree with your assessment, and my posts and comments have reflected that agreement, but maybe not exactly in those words.

    FWIW, my views led to me having sharp disagreement with many in the ID community who constantly conflate all Shannon entropies with thermodynamic entropies.

    The point of the OP was to say the main proponent of this conflation, Granville Sewell, gets a slight defense from me, not because he is right, but his mistake was the result of textbook authors like Lehninger and our very own Larry Moran perpetuating the “entropy is disorder” definition. This is unfortunately due to Boltzmann himself and even to Josiah Gibbs who used MixedUpNess to describe entropy! As physical theory matured, those early notions by Boltzmann and Gibbs (who were pioneers of the discipline of statistical mechanics) were dispensed with.

    The ID community unfortunately tied Boltzmann and Shannon information and hence thermodynamic entropy to Dembski’s specified complexity.

    Even if the Clausius version of entropy might not be as general as Boltzmann’s, it is the most dominant definition in practice. No one really has microstate-meters to measure Boltzmann microstates (and hence Boltzmann entropy), but we do have thermometers and calorimeters to measure Clausius entropy.

    We accept on theory that Clausius entropy is Boltzmann entropy. I don’t know that we have an actual experiment showing this since we don’t really have microstate-meters!

  4. stcordova: FWIW, my views led to me having sharp disagreement with many in the ID community who constantly conflate all Shannon entropies with thermodynamic entropies.

    LoL.

  5. stcordova,

    Even if the Clausius version of entropy might not be as general as Boltzmann’s, it is the most dominant definition in practice. No one really has microstate-meters to measure Boltzmann microstates (and hence Boltzmann entropy), but we do have thermometers and calorimeters to measure Clausius entropy.

    We accept on theory that Clausius entropy is Boltzmann entropy. I don’t know that we have an actual experiment showing this since we don’t really have microstate-meters!

    I am coming to the conclusion that the Boltzmann equation has been a big misleading distraction for physics and cosmology. If we can’t validate it experimentally, then it is conceptual only and not a real hypothesis or definition. Einstein’s criticism is validated.

    Let’s take a pot of water on the stove before heat is added and measure the temperature in 10 spots top to bottom and side to side. The temperature is uniform. Then we turn on the heat and make the same 10 measurements. This is the point of the most variation from lowest temperature to highest. Then after 20 minutes when the water is boiling the temperature is uniform again. Is there a change in entropy during this process? What is it?

    I am coming to the conclusion that the Boltzmann equation has been a big misleading distraction for physics and cosmology. If we can’t validate it experimentally, then it is conceptual only and not a real hypothesis or definition. Einstein’s criticism is validated.

    I suspect the situation is a little more nuanced. Statistical mechanics on the whole was important because at the time of Boltzmann and Gibbs, there was still an active debate about whether atoms existed!

    Boltzmann and Gibbs essentially predicted that heat and temperature could be described in terms of atoms behaving like billiard balls that obeyed Newtonian mechanics. So:

    S = kB ln W

    was somewhat an argument in favor of atomic theory. The existence of atoms was then vindication of the basic idea, even though the formula in and of itself is rather too general to be of much use unless it is tied to Clausius.

    Einstein himself was a pioneer of more advanced statistical mechanics, and one class of particles bears his name: particles that obey Bose-Einstein statistics, like the Higgs boson.

    For the boiling water scenarios, I’d have to dust off some books to give you numerical examples, but I’m confident they can be provided. If I have time, I may work through some sample problems. It should not be that hard.

  7. stcordova: The ID community unfortunately tied Boltzmann and Shannon information and hence thermodynamic entropy to Dembski’s specified complexity.

    Yet Salvador is the only IDist I’ve seen recommending coin tossing and also calculating the thermodynamic entropy for coins.

  8. Mung: Yet Salvador is the only IDist I’ve seen recommending coin tossing and also calculating the thermodynamic entropy for coins.

    Mung, if you keep bashing Sal, you’ll have to stay outside with keiths in the no pbj area.

  9. Hi walto,

    I do love me some pbj. A co-worker has promised me some homemade jam.

    But I’m still confused as to whether Salvador has always agreed and only appeared to be arguing to the contrary throughout this thread, or whether he’s perhaps had a talk with some other people he respects and has had a change of heart.

    Will he now be taking DNA_jock and keiths off ignore? Has he been agreeing with them all along?

  10. Well, I’ve been enlightened myself during this thread, so maybe Sal’s views have similarly evolved. I think keiths remains the only outlier. He would not chant the lines I gave him. Too wedded to his Damon, I think.

  11. : Lectures on Gas Theory
    : Ludwig Boltzmann
    : 1896

    The reverse transition has a definite calculable (though inconceivably small) probability, which approaches zero only in the limiting case when the number of molecules is infinite. The fact that a closed system of a finite number of molecules, when it is initially in an ordered state and then goes over to a disordered state, finally after an inconceivably long time must again return to the ordered state, it is therefore not a refutation but rather indeed a confirmation of our theory. One should not however imagine that two gases in a 1/10 liter container, initially unmixed, will mix, then again after a few days separate, then mix again, and so forth. On the contrary, one finds by the same principles which I used for a similar calculation that not until after a time enormously long compared to 10^10^10 years will there be any noticeable unmixing of the gases. One may recognize that this is practically equivalent to never, if one recalls that in this length of time, according to the laws of probability, there will have been many years in which every inhabitant of a large country committed suicide, purely by accident, on the same day, or every building burned down at the same time, yet the insurance companies get along quite well by ignoring the possibility of such events. If a much smaller probability than this is not practically equivalent to impossibility, then no one can be sure that today will be followed by a night and then a day.

    In any case, we would rather consider the unique directionality of time given to us by experience as a mere illusion arising from our specially restricted viewpoint.

  12. Sadly, given the almost infinite number of ways in which I can be ignorant, deviations from the equilibrium state are far more likely to go unnoticed than to be noticed.

  13. stcordova,

    For the boiling water scenarios, I’d have to dust off some books to give you numerical examples, but I’m confident they can be provided. If I have time, I may work through some sample problems. It should not be that hard.

    Assuming we have a constant heat source, this should be something we could estimate off the top of our heads if the definition of entropy were clear. We have this major law in physics that famous cosmologists are using to make theories about black holes and the universe. Does anyone else see a problem here? What is this second fairytale of thermodynamics? How did the science get so screwed up?

  14. colewd: We have this major law in physics that famous cosmologists are using to make theories about black holes and the universe.

    Yes, even top physicists can say silly things. It doesn’t follow that there is no second law or that the second law is incomprehensible. It just means that scientists ought to avoid wild extrapolations for which there is no basis.

  15. Mung,

    Yes, even top physicists can say silly things. It doesn’t follow that there is no second law or that the second law is incomprehensible. It just means that scientists ought to avoid wild extrapolations for which there is no basis.

    I agree with you, but when the wild extrapolations become settled theory that people use to measure the entropy of a black hole, we end up with science being built on a very shaky foundation.

    I think there is a very important discipline called the scientific method, which requires repeatable, measured results before a theory is considered complete; it has been missing in science and is leading to scientism.

    If we limit entropy to a definition that can be validated by experiment, what is the definition?

  16. colewd,

    Assuming we have a constant heat source, this should be something we could estimate off the top of our heads if the definition of entropy were clear. We have this major law in physics that famous cosmologists are using to make theories about black holes and the universe. Does anyone else see a problem here? What is this second fairytale of thermodynamics? How did the science get so screwed up?

    The science is fine. It’s your understanding of the science that is screwed up.

  17. walto,

    I’m interested in your response to this:

    walto,

    Your position is inconsistent. You’ve written:

    I think there are, as Jaynes says, many different correct values for entropy. This depends on what variables are considered relevant, what is being looked for.

    That fits nicely with the Xavier/Yolanda situation. Isotopic concentration is relevant for her but not for him. They see different correct entropy values.

    You contradict that by writing:

    They may both be “correct” as far as they go, but one is more ACCURATE than the other; thus, in another sense closer to being “correct.”

    When it comes to thermodynamics, Yolanda is “getting there” as compared with Xavier; Damon may have passed the mark.

    Which is it?

    Hint: Jaynes is correct.

  18. walto,

    So you agree that this statement of yours is incorrect?

    They may both be “correct” as far as they go, but one is more ACCURATE than the other; thus, in another sense closer to being “correct.”

    When it comes to thermodynamics, Yolanda is “getting there” as compared with Xavier; Damon may have passed the mark.

  19. colewd: If we limit entropy to a definition that can be validated by experiment, what is the definition?

    Clausius, Boltzmann, and information-theoretic. All agree with the experimental results.

  20. keiths, entropy is not something that is known by observing a specific microstate, and the specific microstate is not something that can be known by observing the macroscopic variables. Perhaps we can all agree to refer to DamonEntropy as entropy* or dementropy.

    ETA: Neither of which ought to be confused with thermodynamic entropy.

  21. Mung:

    keiths, entropy is not something that is known by observing a specific microstate, and the specific microstate is not something that can be known by observing the macroscopic variables.

    Entropy is a measure of the missing information — the information gap — between the macrostate and the exact microstate. The macrostate is just a summary of what is known about the microstate. For Xavier and Yolanda, the gap is large. For Damon, there is no gap. No missing information, zero entropy.
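
    A toy sketch of that information gap, with made-up numbers (an illustration of the counting idea only, not a thermodynamic measurement): entropy goes as the log of the number of microstates still compatible with what the observer knows, and it drops to zero when the exact microstate is known.

        import math
        from itertools import product

        k_B = 1.380649e-23   # J/K

        # Toy system: 4 two-state particles. A "macrostate" is a constraint on the microstates.
        microstates = list(product([0, 1], repeat=4))

        def entropy(consistent):
            # consistent: predicate selecting microstates compatible with what the observer knows
            W = sum(1 for m in microstates if consistent(m))
            return k_B * math.log(W)

        S_coarse = entropy(lambda m: sum(m) == 2)        # knows only "two of the four are excited": W = 6
        S_exact  = entropy(lambda m: m == (0, 1, 1, 0))  # knows the exact microstate ("Damon"): W = 1

        print(S_coarse, S_exact)   # ~2.5e-23 J/K and exactly 0.0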

    Let’s take a pot of water on the stove before heat is added and measure the temperature in 10 spots top to bottom and side to side. The temperature is uniform. Then we turn on the heat and make the same 10 measurements. This is the point of the most variation from lowest temperature to highest. Then after 20 minutes when the water is boiling the temperature is uniform again. Is there a change in entropy during this process? What is it?

    If the temperature is uniform before heating, and the temperature is room temperature (298K), and we are dealing with 1 mole of water, the standard entropy according to NIST and the Shomate Equation is:

    69.95 J/K

    At 20 minutes when the water reaches boiling temperature (assuming we still have 1 mol of water), the entropy is

    86.85 J/K

    Now the difficulty is when the temperature is not uniform. In this phase one has to integrate the entropies at various temperatures at all locations in the kettle. This is not so easy. We can estimate the answer will fall between

    69.95 J/K and 86.85 J/K

    A rough guess is to slice the water into levels and estimate the entropy for each level for the given temperature measurement associated with that level and add up the entropies. This will be a royal pain, but it will yield a reasonable estimate.
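
    Here is a minimal Python sketch of that slicing estimate, assuming 10 equal-mass layers totaling 1 mol, hypothetical layer temperatures, a constant heat capacity of about 75.3 J/(mol*K), and S(T) approximated from the 298 K reference value:

        import math

        Cp    = 75.3    # J/(mol*K), molar heat capacity of liquid water (assumed constant)
        S_ref = 69.95   # J/(mol*K), molar entropy of liquid water at 298 K (NIST)
        T_ref = 298.0   # K

        # Hypothetical readings (K) for 10 equal-mass layers, 0.1 mol of water each
        layer_temps = [371, 366, 360, 352, 345, 338, 331, 324, 317, 310]
        n_layer = 0.1   # mol per layer

        def molar_entropy(T):
            # S(T) ~ S_ref + Cp * ln(T / T_ref) for liquid water at constant pressure
            return S_ref + Cp * math.log(T / T_ref)

        S_total = sum(n_layer * molar_entropy(T) for T in layer_temps)
        print(S_total)   # about 80 J/K -- between the uniform 298 K and uniform 373 K values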

    Is there a change in entropy during this process? What is it?

    If we are dealing with 1 mol of water, the change in entropy by raising the temperature from room temperature (298 K) to boiling (373K) is

    86.85 J/K – 69.95 J/K = 16.90 J/K

    for 1 mol

    If one is dealing with different quantities than 1 mol, just scale the figure appropriately for the quantities above.

    See:
    http://webbook.nist.gov/cgi/cbook.cgi?ID=C7732185&Mask=2#Thermo-Condensed
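
    For anyone who wants to reproduce those figures, here is a minimal Python sketch of the NIST Shomate entropy for liquid water; the coefficients are the ones listed on the linked WebBook page for the condensed phase (roughly 298–500 K) and should be double-checked against that page:

        import math

        # Shomate coefficients for liquid water from the NIST WebBook (condensed phase, ~298-500 K)
        A, B, C, D, E, G = -203.6060, 1523.290, -3196.413, 2474.455, 3.855326, -488.7163

        def S_shomate(T_kelvin):
            t = T_kelvin / 1000.0
            # S = A*ln(t) + B*t + C*t^2/2 + D*t^3/3 - E/(2*t^2) + G   [J/(mol*K)]
            return A * math.log(t) + B * t + C * t**2 / 2 + D * t**3 / 3 - E / (2 * t**2) + G

        print(S_shomate(298.15))   # ~69.9 J/(mol*K)
        print(S_shomate(373.0))    # ~86.9 J/(mol*K)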

  23. Sal,

    The point of the OP was to say the main proponent of this conflation, Granville Sewell, gets a slight defense from me, not because he is right, but his mistake was the result of textbook authors like Lehninger and our very own Larry Moran perpetuating the “entropy is disorder” definition.

    Now you’re repeating Granville’s mistake, but with a different misconception: entropy as energy dispersal.

    Having dismissed “entropy as disorder” in the OP with words such as “Gag!” and “Choke!”, you’ve made it difficult for yourself to admit that you are also “gagging” and “choking” by adopting the “entropy as energy dispersal” misconception.

  24. I would like for Salvador to address cases where he thinks the order/disorder metaphor fails, since he seems to think so poorly of it.

    Then I would like for Salvador to address how the energy dispersal metaphor works in those cases, since he seems to think so highly of it.

    Finally, I would like to see Salvador apply the SMI approach to each of those cases and show how it fails.

  25. stcordova: The point of the OP was to say the main proponent of this conflation, Granville Sewell, gets a slight defense from me, not because he is right, but his mistake was the result of textbook authors like Lehninger and our very own Larry Moran perpetuating the “entropy is disorder” definition. This is unfortunately due to Boltzmann himself and even to Josiah Gibbs who used MixedUpNess to describe entropy! As physical theory matured, those early notions by Boltzmann and Gibbs (who were pioneers of the discipline of statistical mechanics) were dispensed with.

    I seriously doubt that Granville got his ideas about entropy from biochem textbooks. As you say, the “disorder” idea can be traced all the way back to Boltzmann and Gibbs, if not earlier. And your own argument belies your claim that “those early notions … were dispensed with.”

    No pbjs for me today. 🙁

  26. stcordova,

    Thanks Sal

    S° = A*ln(t) + B*t + C*t²/2 + D*t³/3 − E/(2*t²) + G

    This is how entropy is calculated with the Shomate equation. How would you calculate it with the Boltzmann equation?

  27. Mung,

    Clausius, Boltzmann, and information-theoretic. All agree with the experimental results.

    Can you calculate the problem I gave Sal with Boltzmann? Does it agree with Sal’s calculation?

  28. Mung,

    ETA: Neither of which ought to be confused with thermodynamic entropy.

    I think we agree. Sorry I missed this. 🙂

    Look, “disorder” is kind of another way of saying “lack of ordering information.” How disordered some group of molecules (or playing cards) is is a function of a particular definition of “order.” So when we say the cards are disordered, we’re saying that we can’t say precisely where the jack of diamonds is. But there are many types of “order” (and even more types of “information”). So the Lambertians have said something like, “What we’re REALLY interested in is a measure of energy dispersal: that is the sum and substance of the missing information we care about.” I take it that may be too particular to cover all of thermodynamics.

    But you can see why the whole issue seems like a quibble to me. Why should we care about whether it’s called “disorder” “dispersion” or “missing information”? So far as I can tell, the only problem with dispersion is that it’s too narrow, with information that it’s too broad, and with disorder that it’s too vague.

    Who cares?

  30. walto,

    Look, “disorder” is kind of another way of saying “lack of ordering information.”

    Entropy is not a “lack of ordering information.” It’s the missing information needed to pin down the system’s microstate.

    But you can see why the whole issue seems like a quibble to me. Why should we care about whether it’s called “disorder” “dispersion” or “missing information”? So far as I can tell, the only problem with dispersion is that it’s too narrow, with information that it’s too broad, and with disorder that it’s too vague.

    Who cares?

    No, the problem is that entropy is not a measure of disorder, and it’s not a measure of energy dispersal. It is a measure of missing information — the information gap between the macrostate and the microstate.

    Truth matters in science. You think we should split the difference and go have some pbj’s, but that’s not how science works.

  31. walto,

    I’m still awaiting an example — from you or anyone else — of a scenario in which the “missing information” view of entropy fails.

    We know that “entropy as disorder” fails, as does “entropy as energy dispersal”. Why wouldn’t we prefer a conception of entropy that actually works?

  32. keiths: You think we should split the difference and go have some pbj’s, but that’s not how science works.

    Then science needs to change.

  33. keiths:
    walto,

    Entropy is not a “lack of ordering information.” It’s the missing information needed to pin down the system’s microstate.

    No, the problem is that entropy is not a measure of disorder, and it’s not a measure of energy dispersal. It is a measure of missing information — the information gap between the macrostate and the microstate.

    Truth matters in science. You think we should split the difference and go have some pbj’s, but that’s not how science works.

    As indicated about 50 times already, the problem with the information paucity position is (only) that “information about the microstates” is broader than thermodynamic information. This can be seen best, perhaps, from your Damon example. He knows too much for his calculations to be thermodynamically useful. They’re

    Zero in the morning zero in the evening, zero at supper time. Zero when at zero, zero when boiling too.

    This isn’t a defect in the Shannon/Boltzmann explication of thermodynamic entropy in terms of missing information. That is correct. It’s a defect in your interpretation of what information is relevant.

    Go back and look at all the links I provided above regarding what are thermodynamic variables. The fact that not every variable is a thermodynamic variable is why Jaynes thinks your position is wrong.

  34. keiths,

    I’m still awaiting an example — from you or anyone else — of a scenario in which the “missing information” view of entropy fails.

    Calculate the entropy in the boiling pot like Sal did and I am all in. 🙂

  35. colewd:
    keiths,

    Calculate the entropy in the boiling pot like Sal did and I am all in. 🙂

    He’d basically be doing the same calculations after some pointless transformations.

  36. walto,

    I think Mung may have nailed it. There are 2 types of entropy.
    1. thermodynamic
    2. information

    The second law is about 1.

  37. walto,

    As indicated about 50 times already, the problem with the information paucity position is (only) that “information about the microstates” is broader than thermodynamic information.

    We’re not talking about irrelevant information about the microstates, such as “this particular microstate occurred over Hays, Kansas in 1922”. We’re talking about information that helps us narrow down the thermodynamic microstate. Such information is thermodynamic information. Isn’t that obvious?

    This can be seen best, perhaps, from your Damon example. He knows too much for his calculations to be thermodynamically useful.

    1) If you always know (and can work with) the exact microstates, as Damon does, then you don’t need the concept of entropy.

    2) Even if you did, you could always choose to ‘forget’ the extra information and do the calculations from a position of ignorance.

    3) Damon doesn’t exist, so it doesn’t matter whether entropy is useful to him. It’s useful to us.

  38. colewd:
    walto,

    I think Mung may have nailed it. There are 2 types of entropy.
    1. thermodynamic
    2. information

    The second law is about 1.

    I think it would be better to say that thermodynamic entropy is a subset of Shannon entropy. If you haven’t read Adami’s blog, he’s very good at explaining that thermodynamic entropy just IS a type of missing information.

  39. keiths: 1) If you always know (and can work with) the exact microstates, as Damon does, then you don’t need the concept of entropy.

    2) Even if you did, you could always choose to ‘forget’ the extra information and do the calculations from a position of ignorance.

    3) Damon doesn’t exist, so it doesn’t matter whether entropy is useful to him. It’s useful to us.

    We’re pretty close here. With 1, I’d say that what Damon calculates is entropy, but not thermodynamic entropy–because he knows too much. Not all variables are thermodynamic variables. And I agree with 3.

    2 is interesting. Let’s say we agree to limit the variables to those that are of interest for some particular inquiry. So we have the science that Einstein says is needed. Once we’ve done that, neither Yolanda nor Xavier will calculate the exactly right value. Damon CAN do so, but only if he, as you say “forgets” the irrelevant information available to him. And that is precisely the sense in which there is a correct value, but it’s not what any of those three calculates–unless Damon puts on his “hat of forgetfulness.”

  40. walto,

    I want to point out here that this is a very curious law, because there is, in fact, no proof for it. Really, there isn’t. Not every thermodynamics textbook is honest enough to point this out, but I have been taught this early on, because I learned Thermodynamics from the East-German edition of Landau and Lifshitz’s tome “Statistische Physik”, which is quite forthcoming about this (in the English translation):

    “At the present time, it is not certain whether the law of increase of entropy thus formulated can be derived from classical mechanics”

    This is from Adami’s explanation, and it is why I am skeptical.

  41. walto,

    Damon CAN do so, but only if he, as you say “forgets” the irrelevant information available to him. And that is precisely the sense in which there is a correct value, but it’s not what any of those three calculates–unless Damon puts on his “hat of forgetfulness.”

    You were just telling us that the thermodynamic variables have to come from an approved list in order for entropy calculations to be valid. Now you’re saying that the calculations are invalid even when the variables do come from that list. The idea that entropy cannot actually be calculated by humans is a uniquely walto idea. Physicists and chemists do it all the time.

    As for Damon, there is no single “right” amount of forgetfulness.

    Suppose for the sake of argument that Walto’s List of Approved Thermodynamic Variables were actually binding. Xavier and Yolanda specify their macrostates using parameters from that list (temperature, volume, composition). They get different answers for the entropy.

    Jaynes understands why:

    Even at the purely phenomenological level, entropy is an anthropomorphic concept. For it is a property, not of the physical system, but of the particular experiments you or I choose to perform on it.

    You still don’t get it:

    They may both be “correct” as far as they go, but one is more ACCURATE than the other; thus, in another sense closer to being “correct.”

    When it comes to thermodynamics, Yolanda is “getting there” as compared with Xavier; Damon may have passed the mark.

    Jaynes is right. Xavier and Yolanda get different answers because they perform different experiments on the system. Yolanda determines isotopic concentrations and Xavier doesn’t.

    Xavier’s macrostate is correct. He sees a certain amount of X gas, at a certain temperature, evenly distributed throughout both halves of the chamber. The partition is removed and the macrostate remains unchanged. There is still that same amount of X gas, at the same temperature, evenly distributed throughout both halves of the chamber. The entropy of the system has not changed for Xavier.

    Yolanda’s macrostate is also correct. She sees a certain amount of pure isotope X0, at a certain temperature, in one half of the chamber, and the same amount of isotope X1 at the same temperature in the other half of the chamber. The partition is removed and the macrostate changes dramatically. Once equilibrium is reached, isotopes X0 and X1 are evenly distributed throughout both halves of the chamber. The entropy of the system has increased for Yolanda, though not for Xavier.
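
    The size of Yolanda’s increase is the standard ideal entropy of mixing; a minimal sketch, assuming ideal-gas behavior and 1 mol of each isotope:

        import math

        R = 8.314462618   # J/(mol*K), gas constant
        n = 1.0           # mol of each isotope (assumed for illustration)

        # Ideal entropy of mixing, equal amounts of two distinguishable species (x0 = x1 = 0.5):
        dS_yolanda = -2 * n * R * (0.5 * math.log(0.5) + 0.5 * math.log(0.5))   # = 2 n R ln 2
        dS_xavier  = 0.0   # the two halves are indistinguishable to Xavier's measurements

        print(dS_yolanda, dS_xavier)   # about 11.5 J/K versus 0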

  42. walto,

    I think it would be better to say that thermodynamic entropy is a subset of Shannon entropy.

    This sounds reasonable, but to measure thermodynamic entropy you need information that is more detailed than just the number of bits. Bekenstein uses bits to calculate the entropy of a black hole.

  43. colewd: Can you calculate the problem I gave Sal with Boltzman?

    No. But that only means that I cannot; it doesn’t mean that no one can. I’d never even heard of the Shomate equation before.

  44. colewd: I think Mung may have nailed it. There are 2 types of entropy.
    1. thermodynamic
    2. information

    The second law is about 1.

    And only applies to an isolated thermodynamic system.

  45. keiths: You were just telling us that the thermodynamic variables have to come from an approved list in order for entropy calculations to be valid. Now you’re saying that the calculations are invalid even when the variables do come from that list. The idea that entropy cannot actually be calculated by humans is a uniquely walto idea.

    As none of the propositions expressed there remotely resemble anything I’ve ever said, I stopped there.

  46. colewd: Bekenstein uses bits to calculate the entropy of a black hole.

    Which is nonsense, because entropy is a state function, and no one knows the thermodynamic state of a black hole, just as no one knows the state of the entire universe, or the universe when (if) it had a beginning, or of the universe when (if) it ends.
