In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem with the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.

entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue with his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (i.e., small shoe size suggests we’re dealing with a toddler, and therefore a non-reader), but shoe size is not reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger’s and Moran’s books aren’t on that list. Their books, however, did make the list of biochemistry texts judged by some Australians as decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates

Also there is Clausius:

delta-S = Integral (dq/T)

where
delta-S = change in entropy
dq = inexact differential of q (heat transferred reversibly)
T = absolute temperature
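
To make the point concrete, here is a minimal Python sketch that evaluates both definitions numerically; the microstate count and the heat/temperature figures are assumptions chosen purely for illustration:

import math

k_B = 1.380649e-23    # Boltzmann's constant, J/K

# Boltzmann (in the form written by Planck): S = k ln W
W = 1e20                                 # assumed number of microstates
print(f"S = k ln W : {k_B * math.log(W):.3e} J/K")

# Clausius, isothermal case (so the integral collapses to Q/T): delta-S = Q_rev / T
Q_rev = 1000.0                           # J of heat transferred reversibly (assumed)
T = 300.0                                # K, held constant (assumed)
print(f"delta-S    : {Q_rev / T:.2f} J/K")

# Note: nothing in either computation refers to order, disorder, or information.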

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. walto:

    As I indicated above, to satisfy intuitions that the amount of entropy of a macrostate is both objective and not always zero (and to also accommodate some of the problems with the dispersal view that have been pointed out on this thread), I think one has to take entropy as a mixed bag.

    fifth:

    Or (I think) you could assume the incarnation. An objective perspective that is also limited in knowledge.

    Theothermodynamics.

    Poor fifth. This is your brain on God, folks.

  2. Sal:

    “Energy dispersal” is a metaphor. It is not rigorous, it is a pedagogical qualitative description to help make sense of the Clausius definition.

    Wow. Just like walto, you are trying to distance yourself from dispersalism after having embraced it wholeheartedly earlier in the thread.

    Back then you said that Lambert was right and that entropy is a measure of energy dispersal. Now you’re saying that energy dispersal is just a metaphor for entropy, and a “non-rigorous” one at that!

    Sal, walto:

    Just to be absolutely clear: Do you (finally) agree with me that Lambert’s definition of entropy is incorrect, and that entropy is not a measure of energy dispersal?

  3. Sal, to walto:

    The most accurate and comprehensive definition of entropy is Boltzmann’s which came years after Clausius:

    S = k ln W

    No, Sal. The Boltzmann formula is neither the most accurate nor the most comprehensive equation for entropy. It holds only when the microstates are equiprobable and particle interactions can be neglected, as in an ideal gas; the more general Gibbs form covers unequal probabilities and interacting particles.
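
    For reference, a minimal sketch (illustrative numbers only, not from the original comment) of how the general Gibbs form, S = -k * sum(p_i ln p_i), reduces to Boltzmann’s S = k ln W exactly when the microstates are equiprobable:

    import math

    k_B = 1.380649e-23  # Boltzmann's constant, J/K

    def gibbs_entropy(probs):
        # General form: S = -k * sum(p_i * ln(p_i))
        return -k_B * sum(p * math.log(p) for p in probs if p > 0)

    W = 1_000_000                     # assumed number of microstates
    equiprobable = [1.0 / W] * W      # p_i = 1/W for every microstate

    print(gibbs_entropy(equiprobable))   # equals k_B * ln(W) ...
    print(k_B * math.log(W))             # ... i.e. the Boltzmann/Planck form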

  4. Sal:

    Let Keiths try to calculate entropy change for a melting ice cube as was done on a basic chemistry exam as I provided from a real university…

    Keiths, Mung, and DNA_jock need to show how they apply their approaches and conception of entropy to a melting ice cube…

    I invite my detractors to calculate entropy change using Boltzmann’s definition and ignorance and information theory approaches.

    Sal,

    Get it through your head: the equations, and therefore the entropy calculations, are the same whether you are a disorderist, a dispersalist, or a missing informationist.

    The differences are differences of interpretation, not of calculation.

  5. Sal,

    There is an old saying, “if you want to understand entropy, calculate it.”

    That’s terrible advice. To understand entropy, you need to think about it as well as calculate it. Plugging numbers into an equation is no substitute for conceptual understanding.

  6. walto:

    Mung, I think that if we forget about quantum indeterminacy, and you believe all of the following:

    1. that entropy is largely a count of available microstates;
    2. that the “availability of microstates” is a probability distribution;
    3. that probabilities are “objective” in the sense of not being relative to anybody’s (or everybody’s) knowledge of facts; and
    4. that determinism is true;

    then the entropy of every macrostate will be zero. You will, that is, have described the world according to keiths’ Damon.

    The entropy of every macrostate would be zero only if every macrostate corresponded to one and only one microstate. For that to be true, the probability distribution function would have to have an infinite spike (the Dirac delta function) at the exact microstate and be zero elsewhere.

    As for the probabilities being non-epistemic, that could only happen if determinism were false, contrary to your assumption #4.

    keiths thinks that the problem of every macrostate having an objective entropy quantity of zero is solved by making the probabilities (or available microstates) relative to what somebody knows;

    No, because I don’t think there is such a problem to begin with. If the macrostate is, for instance, “1 mole of pure helium at 293 K in a square box with a volume of 1 cubic meter”, then there are zillions of compatible microstates and the entropy is nonzero. The only time the entropy is zero is when the macrostate specifies the exact microstate, as in Damon’s case.

    The point is that one can concede that Jock and keiths are correct that there are some problems with the dispersion view, and that the three of you are right that there are merits to looking at entropy as a measure of available information, without throwing the baby out with the bathwater.

    What baby? You’re seeing a lot of bathwater and assuming that there must be a baby in it somewhere. What exactly is lost by tossing out dispersalism and adopting the missing information view in its place?

    You are finally acknowledging that the dispersalist view has problems, which is good, but you still haven’t identified a problem with the missing information view, despite claiming that you could:

    keiths:

    Can you come up with a scenario in which the “entropy as missing information” interpretation fails?

    walto:

    If one doesn’t beg the question as you are wont to do, it fails for Damon when he considers my cat at room temp in my house.

    keiths:

    In what sense does it fail? Details, please.

    You never answered.

    The “disorder” and “energy dispersal” views are incorrect. The “missing information” view shows every sign of being correct, and no one has come up with even a single scenario in which it fails.

    Given the above, why not abandon the flawed views and adopt the robust one? It’s the rational thing to do.

  7. keiths: As for the probabilities being non-epistemic, that could only happen if determinism were false, contrary to your assumption #4.

    Why do you say that?

  8. keiths: The entropy of every macrostate would be zero only if every macrostate corresponded to one and only one microstate. For that to be true, the probability distribution function would have to have an infinite spike (the Dirac delta function) at the exact microstate and be zero elsewhere.

    No, because I don’t think there is such a problem to begin with. If the macrostate is, for instance, “1 mole of pure helium at 293 K in a square box with a volume of 1 cubic meter”, then there are zillions of compatible microstates and the entropy is nonzero. The only time the entropy is zero is when the macrostate specifies the exact microstate, as in Damon’s case.

    Nothing new there, just a repetition of what you’ve said maybe fifty times already on this thread. And there’s nothing anyone here is likely to disagree with either….except, of course, your view that there’s “no problem there to begin with.” As I’ve said many times, I think it IS a problem that Damon’s calculation of the entropy of my cat at room temperature, and of every other item in the universe from the big bang until heat death, is zero. That seems to me a reductio.

  9. The famous equation by Clausius describing entropy was not even based on modern notions like atoms, much less the idea of microstates introduced later by Boltzmann, Gibbs and others. Clausius was the pioneer of the 2nd law, and his version of entropy is tied to the second law; it was deduced through a line of reasoning completely independent of Boltzmann’s. Boltzmann’s version of entropy, strictly speaking, can be formulated independently of the 2nd law. In fact there is some discussion of whether the 2nd law can be derived from Boltzmann’s statistical mechanics, and it can, with some provisos.

    Clausius’ version does not have microstates in the definition of entropy. And as noted earlier, Clausius’ version indirectly defines entropy through “change in entropy”; it does not formally provide the “absolute entropy” of a system. One can compute entropy change in many cases just by recording thermometer readings and making heat transfer measurements. Counting microstates of substances in the liquid state could be a nightmare.

    That’s one reason Keiths and DNA_Jock can only hand wave and just whine, they can’t actually translate their observer-dependent ignorance approaches to practical problems where the only data provided are things like thermometer readings and heat transfer (dispersal) amounts.

    In contrast, I’ve demonstrated the superiority of being able to take temperature and heat (energy dispersal) data and compute entropy.

    Clausius’ equation for entropy appeared in 1865, a few years before Boltzmann’s treatise in 1871. In fact, it was Clausius who coined the word “entropy”!

    Clausius defines entropy purely in terms of energy and temperature. In the case of things like melting ice, the entropy change is a piece of cake:

    delta-S = delta-Q / T

    where
    delta-S = change in entropy
    delta-Q = heat energy added (reversibly, at constant temperature)
    T = absolute temperature

    Trying to compute entropy change via Boltzmann’s

    S = kB ln W

    is nightmarish for melting ice. I don’t recall ever trying it.

    So the amusing thing is that entropy (or at least entropy change) can be calculated without even recourse to Boltzmann, hence no recourse to information theory! Clausius didn’t use information theory.

    If one has a thermometer and a means of measuring the amount of heat energy added to a system, one can calculate entropy in many practical contexts. This proceeds from the Clausius integral:

    \Delta S=\int \frac{\delta Q_{rev}}{T}

    One can see that if the temperature is held constant during a process, it is easy to calculate the entropy change.

    If, for example, we were to isothermally (at the same temperature) reverse the free expansion of gas in our earlier example (below) by compressing the gas in the two chambers back to one chamber (the original state), we would generate heat! We would have to exert some energy to compress the gas, and the entropy of expansion which we calculated in repeated examples in this thread can be used to calculate the energy required to reverse the expansion by compression.

    See:

    In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

    The entropy of expansion for my example was:
    5.763 J/K

    Now, if we compress the gas back from 2 chambers to 1 the entropy of compression would be the same magnitude but a sign change:

    -5.763 J/K

    This means we have to keep pumping heat out of the compression process to keep the temperature the same. We can actually calculate the amount of heat pumped out if we know the temperature. For the isothermal case

    delta-S = delta-Q / T

    therefore

    delta-Q = delta-S x T

    If the temperature were 100 kelvin:

    delta-Q = (-5.763 J/K) x 100 K = -576.3 J, i.e., 576.3 J (576.3 watt-seconds) of heat pumped out

    This is also how much work it takes to compress the gas back to the original state after the free expansion depicted below if we keep temperature at 100K. We would need a motor/pump exerting at least that much energy if the process is isothermal.

    Btw, the Clausius integral suggests that if the compression is done at a cooler temperature, we exert less work. This is one of the reasons air conditioners are more energy-efficient at moderate temperatures than at hot temperatures.
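
    A minimal Python sketch of these figures (one added assumption: the 5.763 J/K entropy of expansion corresponds to one mole of ideal gas doubling its volume, delta-S = nR ln 2, which reproduces the number; the two temperatures are illustrative):

    import math

    R = 8.314    # gas constant, J/(mol*K)
    n = 1.0      # moles (assumed; n*R*ln(2) reproduces the 5.763 J/K figure)

    dS_expansion   = n * R * math.log(2)   # +5.763 J/K for free expansion into double the volume
    dS_compression = -dS_expansion         # -5.763 J/K for the isothermal reverse compression

    # Heat that must be pumped out (and minimum work input) to reverse the expansion isothermally
    for T in (100.0, 1000.0):              # kelvin
        Q_out = T * abs(dS_compression)    # |delta-Q| = T * |delta-S|
        print(f"T = {T:6.1f} K -> {Q_out:7.1f} J of heat removed (minimum work input)")

    The cooler case requires less work, which is the air-conditioner point just made.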

    The bottom line is, for many practical applications we can and most likely will forgo analysis using Boltzmann’s entropy definition and employ Clausius instead, simply because it is more practical. Hence energy dispersal is more than adequate as a “dumbed-down” qualitative definition of entropy.

  10. keiths:

    As for the probabilities being non-epistemic, that could only happen if determinism were false, contrary to your assumption #4.

    walto:

    Why do you say that?

    A system is in exactly one microstate at a time. If determinism were true, the microstate of an isolated system at time t would determine the microstates at every future time t + Δt. Thus, the metaphysical probability of seeing a particular microstate at a particular time would be either one or zero. It would be all or nothing, with no in-between.

    The distributions used in thermodynamics, of course, are not like that.

    There are only two ways to get fractional probabilities:

    1) You could use epistemic probabilities instead of metaphysical probabilities. Problem is, you’ve already ruled that out. You want the probabilities to be observer-independent.

    2) You could give up assumption #4 and allow for the non-deterministic evolution of microstates. The metaphysical probability of future microstates could then become fractional.

    Even that wouldn’t be very satisfying, though, because what you really want is for there to be more than one currently possible microstate. After all, you’re trying to get the current entropy to be nonzero. That can’t happen with metaphysical probabilities, even if determinism is false. The system is currently in some particular microstate with a metaphysical probability of 1, even if we couldn’t predict that microstate a moment ago.

    You really need to go with epistemic probabilities, determinism or no determinism.

    Your #3…

    3. that probabilities are “objective” in the sense of not being relative to anybody’s (or everybody’s) knowledge of facts;

    …can’t be satisfied unless you adopt an all-or-nothing probability distribution, as I described earlier:

    For that to be true, the probability distribution function would have to have an infinite spike (the Dirac delta function) at the exact microstate and be zero elsewhere.

    And if you adopt such a distribution, then you’re back to zero entropy for all systems in equilibrium.

    By demanding that entropy be observer-independent, you force it to be zero, rendering entropy comparisons useless.

    The missing information view doesn’t have that problem, because it allows entropy to be observer-dependent. Observer-dependence in turn allows for nonzero values of entropy.

  11. walto,

    You keep saying that if Damon sees entropy values of zero, that this is somehow a problem for the missing information interpretation of entropy. Why?

    I keep asking for specifics, and you never offer any.

    What exactly goes wrong if we take the MI view, which allows a super-observer like Damon to calculate entropy values of zero?

    It doesn’t prevent us from seeing nonzero entropy values. Everything goes on just as before.

    So what’s the problem?

  12. If the temperature were 100 kelvin:

    delta-Q = (-5.763 J/K) x 100 K = -576.3 J, i.e., 576.3 J (576.3 watt-seconds) of heat pumped out

    It is noteworthy that even though the entropy change is always -5.763 J/K at any temperature (where the ideal gas approximation holds), the work required to reverse the expansion by compression is not. If we carried out the same process at a temperature ten times higher:

    delta-Q = (-5.763 J/K) x 1000 K = -5763 J, i.e., 5763 J (5763 watt-seconds) of heat pumped out

    It takes 10 times more work to do the compression if the temperature is 10 times higher.

  13. Sal,

    That’s one reason Keiths and DNA_Jock can only hand wave and just whine, they can’t actually translate their observer-dependent ignorance approaches to practical problems where the only data provided are things like thermometer readings and heat transfer (dispersal) amounts.

    Sure we can. All the entropy-related equations that are available to you are available to us, including the Clausius equation.

    When will that finally sink in? The difference between dispersalism and the missing information view is a difference in the interpretation of entropy, not in its calculation.

  14. Let Keiths describe the change of entropy in a melting cube of ice without reference to the energy dispersed into the ice, let him give formal equations in terms of how he counts the change in microstates without reference to the energy dispersed into the ice cube.

    Keiths said the energy dispersal metaphor doesn’t work, so now he must have had an alternative method in mind that doesn’t involve the energy dispersed into the ice cube.

    Now, if you just take the 6660 joules of energy dispersed into the ice cube and divide by the melting temperature, we get:

    6660 J / 273 K = 24.4 J/K

    But that’s cheating, since you said energy dispersal is a misconception of entropy. Show the readers, using methods of observer-dependent ignorance, that the change in entropy for a melting 20 gram ice cube is 24.4 J/K.

    If you start having to use 6660 Joules of energy dispersed into the ice cube to compute your ignorance figures, then shame on you Keiths, because you said energy dispersal is a misconception of entropy.

    You dig your own pit, now lie in it.

    So Keiths, are you and DNA_jock going to need the amount of energy dispersed into the ice cube to calculate your observer-dependent ignorance numbers? Is it beneath you to admit how essential the amount of energy dispersed from the environment into the ice cube is toward the computation of entropy? Too funny.
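
    A minimal sketch of this arithmetic (the one added assumption is a latent heat of fusion for ice of about 333 J/g, which is where the 6660 J figure for 20 grams comes from):

    latent_heat = 333.0     # J/g, assumed latent heat of fusion of ice
    mass = 20.0             # grams of ice
    T_melt = 273.0          # K; melting happens at constant temperature

    Q = latent_heat * mass  # 6660 J dispersed into the ice cube
    delta_S = Q / T_melt    # Clausius for an isothermal process: delta-S = Q_rev / T
    print(f"{delta_S:.1f} J/K")   # ~24.4 J/K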

  15. Well, that’s odd. Aren’t you supposed to have me on ‘ignore’, Sal?

    By bailing out, you got yourself out of addressing my six points against dispersalism.

    Now you’ll have to face them again.

  16. Sal,

    Keiths said the energy dispersal metaphor doesn’t work, so now he must have had an alternative method in mind that doesn’t involve the energy dispersed into the ice cube.

    I addressed this a week ago, Sal:

    Second point. I claim that entropy is a measure of the missing information regarding a system’s microstate, not a measure of energy dispersal.

    Sal inanely takes this to mean that I’m not allowed to use energy in my calculations:

    Vindicate yourself, Keiths, show the readers a derivation with your energy-free analysis of entropy. Work the results out like you think a college chem student should work them out.

    That’s ridiculous, of course. Energy is central to thermodynamics, and so of course anyone doing thermo will take energy into account. I didn’t say that energy had no place in thermo; I said that entropy isn’t a measure of energy dispersal.

    It’s as if I were asserting that a car’s range is a measure of distance, in units of miles, with Sal retorting “Oh, yeah? Then you should be able to calculate the range without taking the gas tank capacity and mpg into account.”

    It Does Not Follow.

  17. To illustrate the simplicity of the Clausius formulation, let me use a numerical integration approximation to an exam question (with answers provided):

    http://www.sfu.ca/~mxchen/phys3441121/Mt1Solution.pdf

    QUESTION:
    What is the change in entropy of 20 grams liquid water being heated from 273K to 300K?

    ANSWER:
    The answer using the analytical solution to the Clausius integral is 7.9 J/K (see the exam answer in the link above).

    But let me just do this numerically. To do this, I note that to raise 20 grams of liquid water by 1 kelvin I have to add 83.72 joules of energy (see the exam solution to see why).

    Now, I can approximate the change in entropy when I add 83.72 joules of energy to the water at 273 K as:

    delta-S ~= 83.72 J / (273+1) K = 83.72 J / 274 K = 0.3055 J/K

    Now that isn’t exact, since adding heat raises the temperature, but the approximation isn’t awful either. I can repeat this procedure for every 1-degree increase, adding 83.72 joules at a time, as I raise the temperature from 273 K to 300 K.

    I can then accumulate approximately how much entropy in total has been added as I add 2260 J (83.72 x 27) of energy to the water.

    So I get an entropy change table like the one below for temperatures from 273 K to 300 K, where the left-hand column is the temperature (in K) and the right-hand column is the incremental entropy (in J/K) added to the water for each 1-degree temperature increase after adding 83.72 joules of heat.


    T (K)   delta-S increment (J/K)
    273     0
    274     0.306
    275     0.304
    276     0.303
    277     0.302
    278     0.301
    279     0.300
    280     0.299
    281     0.298
    282     0.297
    283     0.296
    284     0.295
    285     0.294
    286     0.293
    287     0.292
    288     0.291
    289     0.290
    290     0.289
    291     0.288
    292     0.287
    293     0.286
    294     0.285
    295     0.284
    296     0.283
    297     0.282
    298     0.281
    299     0.280
    300     0.279

    The sum total of the right column rounds to 7.9 J/K, the same answer as in the exam. Tada!

    See, it’s not so hard to compute entropy change in this case if we just account for the energy dispersed in or out of a system.
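
    A minimal Python sketch of this numerical procedure (assumed values: 20 g of water, specific heat of about 4.186 J/(g·K), heated from 273 K to 300 K), with the analytical Clausius integral for comparison:

    import math

    m_grams = 20.0
    c_p = 4.186                    # J/(g*K), specific heat of liquid water (assumed constant)
    q_per_degree = m_grams * c_p   # ~83.72 J to raise the sample by 1 K

    delta_S = 0.0
    for T in range(274, 301):          # evaluate each 1 K step at its final temperature
        delta_S += q_per_degree / T    # dS ~= dQ / T for each small step

    print(f"Numerical:  {delta_S:.2f} J/K")                            # ~7.9 J/K
    print(f"Analytical: {q_per_degree * math.log(300 / 273):.2f} J/K") # ~7.9 J/K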

    Keiths and DNA_Jock, on the other hand, can’t use the energy dispersed into the ice to compute entropy, since they say entropy isn’t about energy dispersal into the ice, it’s about their observer-dependent ignorance of whatever.

  18. Repeating this for Sal’s benefit:

    Sal:

    “Energy dispersal” is a metaphor. It is not rigorous, it is a pedagogical qualitative description to help make sense of the Clausius definition.

    Wow. Just like walto, you are trying to distance yourself from dispersalism after having embraced it wholeheartedly earlier in the thread.

    Back then you said that Lambert was right and that entropy is a measure of energy dispersal. Now you’re saying that energy dispersal is just a metaphor for entropy, and a “non-rigorous” one at that!

    Sal, walto:

    Just to be absolutely clear: Do you (finally) agree with me that Lambert’s definition of entropy is incorrect, and that entropy is not a measure of energy dispersal?

  19. keiths: A system is in exactly one microstate at a time. If determinism were true, the microstate of an isolated system at time t would determine the microstates at every future time t + Δt. Thus, the metaphysical probability of seeing a particular microstate at a particular time would be either one or zero. It would be all or nothing, with no in-between.

    The distributions used in thermodynamics, of course, are not like that.

    There are only two ways to get fractional probabilities:

    1) You could use epistemic probabilities instead of metaphysical probabilities. Problem is, you’ve already ruled that out. You want the probabilities to be observer-independent.

    2) You could give up assumption #4 and allow for the non-deterministic evolution of microstates. The metaphysical probability of future microstates could then become fractional.

    Precisely. Non-epistemic probabilities under determinism are 1 or 0. That is what I meant, and is what I think constitutes a problem for any view according to which entropy is nothing but a quantification of lack of information. Where there is complete information, entropy will always be zero. And, as Sal notes above, that’s not what Clausius was talking about–even if it is relevant.

    I also don’t think it’s quite right when you say in another post above, “Oh, we all use the same equations–it’s just the interpretations that are in question.” Yes and no. One or more of your six points involves a complaint that a dispersalist CAN’T use the equations because the terms in them are wrong. Sal above says Clausius CAN’T be talking about information, for similar reasons. I think you’re both right about that. When everybody “uses” the equations, my sense is that there will be sleight of hand, if one is a “pure informationalist” or a ‘pure disperado.’

    Sal and Jock have seemed to come to a meeting of the minds that calling entropy a measure of energy dispersal is an over-simplification, “a dumbing down”. All that is necessary for everybody to agree, I think, is for you to come to a similar conclusion that calling entropy a measure of missing information is also an over-simplification. I believe there’s truth in both views but they must be combined for the reasons that both you and Sal have urged.

    Time to quit fighting, agree, have pbjs and then go home.

  20. keiths: Do you (finally) agree with me that Lambert’s definition of entropy is incorrect, and that entropy is not a measure of energy dispersal?

    Do YOU finally agree with Lambert that it is not simply a measure of paucity of information?

  21. stcordova: 7.9 J/K

    I wonder what the measure in J/Ks was yesterday afternoon that caused my daughter to have so much trouble on an exam in school. Now THERE, being Damon would have given her all the right answers.

  22. DNA_Jock: So, Sal, we really agree that if you actually want to understand entropy, then you need Boltzmann. If you merely want to plug an entropy term into an engineering calculation, then you can use whichever equations are easiest to manipulate (so long as your understanding is sufficient to spot when those equations might lead you astray…).

    Are you ready to agree that you need Clausius too? Do you think that an all-knowing observer correctly calculates every macrostate–open or closed–as zero? Do you think it’s a sufficiently precise definition of “entropy” to call it “missing information”?

  23. Btw, I’ve seen one info theory backer claim that the entropy of a racked batch of billiard balls is lower than the entropy of the set of balls after they’ve been broken. They’re like eggs, I guess he’s saying.

    Based on the paper Joe linked to, Lambert obviously disagrees. And it seems like no more than a metaphor to me, like using the balls to make a model representing molecules, or even atoms. What say y’all?

  24. walto: Are you ready to agree that you need Clausius too?

    No.

    Do you think that an all-knowing observer correctly calculates every macrostate–open or closed–as zero?

    Yes. But you need to pay attention to that observer’s RAM requirements…

    Do you think it’s a sufficiently precise definition of “entropy” to call it “missing information”?

    I would prefer minus rho ln(rho).

  25. The above graphic is provided courtesy of Sal, to demonstrate that Mike Elzinga and olegt led a horse to water in 2012, but none of it sank in.
    This is Sal’s plot of temperature against added energy for Mike’s concept test. After Mike had spoon-fed him the answers. It’s a little problematic for a “dispersalist” view of entropy, or for blindly plugging a “heat capacity” into an equation…
    Interesting that Sal trotted out the Sackur-Tetrode equation back in 2012, and Mike pointed out its shortcomings.
    Same old same old.

  26. DNA_Jock: It’s a little problematic for a “dispersalist” view of entropy, or for blindly plugging a “heat capacity” into an equation…

    Can you explain why it’s not equally problematic for an information deficit def? That’s the part I don’t get.

  27. walto: Can you explain why it’s not equally problematic for an information deficit def? That’s the part I don’t get.

    In Mike’s system, once the energy goes above 8, the entropy goes down as you (to use Sal’s quaint term) “disperse” more energy into it.
    Hence the negative temperatures.
    I don’t see how it is problematic in any way for the information deficit definition. At E=16, you know the microstate.

  28. keiths: That should give you pause.

    hi keiths,

    Go back and carefully read my question, and your answer, and see if you can spot the flaw. Either my question is misguided, or your answer is.

    thanks

  29. keiths: Plugging numbers into an equation is no substitute for conceptual understanding.

    If it were I’d be totally screwed.

  30. stcordova: That’s one reason Keiths and DNA_Jock can only hand wave and just whine…

    I do wish they would stop whining. The sound is incredibly annoying.

  31. stcordova: …where the left hand column is the temperature, and the right hand column is the incremental amount of entropy added to the water…

    Adding entropy to water is what makes it fizz!

  32. Mung,

    Go back and carefully read my question, and your answer, and see if you can spot the flaw. Either my question is misguided, or your answer is.

    If you think there’s a flaw, then describe it and I will comment.

  33. walto: And it seems like no more than a metaphor to me, like using the balls to make a model representing molecules, or even atoms. What say y’all?

    I think it’s nonsense to speak of the entropy of a racked set of billiard balls. But I bet Salvador could do the calculation!

  34. keiths: If you think there’s a flaw, then describe it and I will comment.

    My question had the following assumptions.

    distinguishable – irreversible
    indistinguishable – reversible

    Your answer had just the opposite:

    indistinguishable – irreversible
    distinguishable – reversible

    We can’t both be right. And I assume your conclusion was supposed to follow from what you wrote, and if what you wrote was wrong, your conclusion doesn’t follow.

    That should give you pause…

  35. walto: But damon knows it at E=x, for ALL x, no?

    Asked and answered, viz:

    Yes. But you need to pay attention to that observer’s RAM requirements…

    When he runs out of RAM, he’s gonna start erasing data…

  36. Mung,

    Your answer had just the opposite:

    indistinguishable – irreversible
    distinguishable – reversible

    No, my answer was that the spontaneous mixing of indistinguishable gases isn’t even a process to begin with, because the macrostate doesn’t change. And I certainly didn’t claim that when the gases are distinguishable, the process is reversible. It isn’t.

    Here’s the exchange again:

    Mung:

    Why is it that the mixing of two different gases, with a positive change in entropy, is considered an irreversible process, whereas a “mixing” of the same gas, in which the entropy of the system does not change, is considered a reversible process?

    keiths:

    Reversibility and irreversibility are defined with respect to macrostates, so the isentropic mixing of indistinguishable gases really isn’t a process at all. The beginning and final macrostates are the same. There’s nothing to reverse.

    There is a change in macrostate when the gases are distinguishable.

    That should give you pause. Distinguishability is observer-dependent, as the Xavier/Yolanda/Damon experiment shows. If distinguishability is observer-dependent, then entropy is observer-dependent.

    So in one case we have a spontaneous, irreversible process — the mixing of two distinguishable gases, in which entropy increases — and in the other, we have a non-process in which entropy remains unchanged.

    Distinguishability makes all the difference. And since distinguishability is observer-dependent, so is entropy.
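
    A minimal Python sketch of the two cases (assumed setup, purely for illustration: two equal chambers, each holding one mole of ideal gas at the same temperature and pressure, with the partition then removed):

    import math

    R = 8.314   # gas constant, J/(mol*K)
    n = 1.0     # moles in each chamber (assumed)

    # Different gases: each expands into double the volume, so entropy rises.
    dS_distinguishable = 2 * n * R * math.log(2)    # ~11.5 J/K, irreversible mixing

    # Same gas on both sides: the macrostate is unchanged, so there is no process to reverse.
    dS_indistinguishable = 0.0

    print(f"different gases: delta-S = {dS_distinguishable:.2f} J/K")
    print(f"same gas:        delta-S = {dS_indistinguishable:.2f} J/K")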

  37. DNA_Jock,

    It seems like that limitation is a constraint on any pure information-paucity analysis, but maybe that’s wrong. A friend tells me there’s a recent article in New Scientist on just this issue; maybe it will be helpful to me. I will try to find it, see if I can understand it, and report back.

  38. walto,

    Precisely. Non-epistemic probabilities under determinism are 1 or 0. That is what I meant, and is what I think constitutes a problem for any view according to which entropy is nothing but a quantification of lack of information.

    It isn’t a problem unless you insist on non-epistemic probabilities.

    Here’s what’s happening:

    1. You’re taking the MI view, which works fine.
    2. You’re modifying it by stipulating that the probabilities must be non-epistemic.
    3. You observe that after the modification, the entropies are all zero.
    4. That isn’t desirable, so you reject the MI view.

    That’s silly. The problem isn’t with the MI view, it’s with your insistence on non-epistemic probabilities. If it ain’t broke, don’t “fix” it with unnecessary stipulations that render it useless.

    Where there is complete information, entropy will always be zero.

    Yes, and that’s true with or without determinism, and with or without epistemic probabilities. Damon has complete information, so he sees entropies of zero. We don’t have complete information, and so for us the entropies are the same as they always were.

    What’s the problem?

    And, as Sal notes above, that’s not what Clausius was talking about–even if it is relevant.

    That’s because Clausius didn’t have a theory. Boltzmann provided one.

    It’s amusing that Sal is trying to distance the Clausius equation from the Boltzmann equation. Earlier in the thread, he was doing the exact opposite:

    The magic was connecting Clausius definition with Boltzmann’s definition. That discovery by Boltzmann was like what E=mc^2 was to Einstein. I related the Clausius version to the Boltzmann version here:

    http://creationevolutionuniversity.com/forum/viewtopic.php?f=4&t=72

    walto:

    I also don’t think it’s quite right when you say in another post above, “Oh, we all use the same equations–it’s just the interpretations that are in question.”

    No, not at all. Dispersalists can use the same equations, and the quantities they produce are entropies. The problem is with the interpretation of those calculations. Dispersalists claim that those entropies are a measure of energy dispersal, but they can’t be, because the units are wrong. The units are correct for entropy, but not for energy dispersal.

    Sal above says Clausius CAN’T be talking about information, for similar reasons.

    Clausius was talking about energy and temperature. Boltzmann supplied the theory — statistical mechanics — that connected those with what’s happening at the microscopic level. That’s where (the lack of) information comes in, because we don’t know the exact microstate.

    Sal and Jock have seemed to come to a meeting of the minds that calling entropy a measure of energy dispersal is an over-simplification, “a dumbing down”. All that is necessary for everybody to agree, I think, is for you to come to a similar conclusion that calling entropy a measure of missing information is also an over-simplification.

    It isn’t an oversimplification. The MI view works, and no one has pointed to a scenario in which it fails.

    The ‘disorder’ interpretation fails. The ‘energy dispersal’ interpretation fails. The ‘missing information’ interpretation succeeds.

    Isn’t the choice obvious?

  39. keiths:

    And since distinguishability is observer-dependent, so is entropy.

    Mung:

    That’s a non sequitur.

    No, it follows inevitably from what we’ve been discussing.

    1. There was a change in entropy because of the change in macrostate.

    2. There was a change in macrostate because the gases were distinguishable. Had they been indistinguishable, there would have been no change in macrostate and therefore no change in entropy.

    3. Distinguishability is observer-dependent. (See this.)

    4. Therefore, entropy is observer-dependent.

  40. keiths: The problem isn’t with the MI view, it’s with your insistence on non-epistemic probabilities.

    I’m not insisting on that, mung was. As indicated, I don’t think non-epistemic probabilities are terribly useful–unless we constrain them in some way or ignore various laws. So I have no problem with using epistemic probabilities–the probability of some event occurring given some particular batch of prior information.

    But again, the calculations using epistemic probabilities also seem to me to need to be constrained to be useful for Sal’s or Lambert’s purposes. What do you think about the constraint regarding RAM that Jock suggests above?

  41. keiths: No, my answer was that the spontaneous mixing of indistinguishable gases isn’t even a process to begin with, because the macrostate doesn’t change. … So in one case we have a spontaneous, irreversible process — the mixing of two distinguishable gases, in which entropy increases — and in the other, we have a non-process in which entropy remains unchanged.

    You are claiming that “the spontaneous mixing of indistinguishable gases isn’t even a process to begin with,” and it is therefore not a reversible process. Is that because there is no “mixing”?

    I claimed that a “mixing” of the same gas, in which the entropy of the system does not change, is considered a reversible process.

    You’re saying I’m wrong about that?

  42. keiths: 1. There was a change in entropy because of the change in macrostate.

    2. There was a change in macrostate because the gases were distinguishable. Had they been indistinguishable, there would have been no change in macrostate and therefore no change in entropy.

    3. Distinguishability is observer-dependent. (See this.)

    4. Therefore, entropy is observer-dependent.

    That’s it in a nutshell.

  43. Walto,

    Nice to hear from you. I have said from the start that energy dispersal is a dumbed-down pedagogical tool, but I’ve argued, by referencing many chemistry texts, that it’s more than good enough for chemistry students to do what they need. If I had had that when I was learning entropy, I’d have been far better off.

    The Clausius and Boltzmann versions are equivalent under most circumstances; there is some debate about areas where Boltzmann is far more comprehensive or may be describing something Clausius can’t.

    On the other hand, the Clausius version dominates engineering and chemical practice. It was formulated about a century before Shannon’s codification of information theory, and published six years before Boltzmann’s treatise; strictly speaking, Boltzmann wasn’t even the one who provided the form eventually stated by Planck as:

    S = k ln W

    Clausius is good enough for most engineering and chemical practice. It does not require information theory, as I’ve tried to show over and over. A thermometer and a calorimeter, or a force meter (scale), often suffice for the calculations.

    I was showing circumstances where there is no need of information theory whatsoever to calculate entropy (entropy change) using Clausius.

    We could try to do Boltzmann for ice cubes and heating water, and hence (in an indirect way) go to information theory, but this is like swatting a fly with a bulldozer.

    I can work out some information theory approaches to entropy and provide calculations here for a very, very few limited examples. But beyond things like ideal gases, once we get to liquids, it can be nightmarish to calculate absolute entropy. I don’t recall ever doing much with entropy of liquids using Boltzmann (information theory, microstates); we did it with Clausius (thermometers and calorimeters).

    Now, this discussion all began way back with IDists using Boltzmann to justify ID. I’m negative on that approach, but it didn’t hurt to become versant in the arguments so I could satisfy my own curiosity. I’ve been highly critical of my ID colleagues, in other words.

    Keiths said he can use all the equations with his information theory approach, but I pointed out that many of those equations have terms for ENERGY dispersal embedded in them, because much of the world is seen through the lens of energy, and many of our thermodynamic instruments are oriented toward energy measurements of some sort (like thermometers), not toward measuring microstates or information.

  44. stcordova,

    That puts my concerns nicely, Sal. Thanks. keiths wants this to be a battle between him and some champion of undiluted dispersalism. I’m not sure why he insists on framing it that way, but, as indicated above, it does make one suspicious.

    And that there’s any “heat” over this matter seems funny to me. Lambert’s unpleasantness with Ben-Naim (if correctly reported) is, to me, both weird and sad. But maybe there were reputations at stake there, at least. Here, I have no idea at all what fuels it.

    (Well, I do, but it’s not nice to say.)

  45. stcordova,

    Keiths said he can use all the equations with his information theory approach, but I pointed out that many of those equations have terms for ENERGY dispersal embedded in them, because much of the world is seen through the lens of energy, and many of our thermodynamic instruments are oriented toward energy measurements of some sort (like thermometers), not toward measuring microstates or information.

    I really think you have a great handle on the issues here. What’s interesting is that physicists like Bekenstein, Penrose and Hawking used the Shannon/Boltzmann information to support their theories of cosmology. Without real experimental backup their conclusions become very suspect. An example is Bekenstein’s calculation of the entropy of a black hole.

  46. walto,

    That puts my concerns, nicely, Sal. Thanks. keiths wants this to be a battle between him and some champion of undiluted dispersalism.

    I’m arguing against what you have been saying throughout the thread, walto. I already gave you an example:

    His [Damon’s] mistake is your mistake, that of thinking that by “entropy” we were looking for a measurement of ignorance rather than of dispersal.

    Take responsibility for what you write.

    That goes for you, too, Sal.

  47. keiths: I’m arguing against what you have been saying throughout the thread, walto. I already gave you an example:

    Throughout the thread? Really? I probably overstated my point in that single sentence you put there, but I’ve clarified my position a couple of dozen times since then. You don’t care, though, right?
