In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem with the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.

entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue in his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (a small shoe size suggests we’re dealing with a toddler, and therefore a non-reader), but shoe size is not reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So, in slight defense of Granville Sewell and numerous other creationists, from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus, who equate entropy with disorder: when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger’s and Moran’s books aren’t on that list. Their books, however, did make the list of biochemistry texts judged by some Australians to be decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates

Also there is Clausius (a short numeric check that the two definitions agree follows below):

delta-S = Integral (dq(reversible) / T)

where
delta-S = change in entropy
dq(reversible) = inexact differential of heat exchanged reversibly
T = absolute temperature
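
As a quick sanity check that the two definitions agree, here is a minimal Python sketch (my own illustration, using 1 mol of ideal gas at 300 K and standard CODATA constants): for a reversible isothermal doubling of volume, the Clausius integral and the Boltzmann count give the same entropy change.

import math

kB = 1.380649e-23   # Boltzmann's constant, J/K
NA = 6.02214076e23  # Avogadro's number
R = kB * NA         # gas constant, J/(mol K)

n, T = 1.0, 300.0                  # illustrative values: 1 mol of ideal gas at 300 K
q_rev = n * R * T * math.log(2)    # reversible heat absorbed while the volume doubles

dS_clausius = q_rev / T                    # Integral(dq(reversible)/T) at constant T
dS_boltzmann = n * NA * kB * math.log(2)   # k ln(W2/W1): W doubles for each molecule

print(dS_clausius, dS_boltzmann)   # both about 5.76 J/K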

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. keiths,

    You are confusing the representation of entropy with the representation of the microstate. They are not the same, by any stretch of the imagination.

    Can you explain?

  2. colewd: Can you show me how to estimate the probability of a given micro state?

    No, but that is not how it works, so it’s no big deal.

    We are talking about a very large number of molecules and possible states, and we need to think in statistical terms: not the probability of a specific microstate, but the probability of a given distribution.

    In the dice analogy we’re not interested in the specific outcome 2-2, 1-3, or 3-1; we’re interested in the distribution (e.g., “the dice sum to 4”).

  3. colewd:

    If we look at this analysis, you see that Bekenstein’s computation of the entropy of a black hole of one solar mass gives a value 20 orders of magnitude larger than that of the sun. This does not seem logical at all. Thoughts?

    Hi,

    I really don’t know much more about Bekenstein-Hawking entropy than what I’ve posted.

    So, in answer to that specific question, I don’t know, sorry.

  4. Thanks. Can you show me how to estimate the probability of a given micro state?

    Since Mung might not be of much help, let me at least provide something regarding thermodynamic entropy.

    For THERMODYNAMIC microstates, the assumption is that every microstate is equally probable (equiprobable). This is the fundamental postulate of statistical mechanics:

    https://ocw.mit.edu/courses/materials-science-and-engineering/3-012-fundamentals-of-materials-science-fall-2005/lecture-notes/lec21t.pdf

    This postulate is often called the principle of equal a priori probabilities. It says that if the microstates have the same energy, volume, and number of particles, then they occur with equal frequency in the ensemble.

    which is pretty much saying every microstate is as probable as the next. This means that if we have W microstates, the probability of any given microstate is

    P(any_given_microstate) = 1/W

    That’s why we can say stuff like

    I = Shannon_information = -log2 [P (any_given_microstate)] =

    -log2 (1/W) = log2 (W)

    or

    S = kB ln W
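
    A minimal sketch of that bookkeeping, using an arbitrary microstate count W purely for illustration:

    import math

    kB = 1.380649e-23       # Boltzmann's constant, J/K
    W = 10**20              # arbitrary microstate count, purely for illustration

    P = 1.0 / W             # equiprobable microstates
    I = -math.log2(P)       # Shannon information in bits, equals log2(W)
    S = kB * math.log(W)    # Boltzmann entropy in J/K

    print(I, S)             # about 66.4 bits and 6.4e-22 J/K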

    What makes thermodynamic entropy different from other Shannon-type entropies is the set of energy dispersal parameters, namely the macrostate properties of:

    Temperature (or Internal Energy)
    Heat input/output
    Number of Particles
    Volume

    The information-theoretic descriptions that Keiths swears by are, imho, mostly gratuitous, superfluous add-ons that provide little in the way of meaningful insight.

    If Shannon entropy of equiprobable microstates is:

    I = log2 (W)

    where W is anything we can count, like the number of possible beans in a jar, then, to paraphrase Einstein, the entropy formula is void of meaning; it’s just a logarithm of some number W, where W is any number we choose!

    Thermodynamic entropy based on thermodynamic microstates has a fairly well defined meaning.

    If you look at cell F33 in this spreadsheet:

    http://creationevolutionuniversity.org/public_blogs/skepticalzone/absolute_entropy_helium.xls

    take the number in that cell and raise e to that power, and that is the number of microstates.

    FYI, the number of microstates is so large that my spreadsheet blows up! If I enter a formula something like:

    =exp(F33)

    and it will blow up.

    But the point is that the number of microstates for thermodynamics is well defined and each microstate is equiprobable. Mung and Keiths can’t in general calculate their entropies with their information-theoretic approaches since, as I pointed out, they take a backward approach by defining thermodynamic entropy in terms of information rather than defining their thermodynamic information in terms of energy dispersal.

    Neither Mung, Keiths, nor DNA_jock can calculate the change in entropy of a 20 gram melting ice cube because their approach is backward, starting with a superfluous, gratuitous, and mostly useless understanding of thermodynamic entropy.

    The proper approach in most cases is to start with Clausius entropy, from which one can then compute Boltzmann entropy, which in turn can be used to calculate Shannon entropy.

    Keiths, Mung, and DNA_jock attempt to start with Shannon entropy and then compute Boltzmann entropy, but they never get anywhere for something as simple as a melting ice cube, which should be trivial enough to compute for a college chemistry exam!

    Change of entropy of 20 gram ice cube melting.

    Heat energy needed to melt 20 gram ice cube: 6660 J
    Temperature of melting : 273 K

    Entropy change (Clausius): 6660 J / 273 K = 24.39 J/K

    therefore

    Entropy change (Boltzmann): 24.39 J/K

    therefore

    Entropy change (Shannon) : 24.39 J/K / (1.3806 x 10^-23) / ln (2)
    = 2.549 x 10^24 bits
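
    Here is a minimal sketch checking those numbers (it assumes the standard textbook heat of fusion of about 333 J/g, which is where the 6660 J figure comes from):

    import math

    kB = 1.380649e-23             # Boltzmann's constant, J/K
    q = 20 * 333.0                # heat to melt 20 g of ice, J (about 333 J/g heat of fusion)
    T = 273.0                     # melting temperature, K

    dS = q / T                    # Clausius entropy change, about 24.4 J/K
    bits = dS / kB / math.log(2)  # the same change expressed in Shannon bits

    print(dS, bits)               # about 24.4 J/K and 2.5e24 bits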

    But Keiths, Mung and DNA_Jock can’t compute this because they insist they can somehow compute the energy dispersal parameters of heat and temperature purely from information theory.

    That’s because unless one defines how W is computed, then

    S = kB ln W

    is meaningless. If one defines W in terms of energy dispersal, then S becomes meaningful. Keiths’ approach is mostly devoid of meaning because he tries to avoid defining thermodynamic microstates apart from the energy dispersal parameters of:

    Temperature (or internal energy)
    Heat input into the system
    Volume
    Number of particles

    In the case of the melting ice cube, we assume no change in volume by redrawing the thermodynamic boundaries as the ice melts (the melt water actually occupies slightly less volume than the ice), so there is no effective change in volume. The mass of the ice cube indirectly gives us the number of particles. The melting point gives us the temperature, and the heat of melting per mole gives us the amount of heat needed. Hence:

    Heat energy needed to melt 20 gram ice cube: 6660 J
    Temperature of melting : 273 K

    Entropy change (Clausius): 6660 J / 273 K = 24.39 J/K

    But that cannot be done if one starts arguing “entropy is our ignorance of the system details”. That’s too vague to be of much use to science. The Clausius definition has stood the test of time because it is precisely defined, as illustrated by the case of a melting ice cube.

    I mean, “ignorance of the details of the system” means we would need to factor in our ignorance about the day of the week on which the ice cube melted. Absurdity!

  5. colewd to Mung:

    Thanks. Can you show me how to estimate the probability of a given micro state?

    Not directly since Mung and Keiths appeal to non-existent microstate meters and microstate counters and Shannon-entropy meters.

    If they bothered to start with thermometers and calorimeters first, then maybe they would be able to get to first base, but since they hate defining entropy in terms of the Clausius integral (energy dispersal), they can’t give much in the way of direct answers, only vague hand waving.

  6. colewd,

    Sal is still cowering behind the ‘ignore’ button, afraid to come out. I think it’s time to put him out of his misery.

    How about finally quoting my comment to him?

  7. stcordova: they can’t give much in the way of direct answers, only vague hand waving.

    I’ve read YoungCosmos and know that’s never been a concern for you personally.

  8. keiths:

    You are confusing the representation of entropy with the representation of the microstate. They are not the same, by any stretch of the imagination.

    colewd:

    Can you explain?

    Here’s an analogy:

    You’re trying to locate a buried treasure. The treasure is at some exact, unknown location. That’s equivalent to the microstate.

    All you know is that the treasure is within a five-mile radius of the oak tree in Old Man Brummer’s pasture. That’s equivalent to the macrostate. It’s a summary of what you actually know about the exact microstate.

    The entropy is the difference between the two — how much extra information it would take to narrow down the location of the treasure to a particular point.

    Note that you can express the entropy without giving away either the macrostate or the microstate. A friend of yours knows that you’re searching for the treasure and asks about your progress. You can say “I’ve located it to within a radius of five miles” without adding the part about Old Man Brummer’s oak tree. The entropy is just a measure of the uncertainty — the amount of missing information.

  9. keiths: The entropy is the difference between the two — how much extra information it would take to narrow down the location of the treasure to a particular point.

    Right. So how much would it take, exactly? Do the calculation for us.

  10. walto,

    Right. So how much would it take, exactly? Do the calculation for us.

    What’s the point? It’s an analogy.

  11. colewd:

    Can you show me how to estimate the probability of a given micro state?

    As mentioned earlier, the microstates are never measured directly; they are INFERRED from the energy dispersal data.

    As an illustration, consider the Sackur-Tetrode approximation for helium:
    http://creationevolutionuniversity.org/public_blogs/skepticalzone/absolute_entropy_helium.xls

    If I have one helium atom at a temperature of 300 K, we can compute the number of microstates. The number I get using the result in cell F33 is

    W = e ^ (73.63)

    This means we have e^73.63 microstates, where each microstate is specified by six position and momentum coordinates:

    (X, Y, Z, P_x, P_y, P_z)

    The particle can hypothetically be in one of these microstates.

    The question arises: how can we have a finite number of microstates when, strictly speaking, with infinite precision past the decimal point there should be an infinite number of microstates? The answer is a conceptually imposed rounding, or quantization:

    https://en.wikipedia.org/wiki/Quantization_(signal_processing)

    and

    https://en.wikipedia.org/wiki/Quantization_(physics)

    Planck’s constant fixes the size of each phase-space cell — the “right” amount of quantization (rounding) — and the empirically measured Boltzmann constant then converts the resulting microstate count into thermodynamic units.

    Here is a blurb on how an accurate value of the Boltzmann constant was arrived at:
    http://phys.org/news/2013-07-accurate-boltzmann-constant.html

    As I said, energy dispersal parameters can be used for calculating entropy. Here are the parameters that I put into the Sackur-Tetrode approximation for helium to calculate the entropy of a single particle (a short sketch reproducing the calculation follows the list):

    Number of particles: 1 helium atom (one mole divided by Avogadro’s number)
    Input heat: 0 J
    Temperature: 300 K
    Volume : 1 cubic meter
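
    For readers without the spreadsheet, here is a minimal Python sketch of the Sackur-Tetrode formula under the same assumptions (monatomic ideal helium, no added heat, CODATA constants; the helper name is mine). It reproduces the S/kB of roughly 73.6 from cell F33 and, for a full mole, roughly the 157 J/K value that comes up later in this thread:

    import math

    kB = 1.380649e-23                 # Boltzmann's constant, J/K
    h = 6.62607015e-34                # Planck's constant, J s
    NA = 6.02214076e23                # Avogadro's number
    m_He = 4.002602 * 1.66053907e-27  # mass of one helium atom, kg

    def sackur_tetrode(N, V, T):
        """Absolute entropy (J/K) of N helium atoms treated as a monatomic
        ideal gas in volume V (m^3) at temperature T (K)."""
        thermal = (2 * math.pi * m_He * kB * T) / h**2
        return N * kB * (math.log((V / N) * thermal**1.5) + 2.5)

    print(sackur_tetrode(1, 1.0, 300.0) / kB)  # about 73.6, the S/kB value in cell F33
    print(sackur_tetrode(NA, 1.0, 300.0))      # about 157 J/K for one mole in 1 cubic meter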

    We don’t really have a microstate meter or a Shannon entropy meter. The number of microstates W is usually inferred; it is not directly measured. That’s why:

    S = kB ln W

    is usually not a direct measurement of entropy; it is an inference that helped move atomic theory forward during an era when belief in atoms was considered heresy. The Boltzmann equation was needed not so much to define entropy as to vindicate atomic theory and to explain temperature in terms of the energy of atoms rather than the mystical heat fluid of caloric theory:

    https://en.wikipedia.org/wiki/Caloric_theory

    I think it is pointless to try to define thermodynamic entropy purely in terms of information theory; one needs physical observables like the energy dispersal data. Keiths, Mung and DNA_jock are in the unenviable position of needing information (energy dispersal information) to infer their lack of information! It’s a clumsy way to do business, imho.

  12. Sal,

    Come out from behind your ignore button.

    Don’t be afraid. The weather’s fine, and the worst that can happen is for you to discover that you’re wrong — again.

  13. keiths: What’s the point? It’s an analogy.

    I guess it’s not a very good analogy, if one can be calculated and one can’t.

    Or maybe you meant, “Oh what’s the point, I could do it if I wanted to, but I don’t want to.” {I don’t think so, though. That is a point Sal has been making on this thread from the OP on.}

    Let me put this to you this way. This issue is either a quibble, or you’re completely wrong about it.

    You can pick.

  14. keiths,

    colewd:

    Can you explain?

    Here’s an analogy:

    You’re trying to locate a buried treasure. The treasure is at some exact, unknown location. That’s equivalent to the microstate.

    All you know is that the treasure is within a five-mile radius of the oak tree in Old Man Brummer’s pasture. That’s equivalent to the macrostate. It’s a summary of what you actually know about the exact microstate.

    The entropy is the difference between the two — how much extra information it would take to narrow down the location of the treasure to a particular point.

    Note that you can express the entropy without giving away either the macrostate or the microstate. A friend of yours knows that you’re searching for the treasure and asks about your progress. You can say “I’ve located it to within a radius of five miles” without adding the part about Old Man Brummer’s oak tree. The entropy is just a measure of the uncertainty — the amount of missing information.

    I think I got this. Can you show me how this relates to the efficient transfer of energy as described in the second law of thermodynamics?

  15. stcordova,

    I think it is pointless to try to define thermodynamic entropy purely in terms of information theory; one needs physical observables like the energy dispersal data. Keiths, Mung and DNA_jock are in the unenviable position of needing information (energy dispersal information) to infer their lack of information! It’s a clumsy way to do business, imho.

    Several people have defined entropy as a lack of information of a micro state. For the life of me I am struggling to tie this back in any way to the second law of thermodynamics that describes energy efficiency. Can you help? I have asked keiths the same question.

  16. walto,

    I guess it’s not a very good analogy, if one can be calculated and one can’t.

    Who said it couldn’t be calculated? I asked you what the point was, since it’s just an analogy.

  17. colewd,

    The connection between entropy and the Second Law is direct: the Second Law says that the entropy of an isolated system is overwhelmingly likely to either remain the same or increase. In other words, the gap between the macrostate and the exact microstate — the missing information — will stay the same or increase.

    Think of the gas-mixing case again. The instant the partition is removed, all of the gas A molecules are on one side of the container, while all of the gas B molecules are on the other. After the gases mix and the system reaches equilibrium, gas A molecules are distributed throughout the container, and so are gas B molecules. There are many more ways of arranging the molecules so that A and B are mixed than there are of arranging them so that the A and B molecules are segregated into the two halves of the container. That means that there are many more microstates compatible with the ‘mixed’ state than there are microstates compatible with the ‘segregated’ state.

    Before mixing there were a certain number of possible microstates. The exact microstate had to be one of them. Now there are many more possible microstates, but the exact microstate still has to be one of them. Since the number of possible microstates has increased, our uncertainty — our missing information — has also increased. That is, the entropy has increased.
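
    A minimal numeric sketch of that counting, assuming one mole of each gas and equal halves of the container:

    import math

    kB = 1.380649e-23   # Boltzmann's constant, J/K
    NA = 6.02214076e23  # Avogadro's number

    # One mole of gas A and one mole of gas B start out in separate halves.
    # After the partition is removed, each molecule can be found in either half,
    # so the number of compatible position microstates grows by a factor of 2 per molecule.
    N_total = 2 * NA

    dS = kB * N_total * math.log(2)  # k * ln(2^N_total)
    missing_bits = N_total           # one extra yes/no question per molecule

    print(dS, missing_bits)          # about 11.5 J/K and 1.2e24 bits of newly missing information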

  18. keiths:
    walto,

    Who said it couldn’t be calculated? I asked you what the point was, since it’s just an analogy.

    So do it, then.

  19. keiths, you’re making the same mistake here you made on the phoodoo decision thread (which you came to see, though never admitted). Think about what is responsible for what here–the information paucity is a result of the physical properties of the microstates and a perspective. It’s not the other way around. Only one is first in the order of being as the philosophers say.

    You’ve got maps causing territories. The maps here are just an elegant way of expressing the nature of the territories. That’s why you can’t do any of these calculations that Sal or colewd or I am requesting. You’re starting from the wrong side of things–the res cogitans.

  20. colewd: Several people have defined entropy as a lack of information of a micro state.

    I hope you’re not including me in that. 🙂

  21. colewd: For the life of me I am struggling to tie this back in any way to the second law of thermodynamics that describes energy efficiency.

    There are numerous formulations of the second law. How many of them are stated in terms of the efficiency of a heat engine?

  22. walto,

    keiths, you’re making the same mistake here you made on the phoodoo decision thread (which you came to see, though never admitted).

    Huh? I think you’re imagining things, walto.

    Think about what is responsible for what here–the information paucity is a result of the physical properties of the microstates and a perspective. It’s not the other way around…

    Of course. Who would ever claim that the information gap causes the physical properties of the microstates? Or that it causes the observer to select a certain macrostate?

    You’ve got maps causing territories.

    No, it’s the other way around. If you disagree, then provide a quote in which I indicate otherwise.

  23. keiths,
    Then you must agree that your claim that thermodynamics is REALLY a measure of missing information is a quibble. Bravo!

  24. keiths: Huh? I think you’re imagining things, walto.

    Hah. You’re now denying that you endorsed both logical determinism and epistemic determinism before abandoning them both with nary a word of farewell?

    You’re sort of doing the same thing here, though it’s been more like pulling teeth on this thread. I tell you, I’m not paid enough for this kind of work.

    #sigh

  25. walto,

    Then you must agree that your claim that thermodynamics is REALLY a measure of missing information is a quibble.

    I think you meant “entropy”, not “thermodynamics”. Slow down, dude.

    And no, it’s not a quibble. It’s true that entropy is a measure of missing information, and it’s false that it’s a measure of energy dispersal. The MI view succeeds in cases where the ED view fails. There are multiple reasons to discard the ED view as flawed, but no such reasons to discard the MI view.

    And remember, you were the one arguing that if I were correct, the scientific paradigm would have to change and the textbooks would have to be rewritten. Now you’re claiming it’s just a quibble.

    Which is it? Quibble or paradigm change?

    And who’s right, the walto who agrees with Jaynes, or the walto who disagrees?

  26. walto,

    Hah. You’re now denying that you endorsed both logical determinism and epistemic determinism before abandoning them both with nary a word of farewell?

    You misunderstood my position in that thread, just as you’re doing here. The kind of close, rigorous thinking required in both cases is clearly not your strength.

  27. I’ve already told you. Jaynes is right, and unless this whole matter is a quibble you are completely confused and wrong.

    I guess you’ve chosen, though. I repeat my sigh.

  28. Anyhow, gotta deal with halloween right now. You can continue your own delusional horror story without me.

  29. Hopefully the trick-or-treaters will degrumpify you, so that you can deal with this more rationally when you return.

  30. stcordova: Not directly since Mung and Keiths appeal to non-existent microstate meters and microstate counters and Shannon-entropy meters.

    What a load of crap. Sal is either clueless or just making stuff up. The foundation of the information theory approach is probabilities, and Salvador’s complaining that no one has yet invented a probability meter. Duh.

    Salvador, on the other hand, keeps providing calculations rather than readings from his entropy meter.

  31. keiths:
    walto,

    You misunderstood my position in that thread, just as you’re doing here. The kind of close, rigorous thinking required in both cases is clearly not your strength.

    Hahaha. Same old keiths.

  32. In practice, we are not usually interested in the surprise of a particular value of a random variable, but we would like to know how much surprise, on average, is associated with the entire set of possible values. That is, we would like to know the average surprise defined by the probability distribution of a random variable. The average surprise of a variable X which has a distribution p(X) is called the entropy of p(X), and is represented as H(X). For convenience, we often speak of the entropy of the variable X, even though, strictly speaking, entropy refers to the distribution p(X) of X.

    Information Theory: A Tutorial Approach
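
    A minimal sketch of that definition, using a fair coin and a fair die as purely illustrative distributions:

    import math

    def shannon_entropy(p):
        """Average surprise H(X) = -sum p(x) * log2 p(x), in bits."""
        return -sum(px * math.log2(px) for px in p if px > 0)

    print(shannon_entropy([0.5, 0.5]))     # fair coin: 1 bit
    print(shannon_entropy([1 / 6.0] * 6))  # fair six-sided die: about 2.58 bits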

  33. walto,

    I’ve already told you. Jaynes is right…

    Yes, which means that you are wrong. The following is not correct:

    They may both be “correct” as far as they go, but one is more ACCURATE than the other; thus, in another sense closer to being “correct.”

    When it comes to thermodynamics, Yolanda is “getting there” as compared with Xavier; Damon may have passed the mark.

    You think that there’s a single value of entropy that Xavier, Yolanda, and Damon are all aiming at. Jaynes knows better, as you yourself pointed out:

    I think there are, as Jaynes says, many different correct values for entropy. This depends on what variables are considered relevant, what is being looked for.

    Yolanda considers isotopic concentration to be relevant. Xavier does not. They get different entropy values, and each is correct.

    Jaynes gets that. Why don’t you?

    Several people have defined entropy as a lack of information of a micro state. For the life of me I am struggling to tie this back in any way to the second law of thermodynamics that describes energy efficiency. Can you help? I have asked keiths the same question.

    The reason you are struggling is that “lack of information of a microstate” is a convoluted and confusing way to describe entropy.

    The problem is one can’t compute “lack of information” without already having “sufficient information” (like temperature, heat input, volume, number of particles) to compute the entropy.

    But if one can compute “lack of information of the microstate”, that means one already has sufficient information needed to compute entropy in J/K!

    And if one already has information needed to compute entropy in J/K, what’s the point of computing “lack of information of the microstate” in bits?

    This is like going forward in order that one can go backward.

    Ok, so I can calculate the entropy of a mol of helium at 300 K in 1 cubic meter as 156.93 J/K.

    http://creationevolutionuniversity.org/public_blogs/skepticalzone/absolute_entropy_helium.xls

    Here is how someone like Keiths can go around in circles to compute the number of bits of ignorance of a mol of helium at 300 K in 1 cubic meter and then work backward to the entropy expressed in J/K, which is more akin to the 2nd law (a short sketch of the round trip follows the steps).

    1. compute the entropy expressed in J/K
    Answer: 156.93 J/K using the Excel spreadsheet provided

    2. Now use the entropy expressed in J/K to compute Shannon bits
    Answer: 156.93 J/K / kB/ ln(2) = 1.64 x 10^25 bits

    3. now use Shannon bits to compute backward toward the entropy in J/K
    1.64 x 10^25 x kB x ln(2) = 156.93 J/K
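
    The same round trip as a minimal sketch, taking the 156.93 J/K figure from the spreadsheet as the starting point:

    import math

    kB = 1.380649e-23                   # Boltzmann's constant, J/K
    S_JK = 156.93                       # step 1: entropy from the spreadsheet, J/K
    S_bits = S_JK / kB / math.log(2)    # step 2: the same quantity in Shannon bits, about 1.64e25
    S_back = S_bits * kB * math.log(2)  # step 3: back again to 156.93 J/K

    print(S_bits, S_back)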

    What a pointless waste of time.

    PS

    The “Shannon bits” figure is the amount of ignorance one has about the exact microstate the system is in. But as I said, this is a convoluted way to understand thermodynamic entropy.

    Keiths seems to be under the mistaken impression that thermodynamic entropy is solely a measure of lack of information of a microstate, and that there cannot be an alternate description of entropy purely in terms of heat and temperature, as Clausius gave. And Clausius was the guy who coined the word “entropy.”

    If one can calculate entropy in terms of heat and temperature, then one is mistaken to insist entropy can only be described solely in terms of lack of information about a microstate. This is false, because Clausius defined entropy without any reference to microstates!

    Clausius definition:

    dS = dq(reversible) / T

    which clearly is not intended to mean “entropy is a measure of missing information”. And that definition has stood the test of time: it is the textbook definition that is most widely used in practice.

    If Keiths wants to insist entropy is solely “lack of information of the microstate”, then how is it that entropy can be computed without first computing “lack of information of the microstate”?

    But entropy can be computed without first computing “lack of information of the microstate”, so Keiths is wrong to insinuate entropy is only describable as “lack of information of the microstate” to the exclusion of other definitions like Clausius’s (which is basically at the heart of the energy dispersal qualitative description).

  36. Almost every post of Salvador’s in this thread has made me laugh. We do have to consider the amusement value, don’t we?

    Salvador:

    Keiths seems to be under the mistaken impression that thermodynamic entropy is solely a measure of lack of information of a microstate, and that there cannot be an alternate description of entropy purely in terms of heat and temperature, as Clausius gave.

    That is incorrect. Keith knows one can use the Clausius calculation to calculate the change in entropy and has said so repeatedly.

    Of course, Salvador has keiths on Ignore, so how would Sal know what keiths thinks?

    But Clausius only tells us how to calculate the change in entropy, and only in some cases; it does not tell us what entropy is. And Clausius doesn’t define entropy as “energy dispersal,” which is something Salvador conveniently forgets every time he brings it up.

    Salvador:

    If one can calculate entropy in terms of heat and temperature, then one is mistaken to insist entropy can only be described solely in terms of lack of information about a microstate.

    See. Even Salvador knows entropy is calculated. It’s odd that he thinks there’s only one way to calculate entropy. In fact, we know it’s false that there’s only one way to calculate entropy. We also know that Sal knows that it is false that there is only one way to calculate entropy.

    It would also help if Sal could be more clear about when he’s talking about definitions, calculations, and descriptions. How does he describe entropy in terms of heat and temperature? Entropy is the spreading of heat?

    stcordova: If Keiths wants to insist entropy is solely “lack of information of the microstate”, then how is it that entropy can be computed without first computing “lack of information of the microstate”?

    More silliness from Sal! The fact that Salvador is appealing to the macrostate variables itself establishes that we are in the presence of missing information.

  38. stcordova: The problem is one can’t compute “lack of information” without already having “sufficient information” (like temperature, heat input, volume, number of particles) to compute the entropy.

    So? That’s not a problem. Why is that a problem?

    But if one can compute “lack of information of the microstate”, that means one already has sufficient information needed to compute entropy in J/K!

    Yes, so? That’s what we’ve been trying to tell you.

    And if one already has information needed to compute entropy in J/K, what’s the point of computing “lack of information of the microstate” in bits?

    IOW, if we can use the information theory approach to calculate entropy, what is the point of using the information theory approach to calculate entropy? That’s your question?

    Salvador knows that if you have the value in bits you can convert it into units of J/K. He’s shown us how to do so, though in the reverse direction.

    So where’s the beef?

  39. stcordova,

    Yes, keiths is confused about this matter. He made a similar howler on (the quite interesting) “The Purpose of Theistic Evolution” thread. He now denies this, of course, but, as they say, “ball don’t lie”:

    keiths: If God knew the outcome before it occurred, then it would be determined, and all other possibilities would be ruled out.

    To which I replied:

    walto: Why must it be determined just because somebody knows before hand what will happen? How does knowledge determine anything? Suppose somebody is remarkably good at predicting the future–or suppose somebody can just “premember” it, the same way we remember the past. How is that determining anything. Determination requires prior causes. Knowledge isn’t that.

    Undaunted, keiths slithered from committing the fallacy of epistemic determinism right into the fallacy of logical determinism:

    keiths: The knowledge per se doesn’t determine anything, but remember that knowledge is dependent on truth.

    If somebody knows ahead of time that X will happen, that means that it’s true ahead of time that X will happen. If it’s true ahead of time that X will happen, then X will happen, and nothing else is possible.* X is determined.

    Sweetheart that I am, I politely {cough} explained this error to him:

    walto: Yeah, that’s a confusion. Things don’t happen because it’s true they’ll happen. Quite the contrary. The only thing that can make it true that I will sing tomorrow is me singing tomorrow. Now, I may be determined to sing tomorrow, but if so, those determinants will be prior causes not abstractions.

    And, soon after, I posted an excerpt from an IEP article which does a nice job explaining these (quite common) mistakes:

    walto: From the Internet Encyclopedia of Philosophy http://www.iep.utm.edu/foreknow/#H8

    The argument (Logical Determinism) that a proposition’s being true prior to the occurrence of the event it describes logically precludes free will ultimately rests on a modal fallacy. And the ancillary argument that a proposition’s being true prior to the occurrence of the event it describes causes the future event to occur turns on a confusion (i) of the truth-making (semantic) relation between an event and its description with (ii) the causal relation between two events.

    The argument (Epistemic Determinism) that a proposition’s being known prior to the occurrence of the event it describes logically precludes free will, as in the case of logical determinism, ultimately rests on a modal fallacy. And the arguments that it is impossible to know the future are refuted by two facts. One is that we do in fact know a very great deal about the future, indeed our managing to keep ourselves alive from hour to hour, from day to day, depends to a very great extent on such knowledge. Two is that the objection that we cannot have knowledge of the future – because our beliefs about the future ‘might’ (turn out to) be false – turns on a mistaken account of the role of ‘the possibility of error’ in a viable account of knowledge. Beliefs about future actions, insofar as they are contingent, and – by the very definition of “contingency” – are possibly false. But “possibly false” does not mean “probably false”, and possibly false beliefs, so long as they are also actually true, can constitute bona fide knowledge of the future.

    As I’ve said above a couple of times, I do think keiths came to understand this, and, later in the thread, he even mentions this distinction and (though he naturally doesn’t admit his prior confusion–he NEVER does that!) he credits me with having earlier made this point.

    Sadly, though, he’s making much the same error here. It’s not that the missing information theory is wrong–it’s quite right. The thermodynamic entropy is CORRECTLY expressible as an amount of missing information. But the information is missing because particular physical characteristics are or are not exemplified in the world. Once again, it’s not the knowing or failure to know that makes it so. The res extensa come first.

    First, territory (not THE WORD).

    ETA: *I notice now, for the first time, that keiths also committed a modal fallacy where I have placed the asterisk above. I won’t bother explaining this now, since it’s not relevant. The other mistakes he made there are more than enough for our present purposes.

  40. Mung,

    So where’s the beef?

    The beef is trying to understand the practical reason for defining entropy as missing information. If someone wants to define it as observer-dependent missing information, that’s cool, but how do I use that definition to understand the natural world? Have you looked at my post showing Bekenstein’s measurement of black hole entropy?
    Do you think it is legit?

  41. colewd: The beef is trying to understand the practical reason for defining entropy as missing information. If someone wants to define it as observer-dependent missing information, that’s cool, but how do I use that definition to understand the natural world?

    Adami answers your question this way:

    The entropy of a physical object, it dawns on you, is not defined unless you tell me which degrees of freedom are important to you. In other words, it is defined by the number of states that can be resolved by the measurement that you are going to be using to determine the state of the physical object. If it is heads or tails that counts for you, then log2 is your uncertainty. If you play the “4-quadrant” game, the entropy of the coin is log8, and so on. Which brings us back to six-sided die that has been mysteriously manipulated to never land on “six”. You (who do not know about this mischievous machination) expect six possible states, so this dictates your uncertainty. Incidentally, how do you even know the die has six sides it can land on? You know this from experience with dice, and having looked at the die you are about to throw. This knowledge allowed you to quantify your a priori uncertainty in the first place.

  42. colewd: Have you looked at my post showing Bekenstein’s measurement of black hole entropy? Do you think it is legit?

    I’ve responded to it at least twice. It’s not thermodynamic entropy. It may not even be entropy in the Shannon sense.

    Another point about entropy as missing information is that it helps us understand that entropy does not cause your room to get messy or organisms to die or any of the other nonsensical things people say about entropy.

  43. colewd:

    The beef is trying to understand the practical reason for defining entropy as missing information. If someone wants to define it as observer-dependent missing information, that’s cool, but how do I use that definition to understand the natural world?

    Well said. Imho, the “missing information” definition is not much use to guys who actually apply the concept of entropy in their day jobs, like doing chemistry and building air conditioners, heat pumps, refrigerators, or anything that converts heat to mechanical energy or mechanical energy to heat, or converts heat to chemical energy or the reverse. For guys who make a living by applying entropy concepts, entropy is better defined by heat and temperature, not missing information.

    Figuratively speaking:

    Clausius entropy (heat and temperature) = Boltzmann entropy (k times the natural log of the number of microstates) = Shannon entropy (negative log base 2 of the probability of a microstate) = Shannon entropy (missing information)

    Professionals who work with entropy (engineers and chemists) use the Clausius version. The mistake I see in this thread is the insistence that there is only one way to define entropy correctly. There is not.

    The Bekenstein-Hawking entropy is for cosmologists, and there’s an old saying, “there is speculation and then there is cosmology.”

  44. Mung,

    I’ve responded to it at least twice. It’s not thermodynamic entropy. It may not even be entropy in the Shannon sense.

    So I guess we need to take a lot of modern cosmology with a grain of salt. 🙂

    Penrose, Susskind, Hawking back to the drawing board.

  45. stcordova,

    Professionals who work with entropy (engineers and chemists) use the Clausius version. The mistake I see in this thread is the insistence that there is only one way to define entropy correctly. There is not.

    I am interested to see if others agree with this. The multiple definition issue seems to be the cause of this lively discussion.
