In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.

entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue with his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (small shoe size suggests we’re dealing with a toddler, and therefore a non-reader), but shoe size is not reading ability. Perhaps “analogy” is too generous a word for the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So, in slight defense of Granville Sewell and numerous other creationists, from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus, who equate entropy with disorder: when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and, unfortunately, one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger’s and Moran’s books aren’t on that list. Their books did, however, make the list of biochemistry texts judged by some Australian researchers to be decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates
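As a quick illustration (my own sketch, not from Planck or any of the textbooks quoted), the formula can be evaluated directly. Note that “log” in Planck’s form is the natural logarithm:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """S = k log W: entropy of a system with W equally probable microstates.
    'log' is the natural logarithm, per Planck's form of Boltzmann's formula."""
    return k_B * math.log(W)

print(boltzmann_entropy(1))        # a single microstate gives zero entropy
print(boltzmann_entropy(2) / k_B)  # ln 2, about 0.693, in units of k
```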

Also there is Clausius:

delta-S = Integral (dq/T)

where
delta-S = change in entropy
dq = inexact differential of q (heat)
T = absolute temperature
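For a concrete Clausius calculation (my own example, assuming a constant specific heat so the integral has a closed form):

```python
import math

def heating_entropy_change(m_kg, c, T1, T2):
    """Clausius entropy change for reversibly heating a substance with
    constant specific heat c: dq = m*c*dT, so delta-S = Integral(dq/T)
    = m * c * ln(T2/T1)."""
    return m_kg * c * math.log(T2 / T1)

# 1 kg of liquid water warmed from 273.15 K to 373.15 K, c about 4186 J/(kg*K):
dS_water = heating_entropy_change(1.0, 4186.0, 273.15, 373.15)
print(round(dS_water, 1), "J/K")  # about 1306 J/K
```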

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,697 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. Mung,

    In fact, all of your examples rely on summing the values shown on the dice. Why?

    No they don’t, and they don’t need to.

    Let r be the number on the red die and g the number on the green. The following are all legitimate macrostates:

    1. r and g are the last two digits, in order, of the year in which the Japanese surrendered to the US at the end of WWII.

    2. g is prime.

    3. r raised to the g is exactly 32.

    4. g minus r is exactly one.
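Those four macrostates can be checked by brute-force enumeration of the 36 (red, green) microstates. A sketch (not from the original comment; 1945 is the surrender year, so its last two digits are 4 and 5):

```python
microstates = [(r, g) for r in range(1, 7) for g in range(1, 7)]

macrostates = {
    "last two digits of 1945": lambda r, g: (r, g) == (4, 5),
    "g is prime":              lambda r, g: g in (2, 3, 5),
    "r**g == 32":              lambda r, g: r ** g == 32,
    "g - r == 1":              lambda r, g: g - r == 1,
}

for name, belongs in macrostates.items():
    members = [s for s in microstates if belongs(*s)]
    print(f"{name}: {len(members)} microstate(s)")
```

The first and third macrostates each contain exactly one microstate, so knowing them pins the system down completely (zero missing information), while “g is prime” leaves 18 possibilities (log2 18, about 4.17 bits).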

  2. keiths:

    Their metaphysical probabilities are now zero…..

    You don’t know that….

    Now suppose your accurate, honest friend tells you that the sum of the numbers on the two die faces is greater than eight….

    Finally she tells you that the number on the green die is a three….

    Suppose I don’t know how honest my friend is, or whether she has a good vantage and good light when she divulges this apparent information to me. Or maybe she’s tired or tired of this game. What are the exact macrostates in those cases? Is there no exact macrostate in all those cases? In fact, as macrostates seem to be a function of knowledge in your view, is there never an exact macrostate for one who doesn’t believe people ever know anything but can only sometimes know* them? (And does it matter that “know*” can’t actually be defined without ellipses, and nobody is in a position to tell whether they ever actually know* anything, but can only sometimes know* that they don’t know things?)

  3. walto,

    Suppose I don’t know how honest my friend is, or whether she has a good vantage and good light when she divulges this apparent information to me.

    Did you miss the “accurate and honest” part? You even quoted it:

    Now suppose your accurate, honest friend tells you that the sum of the numbers on the two die faces is greater than eight…

  4. walto: What are the exact macrostates in those cases? Is there no exact macrostate in all those cases? In fact, as macrostates seem to be a function of knowledge in your view, is there never an exact macrostate for one who doesn’t believe people ever know anything but can only sometimes know* them?

    “If a tree falls in a forest and no one is around to hear it, does it make a sound?”. Perhaps macrostates are simply a human mathematical construct and only microstates exist.

  5. Alan Fox: Perhaps macrostates are simply a human mathematical construct and only microstates exist.

    I think they supervene on microstates. But that doesn’t entail that they change when we’re drunk or sleepy–any more than, say, apples do when they look blurry.

    Just because entropy is a measure of paucity of knowledge doesn’t mean macrostates exist only in the mind. What we KNOW about them exists only in minds.

    When trees fall in the forest, they make sounds. You can leave your phone there to record them if you like. Making a sound isn’t the same thing as being heard.

  6. FWIW,

    Good examples of the main macrostate variables of interest are:

    Temperature
    Volume
    Quantity of Substance

    Changes in temperature are used to determine heat flow for calorimetry, from which we determine dQ (heat transferred in or out of a system).

    So those are the 3 major macrostate variables for thermodynamics. There are probably others, but many of the examples I worked out in this discussion use only those three, as illustrated in the orange cells of my Sackur-Tetrode Excel spreadsheet:

    http://www.creationevolutionuniversity.org/public_blogs/skepticalzone/absolute_entropy_helium.xls
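For readers without Excel, the Sackur-Tetrode computation such a spreadsheet presumably implements can be sketched in a few lines of Python (my own version, using CODATA constants; I have not checked it against the spreadsheet itself):

```python
import math

h   = 6.62607015e-34   # Planck constant, J*s
k_B = 1.380649e-23     # Boltzmann constant, J/K
N_A = 6.02214076e23    # Avogadro's number, 1/mol
R   = k_B * N_A        # gas constant, J/(mol*K)

def sackur_tetrode_molar(m_kg, T, P):
    """Absolute molar entropy of a monatomic ideal gas:
    S/(N k) = ln(V / (N * lambda^3)) + 5/2,
    where lambda is the thermal de Broglie wavelength."""
    lam = h / math.sqrt(2 * math.pi * m_kg * k_B * T)  # thermal wavelength, m
    v_per_particle = k_B * T / P                       # V/N from the ideal gas law
    return R * (math.log(v_per_particle / lam ** 3) + 2.5)

# Helium (atomic mass about 6.6465e-27 kg) at 298.15 K and 1 atm:
S_He = sackur_tetrode_molar(6.6465e-27, 298.15, 101325.0)
print(round(S_He, 1), "J/(mol*K)")  # about 126.0; the measured value is about 126.15
```

The near agreement between this purely statistical-mechanical number and the calorimetrically measured one is exactly the Clausius/Boltzmann linkage being discussed.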

    Incidentally, I think all the criticism of the Claussius notion of entropy is ridiculous. Much of real-world entropy measurement is made in terms of the macrostate variables used by Claussius:

    Temperature
    Quantity of Substance
    Volume

    It’s not made via direct counting of microstates or “missing information” meters! We usually infer the number of microstates or “missing information” by taking the 3 macrostate variables and computing the number of microstates or “missing information.”

    As I said, the “missing information” definition is not of much practical use. The computation of the number of microstates is useful for estimating molecular structure based on macrostate data — an example was Pauling using macrostate data to determine the approximate molecular structure of ice. But one doesn’t usually calculate microstates first because one needs to know molecular structure in advance to do the computation, and that structure is usually what’s in question, not the measured Claussius entropy!
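The Pauling example is worth making concrete. Pauling estimated that ice retains about (3/2)^N allowed proton configurations for N water molecules, giving a residual molar entropy of R ln(3/2). A sketch (mine, not from the discussion):

```python
import math

R = 8.314462618  # gas constant, J/(mol*K)

# Pauling's estimate: each H2O in ice effectively has 3/2 allowed proton
# configurations, so W = (3/2)^N and the residual molar entropy is R*ln(3/2).
S_residual = R * math.log(1.5)
print(round(S_residual, 2), "J/(mol*K)")  # about 3.37, vs. roughly 3.4 measured
```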

  8. Sal,

    As I said, the “missing information” definition is not of much practical use.

    The “entropy as missing information” view is correct. The “entropy as disorder” and “entropy as energy dispersal” views are incorrect. They don’t work.

    The choice is easy, unless you’re Sal.

  9. On Mung’s thread, Patrick writes:

    Mike Elzinga has a great entropy concept test:

    The following is an elementary concept test.

    Take a simple system made up of 16 identical atoms, each of which has a non-degenerate ground state and a single accessible excited state.

    Start with all of the atoms in the ground state.

    1. What is the entropy when all atoms are in the ground state?

    2. Add just enough energy to put 4 atoms in the excited state. What is the entropy now?

    3. Add more energy so that 8 atoms are in the excited state. What is the entropy now?

    4. Add still more energy so that 12 atoms are in the excited state. What is the entropy now?

    5. Add more energy so that all atoms are in the excited state. What is the entropy now?

    6. Rank order the temperatures in each of the above cases.

    After completing the test, generalize the number of atoms to an arbitrary size N and plot the temperature versus energy.

    Explain what is “ordered” or “disordered” and what is “information” in the above calculations.

    It would be interesting to see the various participants in the other thread provide their answers. I volunteer to hold them all for simultaneous release.

    Mike asks:

    Explain what is “ordered” or “disordered” and what is “information” in the above calculations.

    No one here believes that entropy is a measure of disorder, as far as I can tell. Regarding information, the answer is simple, and it’s the same one I’ve been giving throughout the thread: Entropy is a measure of missing information — the gap between what the macrostate tells you and what is needed to pin down the exact microstate.

    The problems with the “entropy as energy dispersal” view have already been pointed out.
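Mike’s concept test can also be worked numerically: with n of N atoms excited, the microstate count is the binomial coefficient C(N, n), so S/k = ln C(N, n). A sketch (mine, with entropy left in units of k):

```python
import math

def two_level_entropy(N, n):
    """Entropy, in units of k, of N two-level atoms with n excited:
    S/k = ln C(N, n), the log of the number of ways to pick the excited atoms."""
    return math.log(math.comb(N, n))

for n in (0, 4, 8, 12, 16):
    print(n, round(two_level_entropy(16, n), 3))
```

The entropy is zero at both ends, peaks at half filling (n = 8), and is symmetric, so cases 2 and 4 give the same value. Since 1/T = dS/dE, the temperature is positive below half filling and negative above it, where adding energy decreases entropy; nothing in the calculation refers to order, disorder, or “information” in any colloquial sense.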

  10. Mike E.: Explain what is “ordered” or “disordered” and what is “information” in the above calculations.

    keiths:Regarding information, the answer is simple, and it’s the same one I’ve been giving throughout the thread: Entropy is a measure of missing information — the gap between what the macrostate tells you and what is needed to pin down the exact microstate.

    And there you have it, patrick! Give him a harder one!

  11. keiths,

    Sal,

    As I said, the “missing information” definition is not of much practical use.

    The “entropy as missing information” view is correct. The “entropy as disorder” and “entropy as energy dispersal” views are incorrect. They don’t work.

    Can you challenge Sal’s argument that the missing information definition is not of much practical use?

    In one of the few times I un-ignored them by logging out, I ventured to look at Keiths’s and Mung’s latest drivel.

    I never said the “missing information” definition is inherently wrong, it’s just about next to useless.

    How does Keiths calculate the entropy of a melting ice cube without using equations NOT framed in terms of missing information — like, eh, dQ/T?

    Keiths is relegated to using equations like

    delta-S = Integral (dQ/T)

    that have no regard for information theory in order for Keiths to calculate his missing information. Too funny. He proves again the near total lack of utility of this missing-information definition.

    I was one of the first at TSZ to formally relate information-theory Shannon bits to Joules/Kelvin explicitly. If bits were useful, they’d be used as a measure of entropy in industry, but they are not. Entropies are usually given in Joules/Kelvin, reflecting the practical importance of the Claussius definition, not theoretical abstractions.

    The Boltzmann definition is important as it allows Claussius measurements to give possible insight to atomic and molecular details (like Pauling deducing the molecular details of ice).

    Let Keiths cite a major industrial application in chemistry or materials science or protein science that uses his “missing information” definition and does the calculation in bits. In contrast I could cite thousands of papers that are rooted in the Claussius definition (aka energy dispersal).

    How about Keiths show where the Gibbs free energy equation is used by chemists with the “missing information” definition. Absurdity!

    How about Keiths start with college engineering, chemistry, and physics textbooks.

    Let Mung calculate the entropy of a melting ice cube using his missing information approach instead of the Claussius approach he disparages.

    Keiths is relegated to using equations that don’t have regard to “missing information” in order to calculate the “missing information” he’s so fixated on. Too funny.

    Despite being told this repeatedly, he can’t seem to realize this is a testament to the fact that “missing information” is a superfluous and gratuitous add-on of little utility, except to give Keiths and Mung excuses to bloviate about stuff they can’t even calculate from their own definitions.

    Calculate the entropy of melting ice in such a way that shows the necessity of information theory. Can’t be done, because the counter example of

    dS = dQ/T

    suffices, and it has no need of information theory. Hence, information theory is not absolutely essential to define entropy in practical applications.

  13. colewd,

    Can you challenge Sal’s argument that the missing information definition is not of much practical use?

    Sure. Sal’s making the same mistake he’s been making throughout the thread. How you define entropy does not determine how you measure it. Those are separate questions.

    Sal’s babbling about “missing information meters” is just that — babbling.

    You’ll notice that he keeps trying to portray this as some kind of contest between Clausius and Boltzmann/Gibbs entropies. It’s not. As Sal himself has pointed out, they’re linked. Statistical mechanics provides the theoretical underpinning that classical thermodynamics lacks. Statistical mechanics explains Clausius entropy.

  14. How you define entropy does not determine how you measure it. Those are separate questions.

    Claussius defined entropy through:

    ds = dQ/T

    He also was the one who coined the word “entropy”, not Shannon, not Boltzmann!

    You got a problem with that?

    That definition doesn’t use “missing information” notions. Keiths can’t get it through his head one definition (Claussius) doesn’t exist to the exclusion of other definitions (Boltzmann, Shannon, etc.)

  15. So Keiths, show the readers how you calculate the “missing information” entropy of a 20 gram melting ice cube and get

    24.39 J/K

    What? You have to use the

    dS = dQ/T

    definition which has no reference to your “missing information” definition, in order to compute your “missing information” entropy? Too funny.

    Like colewd pointed out, you have yet to prove the practical utility of the “missing information” vs. the Old School definition defined in terms of heat and temperature, as in:

    dS = dQ/T

    where

    S = entropy
    Q = heat
    T = temperature

    Claussius coined the word “entropy”, so don’t be too quick to diss his definition, especially since the Keiths approach can’t answer trivial entropy problems without resorting first to the Claussius definition.
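For the record, the 24.39 J/K figure follows directly from the Clausius definition: melting occurs at constant temperature, so the integral collapses to Q/T. A sketch (mine; the exact result shifts slightly with the latent-heat value used):

```python
m_ice = 0.020     # kg of ice
L_f   = 333.55e3  # latent heat of fusion of water, J/kg (tabulated values vary slightly)
T_m   = 273.15    # melting point, K

# Melting is isothermal, so delta-S = Integral(dQ/T) = Q/T = m * L_f / T
dS_melt = m_ice * L_f / T_m
print(round(dS_melt, 2), "J/K")  # about 24.42 J/K, close to the 24.39 figure quoted
```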

  16. Sal:

    Claussius [sic] defined entropy through:

    ds = dQ/T

    He also was the one who coined the word “entorpy” [sic], not Shannon, not Boltzmann!

    You got a problem with that?

    Science progresses, Sal. We now understand that temperature corresponds to average kinetic energy, and that entropy is a measure of missing information. You yourself have admitted that the “missing information” interpretation is correct.

    All that’s left is for you to acknowledge that the “energy dispersal” view of entropy is incorrect.

  17. Poor Salvador has absolutely no answers, not even ones that ought to be very easy for him. He should go back to Ignoring us.

  18. keiths: 1. You’re blindfolded. You vigorously shake a standard pair of fair dice, one red and one green, and toss them onto a craps table. They come to rest in one particular configuration: there’s a six on the red die and a three on the green die. That is the exact microstate.

    That’s the outcome of the experiment. I would not call it a microstate, though I might be willing to take the sum of the two dice and call that a macrostate. But I don’t think it matters, because it’s not relevant to calculating the entropy.

  19. Patrick asks:

    Care to take a stab at answering those questions from your “subjective knowledge” perspective?

    My position is not that entropy is subjective, but rather that it is observer-dependent. It’s a measure of the gap between the macrostate — what an observer knows about the exact state of the system — and what would be required to pin down the exact microstate.

    Can you think of any scenario, including Mike’s, in which entropy is not a measure of the missing information as described above? If so, please present it along with an explanation of why you think the missing information interpretation cannot successfully be applied to it.

  20. keiths: 3. You don’t know that. No one has told you anything about the result of the throw, and you are still blindfolded. As far as you know, the dice could be in any of the 36 possible microstates, with equal epistemic probability.

    Heck, for me, it’s as if the dice haven’t even been tossed at all, right?

    W for you is 36, and the entropy is

    S = log2(36) ≈ 5.17 bits

    I disagree. I gave you the value I think is correct for the entropy. The entropy is 3.27 bits/symbol.

    The entropy you calculate is not a function of the metaphysical probability distribution, but of the epistemic probability distribution, which for you is currently a distribution that assigns a probability of 1/36 to each of the 36 possible microstates.

    There are 36 ways to arrange the two dice, but that doesn’t make the probability distribution uniform, and the entropy is calculated from the distribution.

  21. Mung,

    There are 36 ways to arrange the two dice, but that doesn’t make the probability distribution uniform…

    Sure it does. If the dice are fair and thrown randomly, as we’ve specified, then the 36 microstates are equiprobable.

  22. keiths:

    1. You’re blindfolded. You vigorously shake a standard pair of fair dice, one red and one green, and toss them onto a craps table. They come to rest in one particular configuration: there’s a six on the red die and a three on the green die. That is the exact microstate.

    Mung:

    That’s the outcome of the experiment. I would not call it a microstate…

    Why not?

  23. keiths:

    W for you is 36, and the entropy is

    S = log2(36) ≈ 5.17 bits

    Mung:

    I disagree. I gave you the value I think is correct for the entropy. The entropy is 3.27 bits/symbol.

    You had it right originally, but you changed your answer.

    What you’re quoting there — the 3.27 bits/symbol number — is the average number of bits per symbol for a code where each symbol represents the sum of two fair dice thrown together.
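Both disputed numbers can be computed side by side: the 3.27 bits/symbol figure is the Shannon entropy of the sum, which lumps distinct microstates together, while 5.17 bits treats each (red, green) pair as its own equiprobable microstate. A sketch (not from either commenter):

```python
import math
from collections import Counter

microstates = [(r, g) for r in range(1, 7) for g in range(1, 7)]

# Entropy when every (red, green) pair is a distinct, equiprobable microstate:
S_micro = math.log2(len(microstates))

# Shannon entropy of the sum, which lumps microstates with equal totals:
sum_counts = Counter(r + g for r, g in microstates)
S_sum = -sum((n / 36) * math.log2(n / 36) for n in sum_counts.values())

print(round(S_micro, 2), "bits")  # 5.17
print(round(S_sum, 2), "bits")    # 3.27
```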

  24. keiths,

    Why are you afraid to quote this comment to Salvador?

    You don’t have much confidence in him, do you?

    I understand Sal’s arguments, however I am struggling with yours. I understand defining entropy as missing information, however I still have no idea what you do with the understanding of how much missing information there is from an observer’s point of view. I keep asking because it may be just my lack of understanding.

  25. colewd:

    I understand defining entropy as missing information, however I still have no idea what you do with the understanding of how much missing information there is from an observer’s point of view. I keep asking because it may be just my lack of understanding.

    We use it in all of the ways we currently use entropy. Nothing changes except our understanding of what entropy actually is. It isn’t a measure of disorder, and it isn’t a measure of energy dispersal. It’s a measure of missing information, which means that it can vary from observer to observer. The Second Law still holds, the world continues to turn, everything is fine — we’ve just jettisoned an incorrect understanding of entropy and replaced it with an accurate one.

  26. keiths: Mung:

    That’s the outcome of the experiment. I would not call it a microstate…

    Why not?

    Because we’ve already agreed there are 36 ‘microstates’, not one. Tossing a pair of dice does not change the probability of rolling a seven from 6/36 to 36/36.

    Why not call “there’s a six on the red die and a three on the green die” the macrostate?

    By the way, in the game of craps, the dice are the same color, and the color doesn’t matter when calculating the odds, the payoffs, or the expected value from any given wager. But you know that, I think.

  27. keiths: You had it right originally, but you changed your answer.

    🙂

    What you’re quoting there — the 3.27 bits/symbol number — is the average number of bits per symbol for a code where each symbol represents the sum of two fair dice thrown together.

    Yes. Exactly. Thank you for listening.

    I know the figure and I know how to calculate it and I know why I am performing the calculations. And that’s how I am deciding to calculate the entropy. The Shannon entropy.

    Why am I wrong?

  28. colewd: I understand Sal’s arguments, however I am struggling with yours.

    I don’t understand Salvador’s arguments, and I am a bona fide genius.

    Salvador says:

    Incidentally, I think it is ridiculous all the criticism of the Claussius notion of entropy. Much of real world entropy measurements are made in terms of the macrostate variables…

    Yeah, I’ve been pointing out the relevance of the macrostate variables throughout this thread. Somehow Salvador manages to turn that into a claim that there is “criticism of the Claussius notion of entropy” going on.

    I don’t understand that argument. Do you?

  29. keiths:

    keiths:

    1. You’re blindfolded. You vigorously shake a standard pair of fair dice, one red and one green, and toss them onto a craps table. They come to rest in one particular configuration: there’s a six on the red die and a three on the green die. That is the exact microstate.

    Mung:

    That’s the outcome of the experiment. I would not call it a microstate…

    keiths:

    Why not?

    Mung:

    Because we’ve already agreed there are 36 ‘microstates’, not one.

    Yes, and (red=6, green=3) is one of them. It’s a microstate.

    This is not that difficult, Mung.

  30. Mung,

    Why am I wrong?

    Because you’ve been cribbing from a source that you don’t understand.

    As I said:

    What you’re quoting there — the 3.27 bits/symbol number — is the average number of bits per symbol for a code where each symbol represents the sum of two fair dice thrown together.

    We are dealing with a situation where two microstates — say, (red=5, green=2) and (red=3, green=4) — are distinct even if they sum up to the same value — in this case 7.

  31. keiths: Yes [there are 36 ‘microstates’], and (red=6, green=3) is one of them. It’s a microstate.

    This is not that difficult, Mung.

    Oh good. So you won’t be telling me it is THE microstate. It’s one of 36 possible microstates and that doesn’t change just because I tossed a pair of dice and someone else observed the outcome.

    The entropy (uncertainty) associated with the outcome of the toss of a pair of dice has not changed just because a pair of dice was tossed.

    By the way, why not try to teach your concept of entropy from the toss of a single coin or the toss of a single die, or some other scenario where all outcomes are equally likely?

  32. Sigh.

    Read it again, Mung:

    See if you can follow my argument:

    1. You’re blindfolded. You vigorously shake a standard pair of fair dice, one red and one green, and toss them onto a craps table. They come to rest in one particular configuration: there’s a six on the red die and a three on the green die. That is the exact microstate.

    2. The dice have been thrown, and they landed with a six on the red die and a three on the green die. It’s over and done. The metaphysical probability of (red=6, green=3) is now one. That’s how things turned out; the other microstates are no longer live possibilities. Their metaphysical probabilities are now zero.

    3. You don’t know that. No one has told you anything about the result of the throw, and you are still blindfolded. As far as you know, the dice could be in any of the 36 possible microstates, with equal epistemic probability. W for you is 36, and the entropy is

    S = log2(36) ≈ 5.17 bits

    Even though there is one exact microstate, with a metaphysical probability of one, you don’t know which microstate that is. The entropy you calculate is not a function of the metaphysical probability distribution, but of the epistemic probability distribution, which for you is currently a distribution that assigns a probability of 1/36 to each of the 36 possible microstates.

    4. Now suppose your accurate, honest friend tells you that the sum of the numbers on the two die faces is greater than eight. The metaphysical probabilities haven’t changed. The metaphysical probability of (red=6, green=3) is still one, and the metaphysical probabilities of the other microstates are zero.

    The epistemic probabilities have changed dramatically, however. Now you know that the microstate must be one of the following, in (red,green) format:

    (3,6),
    (4,5), (4,6),
    (5,4), (5,5), (5,6),
    (6,3), (6,4), (6,5), (6,6)

    Before there were 36 epistemic possibilities. Now there are only 10. The entropy is now reduced to

    S = log2(10) ≈ 3.32 bits

    You gained some relevant information when she told you that the sum was greater than eight. There’s less missing information, and hence less entropy.

    5. Finally she tells you that the number on the green die is a three. You’re down to one possible microstate: (6,3). The other microstates have all been ruled out. W for you is now one, and the entropy is

    S = log2(1) = 0 bits

    The information she gave you was enough to pin down the exact microstate. There is no longer any missing information, so the entropy is zero.

    Entropy is observer-dependent, and it decreases as the observer gains relevant information about the actual microstate.
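That walkthrough can be mechanized: each disclosure filters the set of epistemically possible microstates, and the entropy is just log2 of what remains. A sketch (mine, assuming the remaining states stay equiprobable):

```python
import math

def bits(states):
    """Missing information, assuming the remaining states are equiprobable."""
    return math.log2(len(states))

# Step 3: blindfolded, all 36 microstates are epistemically possible.
states = [(r, g) for r in range(1, 7) for g in range(1, 7)]
print(len(states), round(bits(states), 2))  # 36 states, 5.17 bits

# Step 4: the friend reveals that the sum exceeds eight.
states = [(r, g) for r, g in states if r + g > 8]
print(len(states), round(bits(states), 2))  # 10 states, 3.32 bits

# Step 5: she reveals that the green die shows a three.
states = [(r, g) for r, g in states if g == 3]
print(len(states), round(bits(states), 2))  # 1 state, 0.0 bits
```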

  33. keiths: Because you’ve been cribbing from a source that you don’t understand.

    I’m sorry, but that answer just doesn’t cut it. You seem to agree with the value I arrived at and the process of reasoning I used to arrive at that value. So all that’s left to dispute is whether the value given is the Shannon entropy.

    For that you need to make an argument.

    You could start by contributing your own code and tests for calculating Shannon entropy in the Dice Entropy thread, or at least explaining there why mine is wrong.

    I understand the requirements well enough to code them. You need to explain why my code does not calculate Shannon entropy.

  34. Mung,

    The entropy (uncertainty) associated with the outcome of the toss of a pair of dice has not changed just because a pair of dice was tossed.

    Right. What reduces the entropy is not that the dice have been tossed, but rather that your friend has given you feedback about the microstate. For example, if she (accurately and honestly) tells you that the number on the red die is greater than the number on the green die, then you have gained information about the actual microstate, because some of the 36 microstates have been ruled out. It is this gain in information — a reduction in the missing information — that constitutes a reduction in the entropy.

    By the way, why not try to teach your concept of entropy from the toss of a single coin or the toss of a single die, or some other scenario where all outcomes are equally likely?

    Um, that’s what I’m doing. When you randomly toss two fair dice, each of the 36 microstates is equally likely. Each has a probability of 1/36.

    Slow down and think about this for a while, Mung.
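
    That “red greater than green” example can be worked out explicitly (again a Python sketch, for illustration only):

```python
import math

# The 36 equally likely (red, green) microstates.
microstates = [(r, g) for r in range(1, 7) for g in range(1, 7)]

# Feedback from the friend: the red die shows more than the green die.
consistent = [(r, g) for (r, g) in microstates if r > g]

print(len(consistent))             # 15 microstates survive
print(math.log2(len(consistent)))  # ≈ 3.91 bits, down from log2(36) ≈ 5.17
```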

  35. keiths: We are dealing with a situation where two microstates — say, (red=5, green=2) and (red=3, green=4) — are distinct even if they sum up to the same value — in this case 7.

    Yes, if the ‘microstates’ were not distinct they would not be distinct “microstates”. And the ‘microstates’ that do sum up to the same value are not the same, just as in actual thermodynamics not all microstates are equally probable.

    You need to find a reason to reject the probability distribution I have chosen based upon the information I have. For example, I know that 5+2=7 and I know that 3+4=7, and I know that there are four other combinations of a pair of dice that can sum to 7: (1,6), (6,1), (2,5), (4,3).

    By the way, my version of the entropy of a pair of dice isn’t observer-dependent. Is that why you object to it so much?

  36. keiths: Slow down and think about this for a while, Mung.

    LoL. What you are telling me is that I need to recalculate the entropy when someone else tells me what they observed.

    Slow down and think about this for a while, Keith.

    Was my initial calculation wrong? Was the observation I used to calculate the entropy wrong? Was Xavier wrong? Was Yolanda wrong? Was Damon wrong?

    None of them were wrong.

  37. Mung,

    Entropy is a measure of missing information, so of course it changes when you gain relevant information. The missing information is reduced, and therefore so is entropy.

  38. Mung,

    Seriously keiths, do you not see the irony?

    Think, Mung. The fact that entropy is observer-dependent does not mean that it cannot be miscalculated.

  39. keiths: Entropy is a measure of missing information, so of course it changes when you gain relevant information. The missing information is reduced, and therefore so is entropy.

    So not all sums of a pair of dice are equally probable, and not everyone knows this. But the sums of a pair of dice are available to all (objective), and the fact that the sums are not all equally probable is apparent to anyone who has played the game of craps (also objective).

    So why is someone who does not know that all sums of a pair of dice are not equally probable wrong when they calculate the entropy?

    My initial calculation of the entropy was based on my lack of information regarding whether the values could be summed. Silly me. Was I wrong?

    That the values can be summed hardly seems to be observer-dependent, and what those sums are hardly seems to be observer-dependent. Nor is the probability distribution observer-dependent.

    I gained more information, and my calculation of the entropy was reduced.

    So why was my new calculation of the entropy “wrong” when my initial calculation of the entropy was “right”?

    Xavier, Yolanda, Mung, Damon.

  40. keiths: Think, Mung. The fact that entropy is observer-dependent does not mean that it cannot be miscalculated.

    Yolanda, Xavier and Damon all managed to be “not wrong” when they calculated entropy. You need to present an argument for why my calculation is wrong.

    What you’re quoting there — the 3.27 bits/symbol number — is the average number of bits per symbol for a code where each symbol represents the sum of two fair dice thrown together.

    So? It’s based on the available information. How is it a miscalculation?
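
    For reference, the 3.27 bits/symbol figure falls out of the sum distribution directly (a Python sketch; the language is assumed, not taken from the thread):

```python
import math
from collections import Counter

# Probability distribution over the SUMS of two fair dice (values 2..12).
counts = Counter(r + g for r in range(1, 7) for g in range(1, 7))

# Shannon entropy of that (non-uniform) distribution.
H = -sum((c / 36) * math.log2(c / 36) for c in counts.values())
print(round(H, 2))  # 3.27 bits/symbol
```

    The uniform distribution over the 36 microstates, by contrast, gives log2(36) ≈ 5.17 bits.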

  41. Mung,

    There are 36 possible microstates. Each microstate is an ordered pair of two numbers, r and g, where r is the number on the red die and g is the number on the green die.

    (3,4) is a microstate. 7 — the sum of r and g — is not a microstate.

    To calculate the entropy, you need an epistemic probability distribution over the microstates. Instead, you are using an epistemic probability distribution over the sums.

    You are solving the wrong problem.

  42. keiths: (3,4) is a microstate. 7 — the sum of r and g — is not a microstate.

    I never claimed that the sum is a microstate.

    There are 36 possible ‘microstates’. All 36 possible ‘microstates’ are equally probable. But the sums of the values are not equally probable. Every craps player knows this.

    Given that we can calculate the sums, we can also determine the probability distribution. Given that we can determine the probability distribution, we can calculate the entropy.

    This leaves you off somewhere in no man’s land.

    My position is objective, and is not “observer-dependent.”

  43. Mung,

    If you want to calculate the entropy, you need an epistemic probability distribution over the microstates, not over the sums.

    If you can’t understand macrostates, microstates, and the roles that they play, you are never going to get this.

  44. You yourself have admitted that the “missing information” interpretation is correct.

    But not to the exclusion of other definitions, as you’re doing. In fact, it is highly important that entropy can be defined in disparate ways. There are some classic analogies of disparate definitions of the same thing.

    i.e.

    Energy can be equated with mass times the speed of light squared:

    E= mc^2

    or optics related to electricity and magnetism through Maxwell’s Equations:

    https://en.wikipedia.org/wiki/Maxwell%27s_equations

    Or entropy defined in terms of heat and temperature (Clausius) related to the motion of molecules and the possible microstates they can be found in (Boltzmann/Gibbs):

    delta-S = Integral(dQ/T) = kB (ln W_final - ln W_initial)

    The above relation has a lot of physical significance, but tacking on Shannon:

    delta-S = Integral(dQ/T) = kB (ln W_final - ln W_initial)

    I = delta-S / (kB ln 2) = (ln W_final - ln W_initial) / ln(2)

    is a superfluous transformation to a different log base plus division by a constant. It is little more than a conversion factor applied to Boltzmann’s original equation.
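
    The “conversion factor” point can be checked numerically (a Python sketch; the function names are illustrative):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def delta_S(W_ratio):
    """Boltzmann: delta-S = kB * ln(W_final / W_initial), in J/K."""
    return k_B * math.log(W_ratio)

def bits(dS):
    """The same quantity divided by kB*ln(2), i.e. re-expressed in bits."""
    return dS / (k_B * math.log(2))

# Doubling the number of accessible microstates:
print(bits(delta_S(2.0)))  # 1.0 -- the kB's cancel; only the log base changed
```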

    Worse, your lack of responses has demonstrated that you can’t even calculate “missing information” most of the time without resorting to Clausius:

    dS = dQ/T

    despite your boasting that science has moved on. Indeed science has moved on, so it is all the more amazing that the Clausius definition is still essential to computing the Boltzmann and Shannon microstate count W in most cases.

    You’ve yet to show you can compute the Shannon entropy for a simple system like a melting ice cube. That’s because your approach has next to no practical utility, like I said. If it did, you’d have provided a worked-out example using your approach rather than repeatedly embarrassing yourself with an inability to calculate “missing information” without first resorting to the Clausius definition, which has no regard for your definition of entropy in terms of “missing information”.

    Go ahead: show a practical computation of the entropy of a melting ice cube, a 1555 gram block of copper, the standard molar entropies of common chemical substances, the Gibbs free energy, the absolute entropy of a gas, etc. using your “missing information” approach. Many of these I worked out through energy dispersal, which is formally:

    dS = dQ/T

    which is more than I can say for what you and Mung and DNA_jock have done in this thread after almost 1,500 comments and counting.
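
    For concreteness, the kind of Clausius-first calculation being described can be sketched as follows (Python; the 18 g mass, 334 J/g latent heat, and 273.15 K melting point are assumed textbook values, not figures from the thread):

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K

# Assumed textbook values (not from the thread):
mass_g = 18.0        # an 18 g ice cube
L_fusion = 334.0     # latent heat of fusion of water, J/g
T = 273.15           # melting point, K

# Clausius: melting is isothermal, so delta-S = Integral(dQ/T) = Q/T.
Q = mass_g * L_fusion             # heat absorbed, J
dS = Q / T                        # ≈ 22.0 J/K

# The same quantity re-expressed as "missing information" in bits:
bits = dS / (k_B * math.log(2))   # ≈ 2.3e24 bits

print(round(dS, 1))
print(f"{bits:.1e}")
```

    Note that the Clausius step (Q/T) does all the physical work here; the conversion to bits is just the constant factor described above.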
