In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system, entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue in his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (a small shoe size suggests we’re dealing with a toddler, and therefore a non-reader), but shoe size is not reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger and Moran’s books aren’t on that list. Their books, however, did make a list of biochemistry books judged by some Australian researchers to be decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates

Also there is Clausius:

ΔS = Integral (dq_rev / T)

where
ΔS = change in entropy
dq_rev = inexact differential of heat transferred reversibly
T = absolute temperature
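
For readers who want to see numbers attached to these formulas, here is a minimal sketch in Python. The function names and the example values (1 g of ice, latent heat of fusion of roughly 334 J/g) are mine, chosen only for illustration:

    import math

    k_B = 1.380649e-23  # Boltzmann's constant, J/K

    def boltzmann_entropy(W):
        # S = k log W for a system with W equally probable microstates
        return k_B * math.log(W)

    def clausius_entropy_change(q_rev, T):
        # delta-S = Integral(dq_rev/T); for heat q_rev absorbed reversibly
        # at a constant absolute temperature T, the integral reduces to q_rev/T
        return q_rev / T

    # Illustration: melting 1 g of ice at 273.15 K, taking the latent heat
    # of fusion as roughly 334 J/g.
    print(clausius_entropy_change(334.0, 273.15))  # about 1.22 J/K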

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. I also liked Larry Moran’s remark. It made clear from the outset of this thread that this dispute is 1) old news, and 2) not terribly important.

  2. And now some more humiliation for DNA_Jock:

    ΔH = L_f = L_effective = L_f(Tm) – Integral [(c_w – c_i) dT]

    So what did DNA_jock have to say about ΔH?

    ΔH will vary linearly with T

    DNA_Jock

    If that were true, as DNA_Jock claims, then the graph of L_effective (which is the theoretical value of ΔH from the Kirchhoff relations of thermochemistry) would be a straight line.

    I told DNA_Jock from the very first that this was a ridiculous claim, since ΔH vs. T won’t be a straight line. So now I invite readers to examine for themselves whether the L_effective curve (which is the theoretical ΔH curve) is in fact a straight line. I took the liberty of drawing a straight solid blue line to contrast it with the curved solid black line of L_effective (the theoretical ΔH from the Kirchhoff relations). Is the L_effective curve a straight line? I think not.

    Of course the painful way to demonstrate this is to take the first derivative with respect to T and see if we get a constant. Not likely.

    ΔH = L_f = L_effective = L_f(Tm) – Integral [(c_w – c_i) dT]

    But we don’t have to go through that agony, since a picture is worth a thousand words and we can clearly see that the L_effective curve is not a straight line; therefore DNA_Jock’s claim that “ΔH will vary linearly with T” is false.

    Talk about bringing a gun to a knife fight and then shooting yourself in the head, DNA_Jock. Too funny. Ah, the gift that keeps on giving. Hehehe!
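
    For anyone who wants to run the numbers themselves, here is a rough sketch of how the integral above can be evaluated numerically. The heat-capacity inputs are supplied by the caller; the constant values in the example are round illustrative figures, not the measured, temperature-dependent data this argument is actually about:

        def l_effective(T, L_f_at_Tm, c_w, c_i, Tm=273.15, steps=1000):
            # L_effective(T) = L_f(Tm) - Integral from T to Tm of (c_w - c_i) dT',
            # evaluated with a simple trapezoidal rule; c_w and c_i are functions
            # of temperature (J per gram per kelvin).
            dT = (Tm - T) / steps
            total = 0.0
            for n in range(steps):
                t0 = T + n * dT
                t1 = t0 + dT
                total += 0.5 * ((c_w(t0) - c_i(t0)) + (c_w(t1) - c_i(t1))) * dT
            return L_f_at_Tm - total

        # Illustrative run with constant heat capacities (an assumption):
        # with constant c_w and c_i the integral is linear in T, so any curvature
        # in L_effective(T) has to come from their temperature dependence.
        print(l_effective(T=250.0, L_f_at_Tm=334.0,
                          c_w=lambda t: 4.18, c_i=lambda t: 2.09))
        # roughly 334 - (4.18 - 2.09) * 23.15 = 285.6 J/g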

  3. stcordova: And now some more humiliation for DNA_Jock:

    ΔH = L_f = L_effective = L_f(Tm) – Integral [(c_w – c_i) dT]

    So what did DNA_jock have to say about ΔH?

    ΔH will vary linearly with T

    DNA_Jock

    If that were true, as DNA_Jock claims, then the graph of L_effective (which is the theoretical value of ΔH from the Kirchhoff relations of thermochemistry) would be a straight line.

    Oh Sal, I never made any such claim. I wrote:

    Cool calculation, bro.
    But what if the temperature is, say, 250 K ?
    (This is an important question in meteorology.)
    You could
    A) assume the entropy change is temperature-independent, thus the ΔH will vary linearly with T.
    Or
    B) you could apply Kirchhoff’s Law of Thermochemistry. (that is: ∂ΔH/∂T = ΔCp, the difference in heat capacities, FYI)

    to which you responded:

    The ΔH in question is dispersal of energy from the surroundings to melt the ice, and that is different from the ΔH of a non-existent chemical reaction involving reactants and products that are used as inputs to Kirchhoff’s laws.

    Each and every time that you state this blatant falsehood, it makes you look bad.

  4. stcordova: And now some more humiliation for DNA_Jock:…
    …Talk about bringing a gun to knife fight and then shooting yourself in the head DNA_Jock. Too funny. Ah, the gift that keeps on giving. Hehehe!

    Come on, Sal. DNA_Jock strikes me as taking a generally reasonable approach with you. I’m sure communication would improve if you could resist succumbing to these little outbursts of hubris.

  5. stcordova: The problem with entropy is that unlike other physical properties like length, volume, velocity, mass (weight), acceleration, temperature, etc. — there is nothing in our everyday experience to familiarize it to our intuitions.

    I disagree with Salvador. If you adopt the information theory approach there is something in our everyday experience to familiarize it [entropy] to our intuitions.

  6. stcordova: But even though Boltzmann is the most comprehensive and arguably the most important in the scheme of things (with its definition of entropy in terms of microstates of a collection of molecules), from a practical standpoint almost no one goes around counting the number of microstates in a system.

    I disagree with Salvador. I think he takes an overly simplistic view of counting in this context. It’s not something we do on our fingers.

  7. stcordova: If that’s the case for Keiths, then imho, the “missing information” model is largely superfluous and not fit for teaching.

    I disagree with Salvador. I’ve posted a number of texts in this thread that use the information theory approach. It’s been in use for decades.

  8. Sal,

    As metaphors go, there is never a “right” answer, only a question of which ones help us conceive of things best. Lambert says “energy dispersal”. Lord Kelvin himself used the word “dissipation”. Dr. Mike says “spreading around of energy” to describe the 2nd law. Keiths wants to define it as “missing information”.

    Those aren’t metaphors. The people who take the “disorder” view believe that entropy is an actual measure of actual disorder. The people who take the “energy dispersal” view, including Lambert and “Dr. Mike”, believe that entropy is an actual measure of the actual dispersal of actual energy. The people who take the “missing information” view, including me, hold entropy to be an actual measure of the actual information gap between the macrostate and the microstate.

    They aren’t metaphors; they are characterizations of entropy.

    The first two characterizations are incorrect. Entropy is not a measure of disorder, and it is not a measure of energy dispersal. It is a measure of missing information.

    You rejected the “entropy as disorder” view in your OP, using words like “Gag!” and “Choke!”, and urged us to embrace the “energy dispersal” view instead, not realizing that the latter was mistaken.

    In other words, you unwittingly urged us to replace one misconception with another. Don’t you see the irony, particularly when there is a third alternative — the missing information view — that is actually correct?

  9. Sal,

    Entropy, as far as practice is concerned, is expected to be an objective property of a system much like length, volume, temperature, etc.

    Entropy is an objective property of the macrostate, but the macrostate itself is observer-dependent.

    To the extent that their macrostates match, people will agree on the entropy value. To the extent that their macrostates mismatch, they will disagree.

    When scientists agree on the macrostate, as they do when they compile a table of standard molar entropies, for instance, they will get the same entropy values.

  10. walto,

    The definition that keiths is so keen that you use simply notes that disorder is best understood as uncertainty as to various parameters of our choice, or, in other words, absence of information that would determine values for all the variables in question.

    No, not at all. Disorder and uncertainty are separate concepts.

    Entropy is a measure of uncertainty (or equivalently, of missing information). It is not a measure of disorder.

  11. walto, today:

    I don’t think it makes sense to call microstates or macrostates true or false, accurate or inaccurate.

    walto, Tuesday:

    There is one and only one correct macrostate.

    You have a lot of confusion to sort through, walto.

  12. 500 coins all heads up on a table is more ordered than 500 coins tossed in the air. I think this is one of those things Sal is trying to teach budding young IDists.

  13. In physics, the terms order and disorder designate the presence or absence of some symmetry or correlation in a many-particle system.

    In condensed matter physics, systems typically are ordered at low temperatures; upon heating, they undergo one or several phase transitions into less ordered states.

    Examples for such an order-disorder transition are:

    the melting of ice: solid-liquid transition

    https://en.wikipedia.org/wiki/Order_and_disorder

    I just had to laugh at that one.

  14. keiths: There is one and only one correct macrostate.

    Based on your last two posts you understand even less than I thought. And that wasn’t much to begin with!

  15. keiths: You have a lot of confusion to sort through, walto.

    Yes, you are extremely confused. But I’m tired of trying to sort through this for you.

  16. Which is it, walto?

    walto, today:

    I don’t think it makes sense to call microstates or macrostates true or false, accurate or inaccurate.

    walto, Tuesday:

    There is one and only one correct macrostate.

  17. Mung: In physics, the terms order and disorder designate the presence or absence of some symmetry or correlation in a many-particle system.

    Right. The absence of “some symmetry or correlation.” Exactly. That’s what would allow for the determination of the exact microstate. Explain this to keiths, will you? I’m too tired.

  18. keiths:
    Which is it, walto?

    walto, today:

    I don’t think it makes sense to call microstates or macrostates true or false, accurate or inaccurate.

    walto, Tuesday:

    There is one and only one correct macrostate.

    That you think there’s a contradiction there is exactly what you’re missing. It’s funny. But as I told both you and mung, I’m too tired to spend any more time with your silliness.

  19. keiths:

    Now, instead of admitting your mistake, you are trying to pretend that you “changed the problem” and that you were right all along.

    Mung:

    This is false.

    I’ve never claimed I’ve been “right all along.”

    Sure you did, just yesterday:

    I got the initial calculation right, and for the same problem I’d still get that result. I changed the problem, which is something I’ve repeatedly told you and you just continue to ignore.

    Mung:

    In fact, what I said was, I made a mistake.

    And then you contradicted yourself, claiming that you were right all along, as I just showed. You’re making stuff up as you go, trying to hide your mistakes.

    Your childishness is tiring and a waste of time, Mung. You made a mistake. I pointed it out. Deal with it and move on.

  20. Mung, quoting Wikipedia:

    In physics, the terms order and disorder designate the presence or absence of some symmetry or correlation in a many-particle system.

    walto:

    Right. The absence of “some symmetry or correlation.” Exactly. That’s what would allow for the determination of the exact microstate. Explain this to keiths, will you? I’m too tired.

    No, walto. The presence or absence of symmetry or correlation is not what allows for the determination of the exact microstate.

    Flip 500 distinguishable coins randomly, without looking at the result. The entropy — the amount of additional information needed to determine the exact microstate — is the same regardless of how the coins land. There are 2^500 possible microstates, each with the same epistemic probability. The entropy is therefore

    S = log2 2^500 = 500 bits

    …and that remains true whether the microstate is highly symmetric and correlated — say, all heads — or a jumbled and disordered mix of heads and tails.
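
    For anyone who wants to check the arithmetic, here is a short sketch; the brute-force enumeration is only a sanity check and is feasible only for small n:

        import math
        from itertools import product

        def coin_entropy_bits(n_coins):
            # With no information about the outcome, all 2^n microstates are
            # equally probable, so S = log2(2^n) = n bits.
            return math.log2(2 ** n_coins)

        # Brute-force check for a small n: enumerate every microstate explicitly.
        n = 10
        assert len(list(product("HT", repeat=n))) == 2 ** n
        print(coin_entropy_bits(n))    # 10.0
        print(coin_entropy_bits(500))  # 500.0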

  21. walto, Sal, Mung:

    Here’s why entropy is observer-dependent:

    1. Entropy is a function of the macrostate.

    2. The macrostate depends on what the observer knows about the exact microstate.

    3. Different observers can possess different information about the exact microstate.

    4. Therefore, macrostates are observer-dependent. Two observers of the same system can have different macrostates.

    5. Since entropy depends on the macrostate, it is also observer-dependent.

    If you disagree, then let’s hear your analysis of my earlier scenario involving a pair of dice:

    A blindfolded walto randomly throws a pair of fair dice, one red and one green. The exact microstate is some ordered pair (r, g), where r is the number on the red die and g is the number on the green.

    Like walto, I am blindfolded. Neither of us has any information about the result of the dice throw.

    An honest, accurate witness whispers to walto that the sum of the dice is greater than eight. To me she whispers that the sum of the dice is greater than ten.

    For walto, the ten possible microstates are
    (3,6)
    (4,5), (4,6)
    (5,4), (5,5), (5,6)
    (6,3), (6,4), (6,5), (6,6)

    For me, the three possible microstates are
    (5,6)
    (6,5), (6,6)

    We have different macrostates, but both are correct. It is true that the sum is greater than eight, and it is also true that the sum is greater than ten.

    Based on his correct macrostate, walto computes an entropy of

    S = log2 10 ≈ 3.32 bits

    Based on my correct macrostate, I compute an entropy of

    S = log2 3 ≈ 1.58 bits

    Both of us are correct. The macrostates are observer-dependent, and so is the entropy.
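
    For anyone who wants to verify those two values, here is a short sketch that enumerates the microstates consistent with each whispered macrostate; the function name is mine, for illustration only:

        import math
        from itertools import product

        def dice_entropy_bits(condition):
            # Count the ordered pairs (red, green) consistent with what the
            # observer knows, treat them as equally probable, and take log2.
            consistent = [(r, g) for r, g in product(range(1, 7), repeat=2)
                          if condition(r, g)]
            return len(consistent), math.log2(len(consistent))

        print(dice_entropy_bits(lambda r, g: r + g > 8))   # (10, ~3.32 bits) -- walto
        print(dice_entropy_bits(lambda r, g: r + g > 10))  # (3, ~1.58 bits)  -- keiths
        print(dice_entropy_bits(lambda r, g: True))        # (36, ~5.17 bits) -- no information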

  22. Probability theory as logic shows how two persons, given the same information, may have their opinions driven in opposite directions by it, and what must be done to avoid this.

    – E.T. Jaynes

  23. An argument which makes it clear intuitively why a result is correct is actually more trustworthy, and more likely of a permanent place in science, than is one that makes a great overt show of mathematical rigor unaccompanied by understanding.

    – E.T. Jaynes

  24. Mung:
    An argument which makes it clear intuitively why a result is correct is actually more trustworthy, and more likely of a permanent place in science, than is one that makes a great overt show of mathematical rigor unaccompanied by understanding.

    – E.T. Jaynes

    Indeed. Math is just modelling.

  25. keiths: If you disagree, then let’s hear your analysis of my earlier scenario involving a pair of dice:

    I’d love to answer your questions. I’d also love for you to answer mine. What do you say? Seems fair to me.

  26. keiths: An honest, accurate witness whispers to walto that the sum of the dice is greater than eight. To me she whispers that the sum of the dice is greater than ten.

    What is her macrostate? Is it also the correct macrostate?

    ETA: It seems to me that in this scenario we only have one observer. What exactly are you and walto observing?

  27. Macrostate : Microstates : Haw Entropy

    (1,1) : (1,1) : 0
    (1,2) : (1,2) : 0
    (1,3) : (1,3) : 0
    (1,4) : (1,4) : 0
    (1,5) : (1,5) : 0
    (1,6) : (1,6) : 0
    (2,1) : (2,1) : 0
    (2,2) : (2,2) : 0
    (2,3) : (2,3) : 0
    (2,4) : (2,4) : 0
    (2,5) : (2,5) : 0
    (2,6) : (2,6) : 0
    (3,1) : (3,1) : 0
    (3,2) : (3,2) : 0
    (3,3) : (3,3) : 0
    (3,4) : (3,4) : 0
    (3,5) : (3,5) : 0
    (3,6) : (3,6) : 0
    (4,1) : (4,1) : 0
    (4,2) : (4,2) : 0
    (4,3) : (4,3) : 0
    (4,4) : (4,4) : 0
    (4,5) : (4,5) : 0
    (4,6) : (4,6) : 0
    (5,1) : (5,1) : 0
    (5,2) : (5,2) : 0
    (5,3) : (5,3) : 0
    (5,4) : (5,4) : 0
    (5,5) : (5,5) : 0
    (5,6) : (5,6) : 0
    (6,1) : (6,1) : 0
    (6,2) : (6,2) : 0
    (6,3) : (6,3) : 0
    (6,4) : (6,4) : 0
    (6,5) : (6,5) : 0
    (6,6) : (6,6) : 0

    Average HawEntropy = 0

    There’s something wrong with this picture.

    What would the average entropy be for keiths and walto?

  28. Mung,

    I’d love to answer your questions. I’d also love for you to answer mine. What do you say? Seems fair to me.

    The journey needs to be guided by someone who knows the territory, not someone who is lost. You are confused about entropy, and I’m trying to lead you to understanding.

    When you ask questions that I judge to be useful and relevant, I’ll answer them. If you challenge my position with an actual argument, I’ll respond. Otherwise, no, I’m not interested in answering every question you raise. There’s a reason the teacher sets the syllabus, not the student.

  29. keiths:

    An honest, accurate witness whispers to walto that the sum of the dice is greater than eight. To me she whispers that the sum of the dice is greater than ten.

    Mung:

    What is her macrostate?

    She knows the exact microstate, whatever it happens to be.

    Is it also the correct macrostate?

    There is no such thing as “the correct macrostate”. That’s walto’s mistake.

    Her macrostate is a correct macrostate, not the correct macrostate.

    Since she knows the exact microstate, only one microstate is epistemically possible for her. She sees an entropy of

    S = log2 1 = 0 bits

    ETA: It seems to me that in this scenario we only have one observer. What exactly are you and walto observing?

    Walto and I are observing the system. It’s just that we’re observing it indirectly via the witness. From our perspective, she is an indirect source of information about the system, just as a thermometer might be an indirect source of information in a thermodynamic scenario.

    She is also an observer in her own right, of course.

  30. keiths:

    What is her macrostate?

    She knows the exact microstate, whatever it happens to be.

    There is no such thing as “the correct macrostate”. That’s walto’s mistake.

    Her macrostate is a correct macrostate, not the correct macrostate.

    Since she knows the exact microstate, only one microstate is epistemically possible for her. She sees an entropy of

    S = log2 1 = 0 bits

    Walto and I are observing the system. It’s just that we’re observing it indirectly via the witness. From our perspective, she is an indirect source of information about the system, just as a thermometer might be an indirect source of information in a thermodynamic scenario.

    She is also an observer in her own right, of course.

    What a load of utter bullshit, oh wise guru. Keep leading him!

  31. walto,

    Your position makes no sense. If the “one correct macrostate” were the one corresponding to a single microstate — the actual microstate of the system — then the entropy would always be zero.

  32. keiths: If the “one correct macrostate” were the one corresponding to a single microstate — the actual microstate of the system — then the entropy would always be zero.

    And so it is.

    My answer is the same as walto’s. The correct macrostate is the one observed by your “honest, accurate witness.” After all, she must have some basis in fact for telling walto that the sum is greater than eight and telling you that the sum is greater than ten, and this can only come from knowing the precise macrostate.

  33. So according to you, entropy is always zero, rendering it useless.

    That’s a hint that there’s something very wrong with your understanding of entropy.

  34. Entropy is a measure of missing information, remember? If she’s got all of it, there’s none missing.

  35. walto,

    Entropy is a measure of missing information, remember? If she’s got all of it, there’s none missing.

    Right. She isn’t missing any information about the microstate, so the entropy is zero for her. She knows the exact microstate, so

    S = log2 (1) = 0 bits

    I’m missing some information about the microstate. I know that the sum is greater than ten, so I can narrow things down to three possible microstates, but I don’t know which of the three is the actual microstate. I see an entropy of

    S = log2 3 ≈ 1.58 bits

    You are missing the most information about the microstate. You know only that the sum of the dice is greater than eight. So for you, the entropy is higher than for me — you can only narrow things down to ten possible microstates. You see an entropy of

    S = log2 10 ≈ 3.32 bits

    All three macrostates are correct, and all three entropy values are correct. They differ because each of us is missing a different amount of information about the exact microstate.

    Macrostates are observer-dependent because the missing information can differ from observer to observer. Entropy is a function of the macrostate, so it too is observer-dependent.

    Your mistake is in thinking that there is “one correct macrostate”. There isn’t.

    The witness knows the exact microstate, and she’s right about that. I know that the sum is greater than ten, and I’m right about that. You know that the sum is greater than eight, and you’re right about that. All three macrostates are correct, and so are the corresponding entropy values.

    Macrostates are observer-dependent, and therefore so is entropy.

  36. One problem is that no one knows the precise one and only correct microstate from the macrostate, not even an honest accurate witness.

    How does Ms. Haw know all the possible macrostates, given that she only knows the exact microstate? Who told her that dice sums were legitimate macrostates? Why are dice sums the only legitimate macrostates?

  37. Mung,

    One problem is that no one knows the precise one and only correct microstate from the macrostate, not even an honest accurate witness.

    Sure they do. Any normally-sighted person can discern the microstate in this case. All you have to do is read the numbers off the dice and distinguish red from green. Any person capable of doing so can be a Damon in the world of dice entropy, in other words. If I removed my blindfold I could do so as well, and so could walto, assuming that he is capable of reading dice numbers and distinguishing red from green.

    How does Ms. Haw know all the possible macrostates, given that she only knows the exact microstate?

    She doesn’t need to know all the possible macrostates, and in any case I am assuming that she’s smarter than you.

    Who told her that dice sums were legitimate macrostates?

    Again, I’m assuming that she’s smarter than you and was able to figure that out on her own.

    Why are dice sums the only legitimate macrostates?

    They aren’t, as I’ve already explained.

  38. keiths:

    Your mistake is in thinking that there is “one correct macrostate”. There isn’t.

    walto:

    Yes, there is.

    If you believe that, then you don’t grasp the concept of a macrostate.

    To understand entropy, you need to understand macrostates.

  39. walto: To understand entropy, you need to understand knowledge.

    To understand knowledge you need to understand truth. 😉
