In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.

entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue in his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder, just as shoe size is correlated with reading ability (a small shoe size suggests a toddler, and therefore a non-reader) without being the same thing as reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder, but Styer’s criticism is a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or one of the founders of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So, in slight defense of Granville Sewell and numerous other creationists who equate entropy with disorder, from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus: when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and, unfortunately, one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger’s and Moran’s books aren’t on that list. Their books did, however, make a list of biochemistry texts judged by some Australian researchers to be decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates consistent with the macrostate
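
Here is a minimal Python sketch (my own illustration, assuming equally probable microstates and an idealized free expansion in which doubling the volume doubles the number of positions available to each of N independent particles; the numbers are purely illustrative) of how that formula is used:

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W: float) -> float:
    """S = k ln W, where W is the number of microstates consistent with the macrostate."""
    return k_B * math.log(W)

print(boltzmann_entropy(1))  # 0.0: a macrostate with exactly one microstate has zero entropy

# Idealized free expansion: if each of N independent particles has twice as many
# available positions after the volume doubles, W is multiplied by 2**N, so
# delta-S = N * k_B * ln(2). (Computed directly, since 2**N is astronomically large.)
N = 6.022e23  # one mole of particles, an illustrative choice
delta_S = N * k_B * math.log(2)
print(f"delta-S for doubling the volume: {delta_S:.2f} J/K")  # about 5.76 J/K
```

Notice that nothing in that computation refers to disorder; only a count of microstates appears.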

Also there is the Clausius definition:

delta-S = Integral (dq_rev / T)

where
delta-S = change in entropy
dq_rev = inexact differential of heat (q) exchanged along a reversible path
T = absolute temperature
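
And a similar sketch for the Clausius form, assuming one mole of a monatomic ideal gas heated reversibly at constant volume (so dq_rev = n·Cv·dT); it evaluates the integral numerically and compares it with the standard closed form n·Cv·ln(T2/T1):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def clausius_delta_S(n_mol, C_v, T1, T2, steps=100_000):
    """Numerically evaluate delta-S = Integral(dq_rev / T) for reversible
    constant-volume heating, where dq_rev = n_mol * C_v * dT."""
    dT = (T2 - T1) / steps
    total, T = 0.0, T1
    for _ in range(steps):
        total += n_mol * C_v * dT / (T + dT / 2)  # midpoint rule
        T += dT
    return total

n, C_v = 1.0, 1.5 * R          # one mole of a monatomic ideal gas, Cv = 3R/2
numeric = clausius_delta_S(n, C_v, 300.0, 600.0)
analytic = n * C_v * math.log(600.0 / 300.0)
print(f"numeric  = {numeric:.4f} J/K")   # ~8.6443 J/K
print(f"analytic = {analytic:.4f} J/K")  # n*Cv*ln(2), ~8.6443 J/K
```

Again, only heat and temperature appear in the calculation; nothing about order or disorder.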

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. Good grief, walto. That isn’t the only place you did it.

    Take responsibility for what you wrote. You made a mistake. I corrected you. You hate that, and you don’t want to admit your error, so you’re pretending that you meant something different all along.

    It won’t fly, and it’s a waste of time.

    Let me repeat my question from above:

    Sal, walto:

    Just to be absolutely clear: Do you (finally) agree with me that Lambert’s definition of entropy is incorrect, and that entropy is not a measure of energy dispersal?

    If we can agree on that, we can set dispersalism aside and focus on the viability of the MI view.

  2. walto,

    So I have no problem with using epistemic probabilities–the probability of some event occurring given some particular batch of prior information.

    Do you understand that “prior information” is observer-dependent?

    But again, the calculations using epistemic probabilities also seem to me to need to be constrained to be useful for Sal’s or Lambert’s purposes.

    Constrained how? Just plug the probabilities into the Gibbs equation and you’ll get an entropy. What additional constraints are necessary?

    What do you think about the constraint regarding RAM that Jock suggests above?

    It’s irrelevant, for two reasons:

    1. We’ve been conceiving of Damon as a Laplacean demon, which guarantees that he cannot be a part of the universe. He has to be outside, looking in, so any “RAM” constraints imposed by our universe don’t apply to him.

    2. Whether there actually is, or can be, a Damon-like observer is not of concern. What we care about is what such an observer would see if he existed. The answer: zero entropy.

  3. keiths: Constrained how? Just plug the probabilities into the Gibbs equation and you’ll get an entropy. What additional constraints are necessary?

    Removal of a constraint.

  4. Sal,

    Keiths said he can use all the equations with his information theory approach, but I pointed out, many of those equations have terms for ENERGY dispersal embedded in them because much of the world is seen through the lens of energy, and many of our thermodynamic instruments are oriented toward energy measurements of some sort (like thermometers), not measuring microstates or information.

    I keep explaining this to you, Sal. Thermodynamics involves energy, so of course energy factors into some of the equations. That doesn’t change the fact that entropy is not a measure of energy dispersal, for all the reasons I’ve already given.

  5. keiths: Whether there actually is, or can be, a Damon-like observer is not of concern. What we care about is what such an observer would see if he existed. The answer: zero entropy.

    So in the case of the mixing of distinguishable particles, what would this observer see? It seems that you are saying she would see the initial state, the final state, and all states in-between, but in calculating the entropy change, would arrive at the answer that there was no entropy change.

    By the way, I don’t think this demon who can see all is relevant to entropy, which only applies to thermodynamic systems, which are macroscopic. IOW, as soon as you do away with the macroscopic/microscopic distinction you’re no longer talking thermodynamic entropy.

  6. keiths:
    Good grief, walto. That isn’t the only place you did it.

    Take responsibility for what you wrote. You made a mistake. I corrected you. You hate that, and you don’t want to admit your error, so you’re pretending that you meant something different all along.

    It won’t fly, and it’s a waste of time.

    Let me repeat my question from above:

    Sal, walto:


    Just to be absolutely clear: Do you (finally) agree with me that Lambert’s definition of entropy is incorrect, and that entropy is not a measure of energy dispersal?

    If we can agree on that, we can set dispersalism aside and focus on the viability of the MI view.

    You’re funny.

    Annoying, but funny.

  7. keiths: I keep explaining this to you, Sal.

    Salvador is just being silly. He thinks he’s found a way to invalidate what you say and he’s sticking with it, no matter how absurd it is.

    Thermodynamics requires energy, therefore entropy cannot be a measure of the missing information. Q.E.D.

  8. keiths: What constraint?

    In the case of expansion from V to 2V it would be the barrier between the two compartments. Same in the case of mixing. Removal of a constraint.

  9. Mung,

    So in the case of the mixing of distinguishable particles, what would this observer see? It seems that you are saying she would see the initial state, the final state, and all states in-between,

    Yes.

    …but in calculating the entropy change, would arrive at the answer that there was no entropy change.

    Right, because the initial entropy was zero and the final entropy is zero.

    By the way, I don’t think this demon who can see all is relevant to entropy, which only applies to thermodynamic systems, which are macroscopic. IOW, as soon as you do away with the macroscopic/microscopic distinction you’re no longer talking thermodynamic entropy.

    The demon is just the limiting case. There’s a continuum of possible observers with differing amounts of missing information. As the missing information decreases, so does the entropy. When it hits zero (with Damon and his fellow demons), thermodynamics reduces to ordinary physics. Statistical mechanics becomes just plain mechanics, in other words.

  10. walto:

    So I have no problem with using epistemic probabilities–the probability of some event occurring given some particular batch of prior information.

    keiths:

    Do you understand that “prior information” is observer-dependent?

    walto:

    Of course.

    Well, the entropy is a function of the probabilities, and the probabilities are a function of the prior information, and you’ve just acknowledged that the prior information is observer-dependent.

    Connect the dots and you’ll see that entropy is observer-dependent.
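
    A toy calculation with the Gibbs/Shannon formula makes the dot-connecting concrete. This is only a sketch, assuming a hypothetical four-microstate system and three observers with different amounts of prior information, in the spirit of the Xavier/Yolanda/Damon thought experiment:

    ```python
    import math

    k_B = 1.380649e-23  # Boltzmann's constant, J/K

    def gibbs_entropy(probs):
        """S = -k * sum(p_i * ln(p_i)), summed over the microstates an observer
        considers possible, given that observer's prior information."""
        return -k_B * sum(p * math.log(p) for p in probs if p > 0)

    # Hypothetical 4-microstate system; the probabilities below are illustrative.
    observers = {
        "Xavier (least information)":   [0.25, 0.25, 0.25, 0.25],  # all four equally likely
        "Yolanda (more information)":   [0.5, 0.5, 0.0, 0.0],      # two microstates ruled out
        "Damon (knows the microstate)": [1.0, 0.0, 0.0, 0.0],
    }

    for name, p in observers.items():
        print(name, gibbs_entropy(p))  # values in J/K, tiny because this is a 4-state toy
    # The entropy shrinks as the prior information grows, and hits zero for Damon.
    ```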

  11. Sorry for the confusion, walto —
    My demon is Maxwellian, whereas keiths’s is Laplacean.
    I have no opinion whatsoever on the behavior of Laplacean demons.
    Popcorn time…

  12. keiths:
    Well, the entropy is a function of the probabilities, and the probabilities are a function of the prior information, and you’ve just acknowledged that the prior information is observer-dependent.

    Connect the dots and you’ll see that entropy is observer-dependent.

    Oy.

  13. walto,

    But again, the calculations using epistemic probabilities also seem to me to need to be constrained to be useful for Sal’s or Lambert’s purposes.

    keiths:

    Constrained how? Just plug the probabilities into the Gibbs equation and you’ll get an entropy. What additional constraints are necessary?

    Mung:

    Removal of a constraint.

    keiths:

    What constraint?

    Mung:

    In the case of expansion from V to 2V it would be the barrier between the two compartments. Same in the case of mixing. Removal of a constraint.

    You’re confusing two different kinds of constraint. The Gibbs equation can be used both before and after the barrier is removed. The probabilities will be different when the gases are distinguishable, but the same if they are indistinguishable. That’s why entropy changes in the former case but not the latter.
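
    For the record, here is a minimal sketch of the standard ideal-gas numbers, assuming one mole of gas on each side of the barrier at the same temperature and pressure, and using the textbook mixing formula delta-S = -n_total * R * sum(x_i * ln(x_i)) for distinguishable species:

    ```python
    import math

    R = 8.314  # gas constant, J/(mol*K)

    def mixing_entropy(n_total, mole_fractions):
        """Ideal-gas entropy of mixing for distinguishable species:
        delta-S = -n_total * R * sum(x_i * ln(x_i))."""
        return -n_total * R * sum(x * math.log(x) for x in mole_fractions if x > 0)

    # One mole of A and one mole of B, each confined to volume V, barrier removed.
    print(mixing_entropy(2.0, [0.5, 0.5]))  # ~11.53 J/K: entropy increases (distinguishable)

    # If both sides hold the same species, there is only one component (x = 1),
    # and the entropy change on removing the barrier is zero (indistinguishable).
    print(mixing_entropy(2.0, [1.0]))       # 0.0
    ```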

  14. walto:

    Oy.

    That’s not a very persuasive counterargument. Can you actually identify a problem with what I wrote?

  15. : 2LoT Alert!

    The precise statement of the second law is the following:

    In an isolated system (having a fixed energy, volume, and total number of particles), when we remove any internal constraint, say a partition between two compartments, the entropy will either increase or remain unchanged.

    – Ben-Naim (2016)

  16. As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

    My answer: Nowhere!

    Action has some interesting connections to entropy, and entropy has close connections to information…

    -Frank Wilczek

  17. When a system changes from one equilibrium state to another, the path of successive states through which the system passes is called a process.

    – Thermodynamics Demystified, p. 5.

    keiths: …my answer was that the spontaneous mixing of indistinguishable gases isn’t even a process to begin with, because the macrostate doesn’t change.

    And I think your answer is not correct, and that change of the macrostate has nothing to do with it. What say you?

  18. Mung:
    : 2LoT Alert!

    The precise statement of the second law is the following:

    In an isolated system (having a fixed energy, volume, and total number of particles), when we remove any internal constraint, say a partition between two compartments, the entropy will either increase or remain unchanged.

    – Ben-Naim (2016)

    According to Damon, it will always remain unchanged, whether that’s useful for steam engines or not.

  19. Mung: entropy has close connections to information…

    Yes, but I have “close connections” to my cat (whose entropy is not zero, I don’t care who says otherwise).

  20. walto: According to Damon, it will always remain unchanged, whether that’s useful for steam engines or not.

    Damon the Useless Demon?

    To be honest, I still don’t understand how two different observers come to two different conclusions about the temperature, pressure, volume, or number of particles in a thermodynamic system.

    Entropy is a state function. e.g., S(E, V, N). If the volume is doubled, S(E, 2V, N). Unless one or more of the variables are observer-dependent, I fail to see how S is observer-dependent.

    If an observer has incorrect information about the value of one of the variables, I still don’t understand how it follows that S is observer-dependent.

  21. The term “entropy” was coined by Rudolf Clausius in his famous 1865 paper in “Poggendorf’s Annalen.” Clausius wanted to express by a phenomenological term the capacity of a macroscopic system for internal microscopic alteration. He called it the “Verwandlungsgehalt” (changeability content) of matter.

    – Manfred Eigen

    The definition given by Clausius equates the increase of entropy with the “reversible” supply of heat at a given temperature, divided by that temperature.

    – Manfred Eigen

  22. Mung, quoting Ben-Naim:

    In an isolated system (having a fixed energy, volume, and total number of particles), when we remove any internal constraint, say a partition between two compartments, the entropy will either increase or remain unchanged.

    See the “increase or remain unchanged” part? If we’re talking about the mixing of distinguishable gases, it increases. If they’re indistinguishable, it doesn’t change.

    As I said:

    You’re confusing two different kinds of constraint. The Gibbs equation can be used both before and after the barrier is removed. The probabilities will be different when the gases are distinguishable, but the same if they are indistinguishable. That’s why entropy changes in the former case but not the latter.

  23. Temperature is, so to speak, the “intensity” of heat or of the underlying thermal motion, a measure of the average kinetic energy related to a degree of freedom of motion. Entropy is the “extensive” complement which could be expressed in dimensionless units, as long as reference to the amount of matter involved is made.

    – Manfred Eigen

  24. Mung,

    Do you have a more precise statement of the second law?

    Why? I’m not disagreeing with Ben-Naim. I’m pointing out that my statement is perfectly compatible with his:

    You’re confusing two different kinds of constraint. The Gibbs equation can be used both before and after the barrier is removed. The probabilities will be different when the gases are distinguishable, but the same if they are indistinguishable. That’s why entropy changes in the former case but not the latter.

    As he said:

    …when we remove any internal constraint, say a partition between two compartments, the entropy will either increase or remain unchanged.

    [Emphasis added]

  25. Mung:

    When a system changes from one equilibrium state to another, the path of successive states through which the system passes is called a process.

    – Thermodynamics Demystified, p. 5.

    keiths:

    …my answer was that the spontaneous mixing of indistinguishable gases isn’t even a process to begin with, because the macrostate doesn’t change.

    Mung:

    And I think your answer is not correct, and that change of the macrostate has nothing to do with it. What say you?

    You’re getting confused by the lack of a micro- or macro- prefix on the word “state”. The author is talking about macrostates.

    Here’s your clue: In a container full of gas, is there an equilibrium macrostate? Yes. When the system is in that equilibrium macrostate, is there a corresponding equilibrium microstate? No, because the microstate is constantly changing.

    Now read the quote again:

    When a system changes from one equilibrium state to another, the path of successive states through which the system passes is called a process.

  26. Mung,

    To be honest, I still don’t understand how two different observers come to two different conclusions about the temperature, pressure, volume, or number of particles in a thermodynamic system.

    They don’t. Entropy is a function of the macrostate, so if two observers choose the same macrostate, they’ll see the same entropy value. There’s more than one way to choose a macrostate, though, as my Xavier/Yolanda/Damon thought experiment shows.

    Here’s an explanation from earlier in the thread:

    I think you missed this:

    Every macrostate corresponds to an ensemble of microstates. Xavier’s macrostate corresponds to lots of microstates. Yolanda’s corresponds to fewer. Damon’s corresponds to exactly one.

    For Damon the macrostate and the actual microstate are identical. For Xavier and Yolanda they are not.

    Think of it this way: the macrostate is essentially a summary of what you know about the system’s microstate.

    Let’s set thermodynamic entropy aside for the moment and talk about what I’ve been calling the “logical” entropy of a deck of cards.

    For the deck of cards, the microstate is the exact sequence of the cards. Given a deck of cards with the numbers 1 through 5 printed on them, the microstate of the deck might be the sequence 4 1 3 5 2. If the cards happen to be ordered, the microstate is 1 2 3 4 5. If they’re reverse-ordered, it’s 5 4 3 2 1.

    There are 5! possible microstates for the deck — that is, 5 x 4 x 3 x 2 x 1 possible sequences, for a total of 120.

    If the only thing I know about the deck is that it’s been randomly shuffled, then the macrostate is basically that single fact. The macrostate is “randomly shuffled”. The microstate might be any of the 120 possible sequences. We don’t know which one actually obtains.

    In other words, the macrostate “randomly shuffled” corresponds to an ensemble of 120 microstates.

    Now suppose that instead of being randomly shuffled, the deck has been prepared this way:

    1) the odd cards have been separated from the evens;
    2) the two “subdecks” have been separately shuffled; and
    3) the “odd” subdeck has been placed atop the “even” subdeck.

    Let’s call this the “odds before evens” macrostate.

    For this macrostate, the number of possible microstates is no longer 120. Some of the sequences have been ruled out, such as this one: 1 4 5 3 2. It doesn’t satisfy the “odds before evens” criterion. There are only 12 sequences that do.

    In other words, the macrostate “odds before evens” corresponds to an ensemble of 12 microstates.

    Finally, suppose you know that the deck has been prepared by placing the cards in increasing order. The macrostate is “increasing order”, and there is only one possible microstate: 1 2 3 4 5.

    In other words, the macrostate “increasing order” corresponds to an ensemble of just one microstate.

    Extra credit: Think about what happens in all of these scenarios if the cards are indistinguishable; for instance, if the cards all have the number 2 printed on them, with no other distinguishing features.
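
    A brute-force check of those counts, sketched in Python with log base 2 so the “logical” entropy comes out in bits:

    ```python
    import math
    from itertools import permutations

    cards = [1, 2, 3, 4, 5]
    all_decks = list(permutations(cards))  # 5! = 120 possible sequences (microstates)

    macrostates = {
        "randomly shuffled": lambda deck: True,
        "odds before evens": lambda deck: all(c % 2 == 1 for c in deck[:3]),
        "increasing order":  lambda deck: deck == (1, 2, 3, 4, 5),
    }

    for name, fits in macrostates.items():
        ensemble = [deck for deck in all_decks if fits(deck)]
        M = len(ensemble)
        print(f"{name}: {M} microstates, logical entropy = {math.log2(M):.3f} bits")
    # randomly shuffled: 120 microstates, ~6.907 bits
    # odds before evens: 12 microstates,  ~3.585 bits
    # increasing order:  1 microstate,    0.000 bits
    ```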

  27. keiths: I’m not disagreeing with Ben-Naim. I’m pointing out that my statement is perfectly compatible with his:

    Then you agree with his comment about the removal of an internal constraint.

  28. keiths: You’re getting confused by the lack of a micro- or macro- prefix on the word “state”. The author is talking about macrostates.

    So? The author is talking about paths. Are you claiming there can be no path from one macrostate to another macrostate?

    You claimed that “the spontaneous mixing of indistinguishable gases isn’t even a process to begin with.” Your reasoning was, “because the macrostate doesn’t change.”

    My rebuttal was that whether the macrostate changes or not is irrelevant. Whether or not it is a process depends on the path.

    Your only option is to deny that there is any change from one equilibrium state to another equilibrium state.

  29. An entropy can be assigned to any probability distribution.

    – Manfred Eigen

    Oh my, a hyper informationalist!

    On the Shannon sense of “entropy,” sure. But not all probability distributions are relevant to thermodynamics.

  30. keiths: There’s more than one way to choose a macrostate, though, as my Xavier/Yolanda/Damon thought experiment shows.

    Which choice did Damon make?

  31. Mung,

    Then you agree with his [Ben-Naim’s] comment about the removal of an internal constraint.

    Yes. As I said:

    You’re confusing two different kinds of constraint. The Gibbs equation can be used both before and after the barrier is removed. The probabilities will be different when the gases are distinguishable, but the same if they are indistinguishable. That’s why entropy changes in the former case but not the latter.

    [Emphasis added]

  32. keiths: There’s more than one way to choose a macrostate, though, as my Xavier/Yolanda/Damon thought experiment shows.

    How many ways to choose a macrostate are there?

    Let me put this another way. When a thermodynamic system is at equilibrium, how many macrostates exist for each observer to choose between?

  33. keiths:

    You’re getting confused by the lack of a micro- or macro- prefix on the word “state”. The author is talking about macrostates.

    Here’s your clue: In a container full of gas, is there an equilibrium macrostate? Yes. When the system is in that equilibrium macrostate, is there a corresponding equilibrium microstate? No, because the microstate is constantly changing.

    Mung:

    So? The author is talking about paths. Are you claiming there can be no path from one macrostate to another macrostate?

    Of course not. Where do you get these odd ideas?

    You claimed that “the spontaneous mixing of indistinguishable gases isn’t even a process to begin with.” Your reasoning was, “because the macrostate doesn’t change.”

    Yes, and that’s correct. It isn’t a process, and the definition you quoted confirms that. For it to be a process, the equilibrium state — the macrostate — would have to change. It doesn’t.

    My rebuttal was that whether the macrostate changes or not is irrelevant. Whether or not it is a process depends on the path.

    The path in question is a path between macrostates, so of course it matters whether the macrostate changes.

    Your only option is to deny that there is any change from one equilibrium state to another equilibrium state.

    In the case of indistinguishable gases, there is no change from one equilibrium state to another. No path, in other words. The macrostate remains unchanged. No process, just a thermodynamically static system.

    When the gases are distinguishable, there is a change from one equilibrium state to another. There is a path, in other words. The macrostate changes. It’s a process.

    Read your quote again, focusing on the words in bold:

    When a system changes from one equilibrium state to another, the path of successive states through which the system passes is called a process.

    – Thermodynamics Demystified, p. 5.

  34. So just another failed attempt at mind-reading. Got it.

    Mung, you were doing so well in this thread. Don’t let it all unravel.

    I can tell that you are confused about constraints by reading what you wrote.

  35. Manfred Eigen:

    An entropy can be assigned to any probability distribution.

    Mung:

    On the Shannon sense of “entropy,” sure. But not all probability distributions are relevant to thermodynamics.

    Eigen is right. Entropy is the superset, and thermodynamic entropy is a subset. It’s just one kind of entropy.

    Thermodynamic entropy came first historically, but that’s not a good reason to grant it exclusive rights to the word “entropy” without the qualifier.

  36. keiths:

    There’s more than one way to choose a macrostate, though, as my Xavier/Yolanda/Damon thought experiment shows.

    Mung:

    Which choice did Damon make?

    He chose to determine the exact microstate. His macrostate has an ensemble of one single microstate.

  37. Mung,

    How many ways to choose a macrostate are there?

    That depends on the system, but in general, lots.

    Let me put this another way. When a thermodynamic system is at equilibrium, how many macrostates exist for each observer to choose between?

    Lots, if you’re talking about a typical system such as the gas-mixing setups we’ve been discussing.

    Think of it this way: Every macrostate is associated with a set of possible microstates. When you establish a macrostate, you’re essentially saying “I know that the actual microstate must be one of these M microstates.” The less information you have about the system, the bigger M is. The more information you have, the smaller M is.

    If you’re Damon the Demon, then M is 1. There’s only one epistemically possible microstate, so the entropy is zero.

  38. keiths: I can tell that you are confused about constraints by reading what you wrote.

    Let me describe to you what took place, and then you can choose to accept or reject my account.

    walto mentioned constraints.

    I thought perhaps walto was confused and offered an alternative reading of what he may have been talking about.

    You asserted I was confused.

    I then, later, posted a quote from Ben-Naim about the second law which mentioned internal constraints.

    You interpreted this as somehow being responsive to some prior claim you had made, and asserted again that I was confused.

    You assumed, incorrectly, that the 2LoT quote from Ben-Naim was intended to be in some way a response to your earlier claims about my alleged confusion.

  39. keiths: Eigen is right. Entropy is the superset, and thermodynamic entropy is a subset. It’s just one kind of entropy.

    Eigen is wrong. SMI is the superset and thermodynamic entropy is a subset of SMI. Thermodynamic entropy is the only kind of entropy.

    Thermodynamic entropy came first historically, but that’s not a good reason to grant it exclusive rights to the word “entropy” without the qualifier.

    Thermodynamic entropy came first historically, and that’s a good reason to grant it exclusive rights to the word “entropy” as it is used in thermodynamics.

  40. Mung:

    Thermodynamic entropy is the only kind of entropy.

    Says the guy who just mentioned Shannon entropy.

  41. keiths: He [Damon] chose to determine the exact microstate. His macrostate has an ensemble of one single microstate.

    I don’t know what you mean when you claim that Damon chose “the exact microstate.”

    Is it the case that his macrostate has an ensemble of one single microstate because of his choice of a microstate?

    If so, why is the same not true of X and Y?

    Or is it the case that his macrostate has an ensemble of one single microstate because he had no choice of a microstate?

    What choice did Damon have?

  42. Mung,

    I don’t know what you mean when you claim that Damon chose “the exact microstate.”

    He didn’t choose the microstate. Read it again, noting the words in bold:

    He chose to determine the exact microstate. His macrostate has an ensemble of one single microstate.
