In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.

entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue in his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (i.e., small shoe size suggests we’re dealing with a toddler, therefore non-reader). Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger and Moran’s books aren’t on that list. Their books did, however, make the list of biochem books judged by some Australians as decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates

Also there is Clausius:

delta-S = Integral (dq_rev / T)

where
delta-S = change in entropy
dq_rev = inexact differential of heat transferred reversibly
T = temperature
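
As a minimal numerical sketch of how these two formulas get used (my own illustration, not from any of the textbooks above; the heat of fusion of ice is a standard handbook value):

```python
import math

kB = 1.380649e-23   # Boltzmann's constant, J/K

# Boltzmann/Planck form: S = k log W
W = 2**100                      # a toy system with 2^100 equally probable microstates
S_boltzmann = kB * math.log(W)  # natural log, ~9.6e-22 J/K
print(S_boltzmann)

# Clausius form for a reversible, isothermal process: delta-S = q_rev / T
# Example: melting one mole of ice at 273.15 K (heat of fusion ~6010 J/mol)
q_rev = 6010.0    # J
T = 273.15        # K
print(q_rev / T)  # ~22.0 J/(mol*K)
```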

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. walto,

    Who is right, the walto who agrees with Jaynes…

    I think there are, as Jaynes says, many different correct values for entropy. This depends on what variables are considered relevant, what is being looked for.

    …or the walto who disagrees with Jaynes?

    They may both be “correct” as far as they go, but one is more ACCURATE than the other; thus, in another sense closer to being “correct.”

    When it comes to thermodynamics, Yolanda is “getting there” as compared with Xavier; Damon may have passed the mark.

    Hint: the walto who agrees with Jaynes is correct.

  2. This book discusses the proper definitions of entropy, the valid interpretation of entropy and some useful applications of the concept of entropy. Unlike many books which apply the concept of entropy to systems for which it is not even defined (such as living systems, black holes and the entire universe), these applications will help the reader to understand the meaning of entropy. It also emphasizes the limitations of the applicability of the concept of entropy and the Second Law of Thermodynamics. As with the previous books by the author, this book aims at a clear and mystery-free presentation of the central concept in thermodynamics the entropy. In this book, the concepts of entropy and the Second Law are presented in a friendly, simple language.

  3. colewd:

    How would you calculate it with the Boltzman equation.

    To the best of my knowledge, the Shomate equation is an empirically based equation, that is to say, it is rooted in Clausius!

    To the best of my knowledge, no one does analysis on this scale (a mole is Avogadro’s number of particles, 6.022 x 10^23) with Boltzmann’s equations; it is accepted on faith that solving the Clausius form gives the same values Boltzmann’s formula would.

    Again, the Boltzmann equation isn’t one used much in practice. It basically showed that if atoms exist, then atoms can explain the phenomena of temperature and heat. That was a landmark result because many scientists of Boltzmann’s day did not believe in atoms, and Boltzmann was a proponent of atomic theory. Boltzmann solved a different set of problems than Clausius.

    Clausius solved problems of steam engines, chemistry, and air conditioners without any statement about atoms. Boltzmann explained how thermodynamics could work if atoms existed; he was answering another set of questions, so to speak, trying to provide a theory of temperature and heat in terms of atoms. That was groundbreaking at the time because many scientists resisted the idea of atoms and did not think heat and temperature could be explained as statistical averages of the motion of atoms.

    So Boltzmann isn’t revered so much because he provided easier ways to calculate entropy, but because he showed entropy is a statistical property of a large number of atoms.

    So, bottom line, I don’t think anyone performs entropy calculations for systems like boiling water in a kettle using Boltzmann.
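
    Since the Shomate equation keeps coming up, here is a minimal sketch of its entropy form as I understand it from the NIST Chemistry WebBook; the coefficients below are hypothetical placeholders, so look up the real ones for whatever substance you care about before trusting the number:

    ```python
    import math

    def shomate_entropy(T, A, B, C, D, E, G):
        """Standard molar entropy in J/(mol*K) from the Shomate curve fit.

        t = T/1000;  S = A*ln(t) + B*t + C*t**2/2 + D*t**3/3 - E/(2*t**2) + G
        The coefficients are purely empirical and valid only over the
        temperature range for which they were fitted.
        """
        t = T / 1000.0
        return A*math.log(t) + B*t + C*t**2/2 + D*t**3/3 - E/(2*t**2) + G

    # Hypothetical placeholder coefficients -- substitute real NIST values.
    A, B, C, D, E, G = 30.0, 8.0, -10.0, 3.0, 0.1, 220.0
    print(shomate_entropy(298.15, A, B, C, D, E, G))
    ```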

  4. colewd,

    To answer your question again about using Boltzmann to calculate entropy, the best example I know of doesn’t even give exact results, namely the value of entropy for an ideal monoatomic gas. The problem is ideal monatomic gases don’t exist in reality!

    I’ve used the approximation many times in this thread based on Boltzmann, namely the Sackur-Tetrode equation:

    https://en.wikipedia.org/wiki/Sackur%E2%80%93Tetrode_equation

    But if one wants exact values, one uses Clausius! The Shomate equation is mostly an empirically based equation using measurements, and then they do a “curve fit” with some polynomial. My understanding is that the fancy-looking Shomate equation is more of a curve fit than some theoretically derived formula.

    As an aside, the liquid state is monstrously difficult to analyze using Boltzmann. Ideal gases are the easiest, then non-ideal gases, then crystal solids. Liquids are mostly intractable by direct analysis.
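
    For the record, here is a minimal sketch of that Sackur-Tetrode calculation for an ideal monatomic gas (my own illustration, using the thermal de Broglie wavelength form rather than the spreadsheet linked elsewhere in this thread):

    ```python
    import math

    kB = 1.380649e-23     # J/K
    h  = 6.62607015e-34   # J*s
    NA = 6.02214076e23    # 1/mol

    def sackur_tetrode(n_mol, V, T, molar_mass_kg):
        """Absolute entropy (J/K) of an ideal monatomic gas: n_mol moles in V (m^3) at T (K)."""
        N = n_mol * NA
        m = molar_mass_kg / NA                          # mass of one atom, kg
        lam = h / math.sqrt(2 * math.pi * m * kB * T)   # thermal de Broglie wavelength, m
        return N * kB * (math.log(V / (N * lam**3)) + 2.5)

    # One mole of helium at 298.15 K and 1 bar (molar volume ~0.02479 m^3)
    print(sackur_tetrode(1.0, 0.02479, 298.15, 4.0026e-3))  # ~126 J/K
    ```

    The result lands close to helium’s tabulated standard molar entropy; real (non-ideal) gases are where the Clausius-style measurements win.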

  5. Mung,

    Which is nonsense, because entropy is a state function, and no one knows the thermodynamic state of a black hole, just as no one knows the state of the entire universe, or the universe when (if) it had a beginning, or of the universe when (if) it ends.

    A second reason to share a pbj 🙂

  6. walto:

    I think it would be better to say that thermodynamic entropy is a subset of Shannon entropy. If you haven’t read Adami’s blog, he’s very good on explaining that thermodynamic entropy just IS a type of missing information.

    I don’t have huge issues about what is what, but the fact that everyone is trying to redefine the meaning of the 2nd law and Boltzmann adds a little murkiness.

    The simplest statement of the 2nd law was by Clausius:

    No process is possible whose sole result is the transfer of heat from a cooler to a hotter body.

    http://web.mit.edu/16.unified/www/FALL/thermodynamics/notes/node37.html

    That’s why cold objects heat up naturally in a hot environment. The only way to do otherwise is to have a refrigerator, but that works because there is a consumption of energy in the process, and hence the process does not have a “sole result” of heat energy going from a cold object to a hot environment — energy must be expended.

    When the 2nd law began to be expanded beyond Clausius’ simple definition, it caused ambiguity as to what entropy really means. When the 2nd law is framed simply, the definition of thermodynamic entropy is framed simply.

    Boltzmann’s original formula:

    S = kB ln W

    conceived of microstates in terms of all the 6-dimensional coordinates of the molecules in a system. The 6 dimensions are composed of the 3 spatial dimensions X, Y, Z in Euclidean space and the 3 momentum dimensions of each particle, P_x, P_y, P_z, in “momentum space”, for a total of 6 dimensions.

    Now Boltzmann got a little lucky because this 6-dimensional space is an idealized Newtonian view of molecules, but it is close enough to the quantum mechanical view for the properties of interest. The modern notion of thermodynamic microstate needs quantum mechanics, but Newtonian (classical) mechanics is often good enough.

    However, when one is talking about head/tails of coins and the entropy associated with this, usually one isn’t thinking of microstates in terms of individual molecules in the coins, but the heads/tails configuration. Shannon entropy allows picking of what the observer deems appropriate as a microstate. Boltzmann thermodynamics restricts the microstates to being the set of X,Y,Z, P_x, P_y, P_z coordinates of the molecules, and modern quantum mechanically based thermodynamics restricts the microstates to quantum microstates (ironically, sometimes actually easier than the Newtonian approach of Boltzmann).

    I gave a discussion of this here a few years ago:

    Shannon Information, Entropy, Uncertainty in Thermodynamics and ID

    This essay is intended to give a short overview of textbook understanding of Shannon Information, Entropy, Uncertainty in thermodynamics and Intelligent Design. Technical corrections are welcome.

    The phrases “Shannon Information”, “Shannon Uncertainty”, “Shannon Entropy” are all the same. The most familiar every day usage of the notion of Shannon Information is in the world of computing and digital communication where the fundamental unit is the bit.

    When someone transmits 1 megabit of information, the sender is sending 1 megabit of Shannon information, the receiver is getting a reduction of 1 megabit of Shannon Uncertainty, and the ensemble of all possible configurations of 1 megabit has 2^(10^6) members (1 million binary degrees of freedom), as described by 1 megabit of Shannon entropy.

    The word “Information” is preferred over “Uncertainty” and “Entropy” even though as can be seen they will yield essentially the same number.

    The probability of any given configuration (microstate) can be converted by the following formula:

    2^-I = P

    conversely

    I = -log2(P)

    Where I is the information measure in bits and P is the probability of finding that particular configuration (microstate).

    Examples:
    The Shannon information of 500 fair coins is 500 bits, and the probability of any given configuration (such as all heads) of the 500 coins is

    2^-I = 1 out of 2^500

    But there are subtleties with the 500 coin example. We usually refer to the Shannon entropy in terms of heads/tails configuration. But there are other symbols we could apply Shannon metrics to.

    For example, we could look at the 4 digit year on a coin and consider the Shannon entropy associated with the year. If we use an a priori assumption of all possible years as equally probable, I calculate the Shannon Entropy as:

    log2 (10^4)= 13.28771 bits

    But this assumption is clearly too crude since we know not all years are equally probable!

    Further we could consider the orientation angle of a coin on a table based on the rounded whole number of degrees relative to some reference point. Thus each coin has 360 possible orientation microstates corresponding to 360 degrees. The number of bits is thus:

    log2(360) = 8.49185 bits

    Notice, the problem here is we could choose even finer resolution of the orientation angle to 0.1 degree and thus get 3600 possible microstates! In that case, the Shannon information of any given orientation is:

    log2(3600) = 11.81378 bits

    If for example I focused on all the possible configurations of the orientations of each coin (rounded to a whole number of degrees) of a system of 500 coins, I’d get:

    log2(360^500) = 500 log2(360) = 4245 bits

    The point of these examples is to show there is no one Shannon Entropy to describe a system. It is dependent on the way we choose to recognize the possible microstates of the system!
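
    A quick sketch of my own that reproduces the numbers above, underscoring that the bit count depends entirely on which microstates we choose to recognize:

    ```python
    from math import log2

    print(log2(10**4))      # ~13.29 bits: 4-digit year, all years assumed equiprobable
    print(log2(360))        # ~8.49 bits: orientation to the nearest degree
    print(log2(3600))       # ~11.81 bits: orientation to the nearest 0.1 degree
    print(500 * log2(360))  # ~4245.9 bits: whole-degree orientations of 500 coins
    print(500 * log2(2))    # 500 bits: heads/tails of 500 coins
    ```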

    If we choose to recognize the energy microstates as the chosen microstates to calculate Shannon entropy, that is the thermodynamic entropy of the system expressed in Shannon bits versus the traditional thermodynamic entropy measure in Joules/Kelvin.

    Consider a system of 500 pure copper pennies. The standard molar thermodynamic entropy of copper is 33.2 Joules/Kelvin/Mol. The number of mols of copper in a pure copper penny is 0.0498, thus the thermodynamic entropy of 500 coins is roughly on the order of:

    S = 500 * 33.2 Joules/Kelvin/Mol * 0.0498 mols = 826.68 J/K

    We can relate thermodynamic entropy S_Shannon expressed in Shannon Bits to the traditionally-expressed thermodynamic entropy number S_Boltzmann in Joules/Kelvin. Simply divide S_Boltzmann by Boltzmann’s constant (kB) and further divide by ln(2) (to adjust from natural log to log base 2 information measures):

    S_Shannon = S_Boltzmann / kB / ln(2) =

    (826.68 J/K) / (1.381 x 10^-23 J/K) / 0.693147 = 8.636 x 10^25 Shannon Bits
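
    And a minimal sketch of that unit conversion, assuming only the molar entropy figure quoted above:

    ```python
    import math

    kB = 1.380649e-23                # Boltzmann's constant, J/K

    S_thermo = 500 * 33.2 * 0.0498   # J/K, ~826.7 J/K for the 500 pennies

    # Convert J/K to Shannon bits: divide by kB, then by ln(2)
    S_bits = S_thermo / kB / math.log(2)
    print(S_bits)                    # ~8.6e25 bits
    ```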

  7. How did you determine U the internal energy?

    The temperature. 🙂

    In an ideal gas the temperature and number of particles determines the internal energy.

    I put the Sackur-Tetrode approximation in an Excel spreadsheet here, which only needs 3 inputs in the orange cells (A1, A2, A3), and they are:

    moles of gas
    Volume
    Temperature

    See:
    http://creationevolutionuniversity.org/public_blogs/skepticalzone/absolute_entropy_helium.xls

    The temperature is used to calculate internal energy according to ideal gas laws:
    https://en.wikipedia.org/wiki/Ideal_gas

    Again, the Clausius version does most of the entropy calculations with heavy reliance on empirical measurements (thermometers and calorimeters), and then further calculations are usually extrapolations from those measurements (like the Shomate equation).

    The Sackur-Tetrode equation is a quantum mechanical improvement of Boltzmann and builds the entropy calculation up from first principles, but even then it won’t give as accurate a result as the Clausius-derived values, since Clausius is driven by actual measurement rather than theory. Clausius will capture aspects of real gases that aren’t ideal.

    The problem using Boltzmann is that exact descriptions, rather than idealized descriptions, of molecular behavior are mathematically intractable. The best way to compute entropy in most cases is to use thermometers and calorimeters rather than giving a theoretical description of Boltzmann microstates.

  8. colewd: A second reason to share a pbj

    I do love me some pbj.

    I recently completed reading Ben-Naim’s The Briefest History of Time: The History of Histories of Time and the Misconstrued Association between Entropy and Time and am on the last chapter of his Information, Entropy, Life and the Universe: What We Know and What We Do Not Know.

    He absolutely rips popular science writing on information, entropy and the second law. You simply would not believe how much nonsense is running around out there.

  9. stcordova: No process is possible whose sole result is the transfer of heat from a cooler to a hotter body.

    So it’s like, not impossible. It’s just that it can’t be the sole result. Whatever that means.

  10. stcordova: [quoting in full his earlier comment above on the 2nd law, Clausius, Boltzmann microstates, Shannon information, and the 500 copper pennies]

    That’s perfect. Great post.

  11. walto: That’s perfect. Great post.

    I disagree. I think it’s full of nonsense and unsupported claims. Most of it is pure dippity-woo.

    I’d love to challenge Salvador, but he has me on ignore. Lucky me 🙂

  12. Sal writes:

    The simplest statement of the 2nd law was by Clausius:

    No process is possible whose sole result is the transfer of heat from a cooler to a hotter body.

    Note that this statement is in regards to two bodies. It says nothing about the thermodynamic system. Is the system an isolated system?

    Then Salvador tells us: “That’s why cold objects heat up naturally in a hot environment.”

    Cold objects heat up naturally in a hot environment because no process is possible whose sole result is the transfer of heat from a cooler to a hotter body.

    Does that really explain why “cold objects heat up naturally in a hot environment”? My answer is no. It doesn’t tell us shit about whether or why the cool object heats up. It only tells us that whatever heat there is in the cool body cannot transfer to a hotter body where that heat transfer is the sole result. And it sure as hell says nothing about transfer of heat from the hotter body to the cooler body.

  13. More Salvador nonsense:

    For example, we could look at the 4 digit year on a coin and consider the Shannon entropy associated with the year. If we use an a priori assumption of all possible years as equally probable, I calculate the Shannon Entropy as:

    log2 (10^4)= 13.28771 bits

    But this assumption is clearly too crude since we know not all years are equally probable!

    Salvador claims he’s given us a calculation of the SMI in which all years are equally probable, but that is just pure nonsense. Salvador fails to include in his calculation years before the year 0000 and years after the year 9999. Or, granting that he intended to limit his distribution, how did he choose that distribution and reject others?

    It’s not just that not all dates imprinted on coins known to exist are equally probable, it’s that we know coins existed before the year 0000.

    The fact is that Salvador has no idea what the probability distribution is for his fake example, but this does not prevent him from making these calculations when he has no idea of the probability distribution. This tells me that he does not really understand the Shannon measure of information, much less how it relates to thermodynamic entropy.

  14. The relationship between Shannon entropy and thermodynamic entropy becomes clear once you understand that “microstate” and “macrostate” are general concepts that apply to any kind of Shannon entropy, not just to thermodynamic entropy.

    Let’s revisit my card deck example:

    Let’s set thermodynamic entropy aside for the moment and talk about what I’ve been calling the “logical” entropy of a deck of cards.

    For the deck of cards, the microstate is the exact sequence of the cards. Given a deck of cards with the numbers 1 through 5 printed on them, the microstate of the deck might be the sequence 4 1 3 5 2. If the cards happen to be ordered, the microstate is 1 2 3 4 5. If they’re reverse-ordered, it’s 5 4 3 2 1.

    There are 5! possible microstates for the deck — that is, 5 x 4 x 3 x 2 x 1 possible sequences, for a total of 120.

    If the only thing I know about the deck is that it’s been randomly shuffled, then the macrostate is basically that single fact. The macrostate is “randomly shuffled”. The microstate might be any of the 120 possible sequences. We don’t know which one actually obtains.

    In other words, the macrostate “randomly shuffled” corresponds to an ensemble of 120 microstates.

    Now suppose that instead of being randomly shuffled, the deck has been prepared this way:

    1) the odd cards have been separated from the evens;
    2) the two “subdecks” have been separately shuffled; and
    3) the “odd” subdeck has been placed atop the “even” subdeck.

    Let’s call this the “odds before evens” macrostate.

    For this macrostate, the number of possible microstates is no longer 120. Some of the sequences have been ruled out, such as this one: 1 4 5 3 2. It doesn’t satisfy the “odds before evens” criterion. There are only 12 sequences that do.

    In other words, the macrostate “odds before evens” corresponds to an ensemble of 12 microstates.

    Finally, suppose you know that the deck has been prepared by placing the cards in increasing order. The macrostate is “increasing order”, and there is only one possible microstate: 1 2 3 4 5.

    In other words, the macrostate “increasing order” corresponds to an ensemble of just one microstate.

    Extra credit: Think about what happens in all of these scenarios if the cards are indistinguishable; for instance, if the cards all have the number 2 printed on them, with no other distinguishing features.
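
    A small sketch (mine, not part of the original example) that just counts the microstates compatible with each macrostate and takes log2 of the count, assuming the compatible microstates are equiprobable:

    ```python
    from itertools import permutations
    from math import log2

    cards = [1, 2, 3, 4, 5]
    microstates = list(permutations(cards))   # all 120 possible orderings of the deck

    def count_and_entropy(is_compatible):
        """Number of microstates compatible with a macrostate, and the entropy in bits."""
        n = sum(1 for m in microstates if is_compatible(m))
        return n, log2(n)

    print(count_and_entropy(lambda m: True))                            # (120, ~6.91)  randomly shuffled
    print(count_and_entropy(lambda m: all(c % 2 == 1 for c in m[:3])))  # (12, ~3.58)   odds before evens
    print(count_and_entropy(lambda m: m == (1, 2, 3, 4, 5)))            # (1, 0.0)      increasing order
    ```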

  15. Keeping the generality of “microstate” and “macrostate” in mind, we can make some observations:

    1. The system is in exactly one microstate at a time.

    2. We often don’t know the exact microstate. Whatever information we do have about it constitutes the macrostate.

    3. The entropy is the gap between the information we actually have — the macrostate — and the information that is required to specify the exact microstate.

    4. A microstate can be thought of as a single point in “microstate space”.

    5. A macrostate can be thought of as a region of microstate space, encompassing all of the microstates that are compatible with what is actually known about the system’s microstate.

    6. A macrostate can be as large as the entire microstate space or as small as a single microstate. It depends on what you know. Entropy is observer-dependent.

    7. If the microstates in question are thermodynamic, then the entropy in question is thermodynamic. Any information that narrows down the possible thermodynamic microstates is thermodynamic information.

    8. If the microstates in question are not thermodynamic, then the entropy in question is not thermodynamic.

    Thermodynamic entropy is just a particular kind of Shannon entropy in which the microstates are thermodynamic.

  16. stcordova,

    Boltzmann’s original formula:

    S = kB ln W

    conceived of microstates in terms of all the 6-dimesional coordinates of the molecules in a system. The 6 dimensions being composed of the 3 spatial dimensions along X, Y, Z in Euclidian space and the 3 momentum dimensions of each particle P_x, P_y, P_z in “momentum space” for a total of 6-dimensions.

    I think this is becoming clear. The momentum space should correlate with temperature. Higher temperature equals higher momentum. Within the same gas, since mass is equal, velocity is the variable. When someone describes entropy in bits they are leaving out key information, which is the momentum of those bits. So information entropy can be expressed in bits, but thermodynamic entropy needs to be expressed in bits (three dimensional Euclidean space) and momentum, the additional 3 dimensions, according to Boltzmann’s model. Walto appears to be right; thermodynamic entropy is a subset of information entropy.

  17. Mung,

    I recently completed reading Ben-Naim’s The Briefest History of Time: The History of Histories of Time and the Misconstrued Association between Entropy and Time and am on the last chapter of his Information, Entropy, Life and the Universe: What We Know and What We Do Not Know.

    He absolutely rips popular science writing on information, entropy and the second law. You simply would not believe how much nonsense is running around out there.

    It’s fascinating how the slight mis-tweaking of a scientific law can send so many talented people into a process of chasing their tails.

    I think you and Salvador need to share a pbj 🙂

  18. keiths,

    1. The system is in exactly one micro state at a time.

    What are the factors that determine that exact micro state and the next micro state that will be measured in the next measurable point in time?

  19. colewd: This sounds reasonable but to measure thermodynamic entropy you need information that is more detailed then just the number of bits. Bekenstein uses bits to calculate the entropy of a black hole.

    Ben-Naim writes that in a private communication Bekenstein “admits that BH entropy is not thermodynamic entropy.”

  20. colewd:

    I think this is becoming clear. The momentum space should correlate with temperature. Higher temperature equals higher momentum.
    Within the same gas since mass is equal so velocity is the variable.

    Yes, higher average momentum per particle. Exactly. However, they usually say higher temperature is higher internal energy. Momentum P is related to energy by

    P^2 / 2m = 1/2 m v ^2 = Kinetic Energy

    because

    momentum = P = mv

    where
    m = mass
    v = velocity

    Although what you said is correct, it is customary to relate temperature to energy rather than momentum.
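
    To make that customary link between temperature, energy and momentum concrete, here is a small sketch of my own using standard kinetic theory, for helium at room temperature:

    ```python
    import math

    kB = 1.380649e-23        # J/K
    NA = 6.02214076e23       # 1/mol

    T = 298.15                              # K
    ke_avg = 1.5 * kB * T                   # average translational KE per particle, ~6.2e-21 J

    m_He = 4.0026e-3 / NA                   # mass of one helium atom, kg
    p_rms = math.sqrt(2 * m_He * ke_avg)    # rms momentum, from KE = p^2 / (2m)
    v_rms = p_rms / m_He                    # ~1360 m/s

    print(ke_avg, p_rms, v_rms)
    ```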

    When someone describes entropy in bits they are leaving out key information which is the momentum of those bits. So information entropy can be expressed in bits but thermodynamic entropy needs to be expressed in bits( three dimensional euclidian space) and momentum the addition 3 dimensions according to Boltzman’s model. Walto appears to be right; thermodynamic entropy is a subset of information entropy.

    Walto is right.

    However, though using the term “information” is not incorrect, I’ve stated my preference to avoid that term for thermodynamics because of the confusion it introduces. “Logarithm of microstates” is far more accurate, which is:

    S = kB ln W (Boltzmann)

    and

    I = log2 (W) (Shannon for equi-probable W)

    where
    W = number of microstates

    Microstates in Boltzmann’s world are the set of 6-dimensional coordinates of all molecules in a system. But as I said, Boltzmann’s original Newtonian views need some amendment in light of quantum mechanics.

    Shannon can work for non-equiprobable W, but that is a whole other can of worms!

    Higher average momentum of the molecules in a system means a larger “momentum space”, hence more 6-dimensional microstates. More microstates translates into a higher number for the “logarithm of microstates”, which some people label as “information”.

    I de-emphasize the term “information” for thermodynamics. It is gratuitous and superfluous, and imho, creates more confusion than clarity.

    The 2nd law of thermodynamics and the thermodynamic entropy are more traditionally understood in terms of temperature and heat, not information.

    One usually can’t even calculate information bits unless one has temperature and heat data, hence information theory for thermodynamics is usually a superfluous and gratuitous add-on confusion factor, imho. That’s why chemistry, physics and engineering students state the entropy in units of J/K not bits! The major place I’ve seen entropy stated in bits is in places like TSZ and UD, not in textbook science literature.

  21. colewd,

    I think this is becoming clear.

    Yet what you’re saying is hopelessly garbled.

    The momentum space should correlate with with temperature. Higher temperature equals higher momentum. Within the same gas since mass is equal so velocity is the variable. When someone describes entropy in bits they are leaving out key information which is the momentum of those bits.

    Category error. Bits do not have momentum. Molecules do.

    So information entropy can be expressed in bits but thermodynamic entropy needs to be expressed in bits( three dimensional euclidian space) and momentum the addition 3 dimensions according to Boltzman’s model.

    No. Thermodynamic entropy, just like the other forms of Shannon entropy, can be expressed in bits. It’s a measure of missing information, not of energy dispersal.

  22. Sal,

    However, though using the term “information” is not incorrect, I’ve stated my preference to avoid that term for thermodynamics because of the confusion it introduces.

    Interesting. Earlier you were referring to it as “deepity woo”. Now you admit that it’s correct. That’s quite a climb-down.

    Also, keep in mind that whether it confuses you is irrelevant. It’s correct, and the energy dispersal view is not.

  23. keiths: Earlier you were referring to it as “deepity woo”. Now you admit that it’s correct. That’s quite a climb-down.

    Consider it a climb up. Next Salvador will be telling us this has been his view for years.

  24. stcordova: Shannon can work for non-equiproable W, but that is a whole nother can of worms!

    The SMI can be calculated for any probability distribution. That’s basic info theory. There’s no “whole nother can of worms” about it.
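
    For the record, the general formula is H = -sum(p_i * log2(p_i)), which reduces to log2(W) when the W outcomes are equiprobable. A minimal sketch:

    ```python
    from math import log2

    def smi(probs):
        """Shannon measure of information (bits) for any discrete distribution."""
        assert abs(sum(probs) - 1.0) < 1e-9
        return -sum(p * log2(p) for p in probs if p > 0)

    print(smi([0.5, 0.5]))    # 1.0 bit    (fair coin)
    print(smi([0.9, 0.1]))    # ~0.47 bits (biased coin: less missing information)
    print(smi([0.25] * 4))    # 2.0 bits   (four equiprobable outcomes)
    ```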

  25. keiths:

    1. The system is in exactly one microstate at a time.

    2. We often don’t know the exact microstate. Whatever information we do have about it constitutes the macrostate.

    colewd:

    What are the factors that determine that exact micro state and the next micro state that will be measured in the next measurable point in time?

    First, microstates are generally not “measured.” If you could know the exact microstate, then the entropy would be zero for you.

    Second, what determines the microstate is the nature of the system. For the gas-mixing cases we’ve been discussing, the microstate is constantly changing and is determined by the laws of physics. For the card deck case, the microstate is static until someone reshuffles the cards. The new microstate is determined by how they reshuffle — by choosing the “odds before evens” method, for example.

  26. Let me elaborate on this:

    First, microstates are generally not “measured.” If you could know the exact microstate, then the entropy would be zero for you.

    The card deck example is interesting because in that context we actually do have Damon-like powers. We can determine the exact microstate simply by looking at the cards to see what order they’re in.

    So if we thoroughly and randomly shuffle the deck without looking to see the order of the cards, it has maximum entropy. All of the microstates are possible, and we don’t know which of them actually obtains.

    By looking to see what order the cards are in, we determine the exact microstate. There is no missing information, so the entropy becomes zero.

    What changed about the deck of cards? Nothing. What changed about us? Our state of knowledge — we observed the order of the cards. Entropy is a measure of missing information, and it’s therefore observer-dependent.

  27. keiths,

    Category error. Bits do not have momentum. Molecules do.

    So information entropy can be expressed in bits but thermodynamic entropy needs to be expressed in bits( three dimensional euclidian space) and momentum the addition 3 dimensions according to Boltzman’s model.

    No. Thermodynamic entropy, just like the other forms of Shannon entropy, can be expressed in bits. It’s a measure of missing information, not of energy dispersal.

    If we can measure the amount of missing information, what is the practical application of that knowledge?

  28. colewd,

    If we can measure the amount of missing information, what is the practical application of that knowledge?

    Thermodynamics!

  29. I’m guessing it’s pretty clear to everyone at this point that the issue being obsessed on in this thread is a quibble.

  30. walto,

    I’m guessing it’s pretty clear to everyone at this point that the issue being obsessed on in this thread is a quibble.

    I’m sure that saying so makes you feel better about the mistakes you’ve made in this thread, but it isn’t true.

    As I said earlier:

    No, the problem is that entropy is not a measure of disorder, and it’s not a measure of energy dispersal. It is a measure of missing information — the information gap between the macrostate and the microstate.

    Truth matters in science. You think we should split the difference and go have some pbj’s, but that’s not how science works.

  31. colewd,

    You may have heard of something called the “Second Law of Thermodynamics”? It has some major ramifications. One of the earliest was in establishing an upper bound on the efficiency of heat engines.
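
    For instance, the Carnot bound, shown here as a small sketch of my own:

    ```python
    def carnot_efficiency(T_hot, T_cold):
        """Second-law upper bound on the efficiency of any heat engine
        operating between reservoirs at T_hot and T_cold (in kelvins)."""
        return 1.0 - T_cold / T_hot

    # A heat engine with a 500 K boiler and a 300 K condenser can never
    # convert more than 40% of the input heat into work.
    print(carnot_efficiency(500.0, 300.0))   # 0.4
    ```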

  32. walto,

    It’s a bit odd for you to be arguing that this is all just a quibble. You certainly didn’t think so earlier in the thread, when you characterized it in these breathless terms:

    Rather than giving opportunities, you might consider giving a single reason for the entire scientific establishment to shift its paradigm.

    And:

    Quick, rewrite all the science books!

    And now that you’ve reversed yourself on energy dispersal and the missing information view, why not take the final step and acknowledge your error regarding observer dependence?

  33. keiths: If you could know the exact microstate, then the entropy would be zero for you.

    The entropy is not a property of the observer, it’s a property of the thermodynamic system.

  34. keiths: Thermodynamic entropy, just like the other forms of Shannon entropy, can be expressed in bits.

    I think it’s better to say that the SMI (which is in bits) can be expressed in the units of thermodynamics.

  35. keiths: I’m guessing it’s pretty clear to everyone at this point that the issue being obsessed on in this thread is a quibble.

    I’m sure that saying so makes you feel better about the mistakes you’ve made in this thread, but it isn’t true.

    Well, of course you’re not going to admit it out loud. But I’m guessing it’s even clear to YOU at this point. I mean you’ve shifted on what thermodynamic entropy is and now recognize that Damon has to ‘forget’ stuff to get anywhere near it, so I’m guessing you’ve grokked the whole picture by now. You’re smart enough to have done that, I think.

    Anyhow, even though I’m charitable like that, I’m not expecting you to actually say it.

  36. Mung

    The entropy is not a property of the observer, it’s a property of the thermodynamic system.

    It’s a property of the relationship between the observer and the thermodynamic system. More information about the exact state of the system means less entropy.

  37. walto,

    I mean you’ve shifted on what thermodynamic entropy is and now recognize that Damon has to ‘forget’ stuff to get anywhere near it,

    No, that’s just confusion on your part. My position hasn’t changed. Damon sees a thermodynamic entropy of zero, and he is correct. Xavier and Yolanda see nonzero values of thermodynamic entropy, and they are also correct.

  38. keiths:

    Thermodynamic entropy, just like the other forms of Shannon entropy, can be expressed in bits. It’s a measure of missing information, not of energy dispersal.

    Mung:

    I think it’s better to say that the SMI (which is in bits) can be expressed in the units of thermodynamics.

    No, because that implies that bits are not legitimate units of thermodynamic entropy. That’s false. Bits are legitimate units of any Shannon entropy, including thermodynamic entropy.

    Don’t repeat Sal’s mistake.

  39. Keith, if you backslide, you WILL have to rewrite all the science books to make them useless. I thought you’d gotten past that. [sigh]

  40. Keiths as quoted by colewd:

    It’s a measure of missing information, not of energy dispersal.

    Keiths has problems comprehending that thermodynamic entropy can be 3 things at the same time, and they are not mutually exclusive!

    Thermodynamic entropy is energy dispersal (Clausius), logarithm of microstates (Boltzmann), and uncertainty about which Boltzmann microstate obtains (aka Shannon information).

    1. Clausius entropy:

    dS = dq (reversible) / T

    2. Boltzmann entropy:

    S = kB ln W

    3. Shannon information =

    I = S / kB / ln(2) = log2(W) = -log2(1/W) = -log2(P)

    where,

    dq (reversible) = differential heat transferred along a reversible path
    T = temperature
    W = the number of 6-dimensional position/momentum microstates or quantum mechanical microstates

    P = probability of being in 1 of the W microstates = 1 / W provided each microstate is equiprobable

    Keiths is determined to claim I’m wrong, Lambert is wrong, Dr. Mike is wrong, Gupta is wrong, Leff is wrong, Townsend is wrong, Kotz is wrong, Treichel is wrong, Clausius is wrong, even Lord Kelvin is wrong.

    I essentially showed one can relate all 3 forms (Clausius, Boltzmann, Shannon) for 500 copper pennies.

    By measuring heat input into copper from absolute zero (using thermometers and calorimeters), we determine experimentally the molar entropy of copper at room temperature using the Clausius integral form of entropy, from which we can determine the number of Boltzmann microstates, from which we determine Shannon entropy. Keiths has it backward; that’s why he can’t start from “missing information” and then compute the Clausius entropy.

    Keiths’ bassackward way that never works:

    Missing information -> Boltzmann -> Clausius

    The correct way:

    Clausius -> Boltzmann -> Shannon uncertainty (aka missing information)

    I showed the right procedure with the 500 copper coins. Using thermometers and calorimeters we empirically measure the heat and temperature as we bring copper from near absolute zero to room temperature, and then using this data we plug it in to the Clausius integral and get the MEASURED entropy as:

    Clausius entropy = 826.68 J/K

    Dividing by Boltzmann’s constant, the natural log of the number of position/momentum (Newtonian) microstates or energy microstates (quantum mechanical) is ln W = S / kB = 5.986 x 10^25.

    So W = e^(5.986 x 10^25)

    therefore:

    Clausius entropy = 826.68 J/K

    Boltzmann Entropy = kB ln W

    = kB x (5.986 x 10^25) = 826.68 J/K

    Shannon Entropy = Shannon Uncertainty = Shannon Information

    = Boltzmann Entropy / kB / ln(2) = -log2(1/W) = log2(W) = -log2(P)

    = (826.68 J/K) / (1.381 x 10^-23 J/K) / 0.693147 = 8.636 x 10^25 Shannon Bits

    what is the practical application of that knowledge?

    Give Keiths a chance to confuse the issue. Practical uses of information theory, as far as traditional uses of thermodynamic entropy go, are next to zero. That’s because one can’t even compute the “missing information” unless one has energy dispersal data (heat and temperature) to begin with.

    If one has heat and temperature data, one can compute entropy without reference to information theory.

    OK, what is the practical use of the thermodynamic entropy concept? Here is one that has been harped on in this thread.

    One mol of gas in 1 cubic meter at 298 K has the same internal energy as that same mol of gas occupying 2 cubic meters at 298 K. However, when that mol of gas occupies 2 cubic meters, even though it has the same internal energy, it has higher entropy. Higher entropy means that the gas has less usable energy.

    Another way of seeing it is that at 298K, when the gas is compressed to 1 cubic meter it is at a certain pressure. When it is dispersed to 2 cubic meters it is at a lower pressure. If the gas pressure is used to move a piston, the lower pressure gas can’t do as much work as the higher pressure gas, even though the gas in each scenario has the same internal energy.

    Thermodynamic entropy is another way of comparing the usable energy of systems that may actually have the same internal (“heat”) energy.

    You can use the link I provided for Sackur-Tetrode to look at how temperature and volume change internal energy. You’ll see that volume changes (cell B2) do not change internal energy (cell B32), but they do change the entropy (cell F35 or cell B5).

    Higher entropy means less ability to convert the internal energy to useful work. That’s why the entropy concept is important to understanding steam engines, air conditioners, and chemical reactions.
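
    A small sketch of my own for that 1 cubic meter vs. 2 cubic meter example, using the textbook ideal-gas result delta-S = nR ln(V2/V1) for an isothermal volume change:

    ```python
    import math

    R = 8.314462618   # J/(mol*K)
    n = 1.0           # mol
    T = 298.0         # K

    # Entropy increase when the gas doubles its volume at constant temperature;
    # the internal energy of an ideal gas is unchanged by this.
    dS = n * R * math.log(2.0)
    print(dS)                          # ~5.76 J/K

    # Maximum work the gas could have delivered in a reversible isothermal
    # expansion from 1 m^3 to 2 m^3 -- capacity lost if it simply disperses.
    print(n * R * T * math.log(2.0))   # ~1717 J, equal to T * dS
    ```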

    Clausius realized conversion of heat energy to work isn’t just about the number of Joules of heat; it’s about factoring in how the heat is used at various temperatures. This factoring of temperature into the conversion of heat to work created the entropy concept. You see it in the Clausius integral:

    dS = dq (reversible)/ Temperature

    Qualitatively speaking, take 2 systems with the same internal energy. The system with lower entropy has a better capacity to convert its internal energy to mechanically useful work.

    You can’t really do that from Keiths information theory approach since Keiths can’t even compute his missing information without energy dispersal data (heat and temperature).

  41. Cool calculation, bro.
    What would the value be if the coins were made of ununtrium, rather than copper?

    BTW, if you use Gibbs, rather than Boltzmann, you’ll have a better response to colewd’s question re momenta.

  42. walto:

    Keith, if you backslide, you WILL have to rewrite all the science books to make them useless. I thought you’d gotten past that. [sigh]

    No backsliding on my part; just confusion on your part. You took the fact that Damon doesn’t need the concept of entropy as an indication that entropy isn’t valid for him. It doesn’t follow.

    The only actual “backsliding” here has been yours.

    You’ve slid from agreeing with Jaynes…

    I think there are, as Jaynes says, many different correct values for entropy. This depends on what variables are considered relevant, what is being looked for.

    …back to disagreeing:

    They may both be “correct” as far as they go, but one is more ACCURATE than the other; thus, in another sense closer to being “correct.”

    When it comes to thermodynamics, Yolanda is “getting there” as compared with Xavier; Damon may have passed the mark.

    That’s wrong. Jaynes is right, and so are Xavier, Yolanda, and Damon. Each of the latter is calculating the entropy using a legitimate macrostate, and each obtains a correct value. The difference in entropy is due to the difference in information possessed by each of the three.

    As I explained to Mung:

    It’s [Entropy is] a property of the relationship between the observer and the thermodynamic system. More information about the exact state of the system means less entropy.

    Entropy is observer-dependent. This shouldn’t be surprising to you, given that you actually accept the “missing information” view. Do you actually think there’s a law of nature saying “every observer must always be missing exactly the same amount of information”? Why on earth would you think that?

  43. stcordova: The major place I’ve seen entropy stated in bits is in places like TSZ and UD, not in textbook science literature.

    The statistical approach to thermodynamics is well-established and has been for decades.

  44. walto: I’m guessing it’s pretty clear to everyone at this point that the issue being obsessed on in this thread is a quibble.

    I, for one, am trying to do my part in keeping the thread alive. 😉
