In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue in his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (small shoe size suggests we’re dealing with a toddler, and therefore a non-reader), but shoe size is not reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger’s and Moran’s books aren’t on that list. Their books did, however, make the list of biochemistry texts judged by some Australian researchers to be decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy, as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates

Also there is Clausius:

delta-S = Integral (dq/T)

where
delta-S = change in entropy
dq = inexact differential of q (heat)
T = absolute temperature
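
To see how the two formulas hook together numerically, here is a minimal sketch (the numbers are illustrative choices of my own, not figures from any of the sources quoted above):

```python
import math

k_B = 1.381e-23  # Boltzmann's constant, J/K

# Boltzmann/Planck form: S = k log W (natural log).
# Toy example: the number of accessible microstates doubles.
W1, W2 = 1.0e20, 2.0e20                  # arbitrary microstate counts
dS_boltzmann = k_B * math.log(W2 / W1)   # = k ln 2
print(f"Boltzmann dS = {dS_boltzmann:.3e} J/K")   # ~9.57e-24 J/K

# Clausius form: delta-S = q_rev / T for a reversible, isothermal heat transfer.
q_rev, T = 1.0, 300.0                    # 1 J of heat at 300 K (arbitrary)
dS_clausius = q_rev / T
print(f"Clausius dS  = {dS_clausius:.3e} J/K")    # ~3.33e-3 J/K

# Bridge between the two: a Clausius entropy change dS corresponds to the
# microstate count growing by a factor of exp(dS / k_B).
print(f"ln(W_after/W_before) implied by the Clausius dS = {dS_clausius / k_B:.3e}")
```

Note that neither formula mentions order, disorder, or dispersal: the first counts microstates, the second counts heat per unit temperature.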

As Dr. Mike asked rhetorically: “Where are the order/disorder and ‘information’ in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. DNA_Jock: It explains WHY “Heat cannot, of itself, pass from one body to a hotter body”

    Heh. Not even if it’s hot enough to melt the frozen-in entropy?

  2. colewd: How does it explain this?

    Imagine a tabletop covered with nickels and dimes, flying around and bumping into each other. If a “heads” bumps into a “tails”, they both flip. There are more microstates where the heads are evenly distributed between nickels and dimes than microstates where almost all the heads are on one type of coin. Over time, the proportion of heads will become roughly equal between nickels and dimes. (A toy simulation of this picture appears just after this comment.)
    THAT’s why “Heat cannot, of itself…”

    Einstein’s GR equations did predict how a small-mass particle (a photon) would follow a curved path in the presence of a large mass (the sun). The equations were also experimentally validated.

    Yes. So?

    I am still struggling with what problem is solved by describing all the possible micro states of a gas.

    Well Gibbs doesn’t need to describe them, just count them (or more often, calculate the ratio of W1 to W2).
    And very few people really care about the isothermal expansion of ideal gases.
    On the other hand, good luck explaining the chelate effect or the hydrophobic effect in Clausian terms…
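
    Here is a toy simulation of the nickels-and-dimes picture above (an illustrative sketch; the coin counts and step count are arbitrary, and the flip-on-collision rule is just the one stated in the comment):

    ```python
    import random

    def simulate(n_each=1000, steps=20000, seed=1):
        """Toy model: 'heads' play the role of energy quanta.
        Start with every nickel heads-up and every dime tails-up, then let
        randomly chosen heads/tails pairs flip (the total number of heads
        is conserved, like energy)."""
        random.seed(seed)
        coins = [True] * n_each + [False] * n_each   # first half nickels, second half dimes
        for step in range(steps):
            i = random.randrange(len(coins))
            j = random.randrange(len(coins))
            if coins[i] != coins[j]:                 # a heads met a tails: both flip
                coins[i], coins[j] = coins[j], coins[i]
            if step % 5000 == 0:
                print(f"step {step:6d}: heads on nickels = {sum(coins[:n_each])}")
        print(f"final:       heads on nickels = {sum(coins[:n_each])}")

    simulate()   # heads drift from 1000-on-nickels toward ~500, and stay there
    ```

    Nothing pushes the heads off the nickels; there are simply vastly more microstates with the heads spread roughly evenly, which is the point about why heat does not, of itself, pass to a hotter body.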

  3. Sal is mostly repeating mistakes that were pointed out earlier in the thread. This one, however, appears to be a new mistake:

    But Q-reversible is the energy required to unmix the system! There will be heat dispersed into the environment to reverse the process.

    There’s a lot of wrongness in those two sentences.

    First, we’ve been talking about mixing, not unmixing. The entropy increases during mixing but energy dispersal does not. Therefore entropy cannot be a measure of energy dispersal. (A numerical sketch of the mixing entropy appears just after this comment.)

    Second, the mixing process is not reversible. “Reversible” in thermodynamics is a technical term. It does not refer merely to something that can be undone; it refers to a process that can take place with no net change to the entropy of the universe, and can then be reversed, also with no net change to the entropy of the universe.

    The system is isolated during mixing, meaning that there is no exchange of energy with the environment. The entropy of the system increases due to mixing, but the entropy of the environment does not. Therefore there is a net increase in the entropy of the universe, and the process is not reversible. (Successfully running it backwards would decrease the entropy of the universe, in violation of the Second Law.)

    Third, if the mixing actually were a reversible process, as Sal claims, then any heat removed from the system during unmixing would have to be exactly equal to the heat added during mixing. No heat is added during mixing. Therefore, if the mixing were reversible, no heat would be removed during unmixing, contrary to Sal’s claim.

    Fourth, it should be obvious that heat cannot be removed during unmixing, because that would decrease the temperature. The ending temperature would be lower than the starting temperature. You haven’t reversed a process unless you end up back at the starting line.

    Fifth, merely removing heat from the system is not going to get you back to a condition in which the molecules of gas A are all on one side of the chamber and the molecules of gas B are all on the other.

    Sal has no business trying to teach thermodynamics to anyone. He doesn’t even understand the basics.
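
    To put a number on the mixing point (an illustrative calculation; one mole of each gas is an arbitrary choice): for two different ideal gases, each starting at the same temperature and pressure in its own half of an isolated container, removing the partition raises the entropy even though no heat crosses the boundary.

    ```python
    import math

    R = 8.314        # gas constant, J/(mol K)
    n_A = n_B = 1.0  # one mole of gas A and one mole of gas B (arbitrary amounts)

    # Each gas expands isothermally from V/2 into the full volume V when the
    # partition is removed: dS_i = n_i * R * ln(V_final / V_initial).
    dS_A = n_A * R * math.log(2)
    dS_B = n_B * R * math.log(2)
    dS_total = dS_A + dS_B
    print(f"Entropy of mixing = {dS_total:.2f} J/K")   # ~11.5 J/K

    # Heat exchanged with the surroundings during this free mixing:
    q = 0.0
    # The entropy of the isolated system rises by ~11.5 J/K while q = 0 and the
    # energy stays exactly where it was, which is why the process is irreversible
    # and why "dispersal of energy" does not capture what changed.
    print(f"q = {q} J")
    ```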

  4. We’ve mostly been talking about gases so far. Here’s a solid-state example of why entropy is not a measure of energy dispersal, from John Denker of Bell Labs:

    As another example, consider two counter-rotating flywheels. In particular, imagine that these flywheels are annular in shape, i.e. hoops, as shown in figure 9.7, so that to a good approximation, all the mass is at the rim, and every bit of mass is moving at the same speed. Also imagine that they are stacked on the same axis. Now let the two wheels rub together, so that friction causes them to slow down and heat up. Entropy has been produced, but the energy has not become more spread-out in space. To a first approximation, the energy was everywhere to begin with and everywhere afterward, so there is no change.

    If we look more closely, we find that as the entropy increased, the energy dispersal actually decreased slightly. That is, the energy became slightly less evenly distributed in space. Under the initial conditions, the macroscopic rotational mechanical energy was evenly distributed, and the microscopic forms of energy were evenly distributed on a macroscopic scale, plus or minus small local thermal fluctuations. Afterward, all the energy is in the microscopic forms. It is still evenly distributed on a macroscopic scale, plus or minus thermal fluctuations, but the thermal fluctuations are now larger because the temperature is higher. Let’s be clear: If we ignore thermal fluctuations, the increase in entropy was accompanied by no change in the spatial distribution of energy, while if we include the fluctuations, the increase in entropy was accompanied by less even dispersal of the energy.

    He’s right, and Sal is wrong. Again.

  5. Hi keiths, how do you propose that we measure the entropy of the universe? What calculation do we use to determine the entropy of the universe?

    According to Ben-Naim the entropy of the universe is undefined and therefore a nonsense term.

  6. Mung,

    The absolute entropy of the universe doesn’t matter. It’s the change in entropy due to a particular process that concerns us here.

  7. keiths: The absolute entropy of the universe doesn’t matter. It’s the change in entropy due to a particular process that concerns us here.

    Not sure why you think that helps your claims. The change in entropy of what, the universe? What would that even mean?

    In order to determine the entropy change you need to know the state of the universe, because the entropy is a state function. What state is the universe in?

    Do you know the temperature, volume, and number and types of all the particles in the universe?

    You really need to read more Ben-Naim. 🙂

  8. Mung,

    You need to read up on reversible processes.

    From Wikipedia:

    A reversible process changes the state of a system in such a way that the net change in the combined entropy of the system and its surroundings is zero.

    Note the word “surroundings”.

  9. As keiths notes, physicists talk of the change in entropy of the rest of the universe all the time. Any system that exports energy increases the entropy of the rest of the universe by at least dQ/T. The fact that this heat, along with all the other heat in the universe, is flowing down temperature gradients in immeasurable quantities doesn’t affect the validity of the statement at all.

  10. DNA_Jock: As keiths notes, physicists talk of the change in entropy of the rest of the universe all the time.

    They also claim the universe gets more disordered, and that the energy of the universe gets more dispersed. Nothing prevents physicists from saying nonsensical things. In fact, that’s at least one thing this whole thread has been about.

    So it ought to be no surprise that they might say more nonsensical things, especially when it comes to entropy, the second law of thermodynamics, and the arrow of time.

    And what does it mean to say that the “missing information” of the universe is increasing? As we learn, we become more ignorant? That’s a humbling thought.

    🙂

  11. The second law of thermodynamics for isolated systems states that the entropy of an isolated system not in equilibrium tends to increase over time, approaching a maximum value at equilibrium.

    If the entropy is increasing in an isolated system, does that mean the entropy is also increasing in the universe? Because, you know, the isolated system is part of the universe.

  12. : Stupid Physicist Sayings

    This is the story of the bit and the universe. The universe is the biggest thing there is and the bit is the smallest possible chunk of information. The universe is made of bits. Every molecule, atom and elementary particle registers bits of information.

    – Seth Lloyd (2006)

  13. Mung,

    Since you are echoing (or trying to echo) Ben-Naim here, why not provide a relevant quote from him? Then the rest of us can tell you whether we agree or disagree, and why.

  14. Mung:

    In order to determine the entropy change you need to know the state of the universe, because the entropy is a state function. What state is the universe in?

    Do you know the temperature, volume, and number and types of all the particles in the universe?

    That’s goofy, Mung.

    If I deposit $10,000 into my checking account, the change in balance associated with that transaction is $10,000. I can say that even if I don’t know the balance before and after the transaction.

    Let B be the “before” balance, and let ΔB be the amount of the deposit. Then the “after” balance is B + ΔB.

    The change in balance is the “after” balance minus the “before” balance, or

    B + ΔB – B = ΔB

    The Bs cancel out. The answer depends only on ΔB, not on B.

  15. Mung,

    If the entropy is increasing in an isolated system, does that mean the entropy is also increasing in the universe? Because, you know, the isolated system is part of the universe.

    Yes, the entropy is also increasing in the universe. Can you work out how we know that?

  16. Mung,

    And what does it means to say that the “missing information” of the universe is increasing?

    The same thing it means to say that the missing information of any isolated system is increasing.

    As we learn, we become more ignorant? That’s a humbling thought.

    Keep in mind that the Second Law is probabilistic, not absolute. In the very long run, there can be downward fluctuations in entropy. That’s what Boltzmann brains are about.

  17. keiths: Yes, the entropy is also increasing in the universe. Can you work out how we know that?

    In terms of “missing information” measured in “bits” using thermodynamics? I confess I cannot. Can you?

    Do you know the Pressure of the universe? Do you know the Volume of the universe? Do you know the Temperature of the universe? Do you know the total energy of the universe? Do you know the number of particles? Do you know whether the universe is at equilibrium?

    I’m beginning to get a sense of what walto must have been experiencing. Yes, if there was a change in the balance of your bank account there was a change in the balance of your bank account. Well, yeah.

    If you withdraw 10,000.00 from your bank account does the bank account of the universe increase or decrease or remain unchanged? If you deposit 10,000.00 into your bank account does the bank account of the universe increase or decrease or remain unchanged?

  18. keiths: Keep in mind that the Second Law is probabilistic, not absolute.

    So “the entropy of the universe” can, in fact, decrease?

  19. keiths: The same thing it means to say that the missing information of any isolated system is increasing.

    The universe is an isolated system?

  20. Try to focus, Mung.

    Do you see why, if I deposit $10,000 into my checking account, the change in balance associated with that transaction is $10,000? Can you see that this remains true even if I don’t know the account balance?

    Once you understand this, we can move on to entropy.

  21. keiths: Since you are echoing (or trying to echo) Ben-Naim here, why not provide a relevant quote from him?

    A quote from him about the entropy of the universe? Sure.

    In the meantime, you can tell us how to measure the entropy of the universe.

  22. keiths:

    Keep in mind that the Second Law is probabilistic, not absolute.

    Mung:

    So “the entropy of the universe” can, in fact, decrease?

    Yes, though it’s enormously unlikely to do so anytime soon.

    Ditto for smaller isolated systems, like the containers of gas we’ve been talking about. It’s not impossible for all the molecules of an evenly distributed gas to concentrate themselves in one corner of the container — just mind-bogglingly unlikely.

  23. Mung,

    In the meantime, you can tell us how to measure the entropy of the universe.

    Why, when nothing I’m saying depends on knowing the value?

  24. keiths: Try to focus, Mung.

    I am focused. Let’s assume your bank account is an isolated system. How do you pass your 10k across the boundary? What travels back across the boundary in the opposite direction, your balance?

  25. keiths: Yes, though it’s enormously unlikely to do so anytime soon.

    Entropy is not a function of time. “Anytime soon” has nothing to do with it. But at least you are now talking about changes in entropy of the universe. How do we measure the before_entropy and the after_entropy of the universe?

  26. Let’s assume your bank account is an isolated system. How do you pass your 10k across the boundary?

    If my bank account is an isolated system, then nothing passes across the boundary. By definition.

    You know the definition of an isolated system, right? It’s a simple matter of applying the definition. Why can’t you figure these things out for yourself?

  27. keiths: Why, when nothing I’m saying depends on knowing the value?

    If you want to claim that “the entropy of the universe” is a meaningful concept then you ought to know the before and after values, because if you don’t, you can’t calculate the change.

    How do you know 10k was deposited into your bank account? You look at the balance before the deposit and the balance after the deposit.

    balance_after_deposit – balance_before_deposit = deposit_amount

    It’s true.

  28. Mung,

    Do the words “before” and “after” have anything to do with time, in your opinion?

  29. Mung:

    How do you know 10k was deposited into your bank account?

    I deposited it, doofus. The change in balance due to my deposit was $10,000, and I can say that even if I don’t know the balance.

    Are you drinking tonight, by any chance?

  30. keiths: Do the words “before” and “after” have anything to do with time, in your opinion?

    Sure. But if you want to show that entropy is a function of time you’ll need more than that. As time changes, entropy changes as a function of time … How? You might want to revisit your arguments against energy dispersal.

    Why would you even be pursuing this? Have you actually thought this out previously or do you simply think I must be wrong because … ?

  31. keiths: I deposited it, doofus. The change in balance due to my deposit was $10,000, and I can say that even if I don’t know the balance.

    So you merely think that you deposited 10k. Got it.

    How do you know there was a change in your balance? How do you know how much of a change to your balance there was?

    Without actually checking.

    Faith?

    Are you drinking tonight, by any chance?

    Yes, I’ve had some wine. So? I still know how to check my balance. 🙂

    Chill dude. “I know my balance changed because I made a deposit” leaves a lot to be desired. Most people know by checking their balance after the transaction.

  32. Most people know by checking their balance after the transaction.

    Smart people understand that B + ΔB – B = ΔB .

    Perhaps we should continue this discussion after your BAC drops back down.

  33. Hi keiths,

    Have you revised your opinion yet about what constitutes a process in thermodynamics? Do you still think that if the macrostate does not change there is no process? Do you still think that the “mixing” of indistinguishable particles is not a process because there’s no change in entropy?

  34. keiths: Smart people understand that B + ΔB – B = ΔB .

    Practical people check their B, especially if they are depositing 10k. The flip side of this is, of course, that if I withdraw 10k, I actually count the money, I don’t rely on the balance statement.

  35. Hi keiths,

    Have you revised your opinion yet about whether Shannon information is information in the colloquial sense? Do you still think that Claude Shannon gave us a way to measure any and all information?

  36. keiths: Perhaps we should continue this discussion after your BAC drops back down.

    And perhaps you should provide some evidence that I am being irrational.

    balance_after_deposit – balance_before_deposit = deposit_amount

    p.s. Please explain how the balance changes with time, except when the balance does not change with time, because the balance is a function of time.

  37. Reviewing Sal’s comments, I found another mistake of his that I haven’t yet addressed:

    Suppose we had the single helium particle in 2 cubic meters of volume at 300K. Using the Sackur-Tetrode approximation, the absolute entropy (using the excel spreadsheet I provided) is:

    entropy at 2 cubic meters = 1.026186 x 10^-21 J/K

    entropy at 1 cubic meter = 1.016616 x 10^-21 J/K

    The change in entropy when we compress the volume the particle occupies at the same temperature from 2 cubic meters to 1 cubic meter is:

    entropy change during compression = 9.5699 x 10^-24 J/K

    Sal claims, following Lambert, that entropy is a measure of energy dispersal. Where is the change in dispersal here? In both cases, all of the system’s energy is concentrated in a single helium atom. The number of possible microstates has changed — there are fewer locations for the atom to occupy — but energy dispersal remains the same.

    Entropy changes, but energy dispersal doesn’t. They are not the same thing. (A quick Sackur-Tetrode check of these numbers appears just after this comment.)
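
    For what it’s worth, those figures can be checked directly from the Sackur-Tetrode equation (a sketch using standard constants and the helium-4 atomic mass; it reproduces the quoted spreadsheet values to within rounding, and the change comes out to k ln 2):

    ```python
    import math

    k_B = 1.380649e-23   # Boltzmann's constant, J/K
    h = 6.62607e-34      # Planck's constant, J s
    m_He = 6.6465e-27    # mass of a helium-4 atom, kg

    def sackur_tetrode(V, T, N=1):
        """Sackur-Tetrode entropy of N ideal-gas atoms in volume V (m^3) at temperature T (K)."""
        lam = h / math.sqrt(2 * math.pi * m_He * k_B * T)   # thermal de Broglie wavelength
        return N * k_B * (math.log(V / (N * lam**3)) + 2.5)

    S2 = sackur_tetrode(V=2.0, T=300.0)   # one He atom in 2 cubic meters
    S1 = sackur_tetrode(V=1.0, T=300.0)   # one He atom in 1 cubic meter
    print(f"S(2 m^3) = {S2:.6e} J/K")              # ~1.0262e-21 J/K
    print(f"S(1 m^3) = {S1:.6e} J/K")              # ~1.0166e-21 J/K
    print(f"dS       = {S2 - S1:.4e} J/K")         # ~9.57e-24 J/K
    print(f"k ln 2   = {k_B * math.log(2):.4e} J/K")
    ```

    Only the volume term changes between the two cases, so the entropy difference is exactly k ln 2, whether or not anything about the energy has become more or less spread out.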

  38. fifth,

    Is the precise order of the frames in a movie a “logical” microstate?

    keiths:

    You can define it as one, in which case the other microstates would be created by cutting and splicing. I have no idea why you think a video of a tornado is relevant, however.

    fifth:

    Because according to this perspective we are entitled to say that structures don’t move from less organization to more organization, and Sewell has a point, contrary to what you asserted earlier.

    Huh? That does not follow. At all.

  39. Keiths and DNA_jock are proponents of framing entropy in terms of ignorance of the observer. I think that is a convoluted way of looking at the issues. I pointed out in the OP:

    the correct Ludwig Boltzmann entropy as written by Planck:

    S = k log W

    where
    S = entropy
    k = Boltzmann’s constant
    W = number of microstates

    Also there is Clausius:

    delta-S = Integral (dq/T)

    DNA_jock and Keiths pretend I didn’t even write those words, nor even connect Boltzmann to information theory as I did in 2015:

    2LOT and ID entropy calculations (editorial corrections welcome)

    The thermodynamic entropy in J/K can be converted to bits by simply dividing by Boltzmann’s constant and then converting the natural log measure to a log-base-2 measure.

    Boltzmann’s constant is 1.381 x 10^-23 J/K.
    The natural log to log-base-2 conversion is ln(2) = 0.693147. (A worked example of this conversion, applied to the melting ice cube mentioned below, appears just after this comment.)

    But many real world measurements of entropy are done with thermometers and calorimeters. That’s why DNA_jock and Keiths can’t compute the change in entropy of a 20 gram melting ice cube using their non-existent ignorance meters.

    One can google “ignorance meter” and about the only substantive hit is this one. Not much good for making entropy measurements, is it? It does, however, highlight the folly of observer-dependent measurements in trying to make objective measures of entropy.
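
    Here is the conversion recipe above applied to the 20 gram melting ice cube (a short sketch; it assumes roughly 333 J/g for the heat of fusion of ice, which gives the ~6660 J figure used elsewhere in this thread):

    ```python
    import math

    k_B = 1.381e-23      # Boltzmann's constant, J/K
    ln2 = math.log(2)    # natural-log to log-base-2 conversion, ~0.693147

    # 20 gram ice cube melting at 273 K, assuming ~333 J/g heat of fusion.
    q = 20 * 333.0       # ~6660 J absorbed at (essentially) constant temperature
    T = 273.0            # K
    dS = q / T           # Clausius entropy change, ~24.4 J/K
    print(f"dS = {dS:.2f} J/K")

    # Convert thermodynamic entropy to information units.
    nats = dS / k_B      # dimensionless, natural-log units
    bits = nats / ln2    # Shannon bits
    print(f"dS = {nats:.4e} nats = {bits:.4e} bits")   # ~1.77e24 nats, ~2.55e24 bits
    ```

    Note that the inputs are exactly the calorimetric quantities (heat and temperature); the division by Boltzmann’s constant and ln 2 at the end just changes units.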

  40. keiths: I deposited it

    I thought you were analogizing your checking account to the universe–which is an isolated system. You’d have to be inside your checking account, to make this analogy work, wouldn’t you?

  41. stcordova:
    Keiths and DNA_jock are proponents of framing entropy in terms of ignorance of the observer. I think that is a convoluted way of looking at the issues. I pointed out in the OP:

    DNA_jock and Keiths pretend I didn’t even write those words, nor even connect Boltzmann to information theory as I did in 2015:

    http://theskepticalzone.com/wp/2lot-and-id-entropy-calculations-editorial-corrections-welcome/

    But many real world measurements of entropy are done with thermometers and calorimeters. That’s why DNA_jock and Keiths can’t compute the change in entropy of a 20 gram melting ice cube using their non-existent ignorance meters.

    One can google “ignorance meter” and about the only substantive hit is this one. Not much good for making entropy measurements is it. It does however highlight the folly of observer-dependent measurements in trying to make objective measures of entropy:

    I think you’re missing the quibbling aspect of this dispute. Suppose you were interested in measuring the difference in the weight of an object as measured with something made by a middle-schooler with marked cardboard, string and a potato, and as measured with a fancy digital weighing instrument used at MIT. When you calculate the difference–are you quantifying a difference in weight? Yes. Is it also a measure of the difference in information? Yes.

  42. Sal,

    I think we can agree that if you were trying to determine the entropy of a system under the missing information view, then you would go looking for an “ignorance meter”. Fortunately, the worth of an idea isn’t determined by its use in the hands of the incompetent.

    Someone who actually understands thermodynamics would recognize that thermometers, calorimeters, etc., give us information about a system — information that can be used to specify a macrostate — and that entropy is a measure of the gap between the information contained in the macrostate and the information needed to specify the exact microstate.

    Even you seem to get that at least some of the time. For example, you’ve written:

    One will note that if the volume is increased , the entropy (and hence number of microstates) increases. If one shrinks the volume, the entropy (and hence number of microstates) decreases…

    So it is easy to state the change in entropy by saying things like, “increases in temperature and/or increases in volume increase the number of microstates, therefore increase entropy.”…

    When we say “ignorance” we mean to say, we are not sure specifically which microstate the particle is in. When there are more microstates, we become even more uncertain which microstate the particle is in, hence we can say we increased “ignorance” of the system by increasing temperature or volume.

    The problem is that you can’t keep your position straight. Elsewhere you’ve expressed agreement with Elzinga’s assertion that entropy has nothing to do with information, and here you’re pretending that the missing information view depends on the existence of “ignorance meters”.

    Which is it? Make up your mind, please.

    (Hint: walto was right to abandon dispersalism and embrace the “missing information” view, painful though that must have been. The dispersalist position is hopeless, for reasons that have been pointed out again and again in this thread, but no one has produced even a single scenario in which the “missing information” interpretation fails.)

  43. walto,

    I thought you were analogizing your checking account to the universe–which is an isolated system. You’d have to be inside your checking account, to make this analogy work, wouldn’t you?

    The checking account balance is analogous to the total entropy of the universe.

    If I make a deposit of ΔB dollars, I know that the account balance B changes by ΔB dollars, even if I don’t know what B is.

    Likewise, if a thermodynamic process generates an entropy change of ΔS, I know that the effect of that process on the total entropy S of the universe is ΔS, even if I don’t know the value of S.

  44. Walto:

    I think you’re missing the quibbling aspect of this dispute.

    You are probably right, but that may be because there is so much more to thermodynamics than the issues about information that I missed the quibble.

    Let me give another angle to this as it relates to colewd’s quote of Einstein. One can interpret Boltzmann:

    S = kB ln W

    as a measure of our ignorance about the details of a system. I already said I don’t like this interpretation of Boltzmann. Even though the information theoretic description might be shown to be formally correct, it is a convoluted way of looking at things.

    It can also be shown that it is formally correct to say entropy is simply a scaled logarithmic measure of microstates. This would be a less convoluted way of interpreting Boltzmann — where microstates are the set of coordinates for all particles in terms of (X, Y, Z, P_x, P_y, P_z). The microstate description is more rigorous and less deepity woo than the information theoretic description.

    But if we say we lack information or are ignorant, I’m tempted to say “what else is new Captain Obvious!” What is important from a chemical and engineering standpoint is that the measure of our ignorance and lack of information in principle is an indicator of how energy will flow in car engines and chemical reactions.

    When you calculate the difference–are you quantifying a difference in weight? Yes. Is it also a measure of the difference in information? Yes.

    There is observer-dependent information, and there is observer-INdependent information. When there is observer-independent information, we may as well dispense with information theory altogether.

    When a chem student can calculate the change in entropy in a melting ice cube without information theory, then practically speaking, imho, information theory is a gratuitous superficial add-on. It’s not formally wrong, but of little utility.

    Taking your example with thermometers. Say we have a student with an old mercury thermometer that is hard to read vs. one that is high end digital. We have the inherent belief that when there is a change of entropy in a melting ice cube, there is a real change of entropy and there is a “right” answer for the change in entropy as far as how much heat was involved in the process (going from ice to liquid).

    There may be differences in information due to our instrument errors. BUT the ignorance due to our instrument errors is different than the “ignorance-in-principle” or the observer-INdependent information of the system.

    What do I mean by observer-INdependent information?

    I said that, hypothetically, a 20 gram ice cube melted to liquid water has a change of entropy of:

    ΔS = 6660 J / 273 K = 24.39 J/K

    24.39 J/K / kB = 24.39 J/K / (1.381 x 10^-23 J/K) = 1.7665 x 10^24 natural bits (nats)

    1.7665 x 10^24 nats x (1 Shannon bit / ln(2) nats) = 2.5485 x 10^24 Shannon bits

    Everyone, independent of their lab equipment, will agree that the amount of information in each microstate, or the amount of ignorance we have of the system, should objectively come out to the same figure for the hypothetical scenario, given that we are using idealized thermometers and calorimeters. We have defined objectively what the limits of our knowledge are, namely restricted to idealized thermometers and calorimeters.

    We may have different actual instrument measurements, but we agree on what the objective value ought to be if we had accurate instruments. The ignorance and information is a matter of principle, not observer dependence.

    The problem is people can conflate their subjective information with the hypothetical objective information for a given process.

    I was just harping on Keiths, that his ignorance approach to entropy is almost a Captain Obvious approach. Unless the quantified ignorance is tied to how energy behaves in a system, for real world applications it’s a near useless fact. As I’ve tried to show, one cannot even calculate the bits of ignorance in the first place without energy and temperature data, such as in the melting ice cube example.

    Hence, for many practical applications of entropy, information theory is not of immediate relevance. It is worth noting Clausius’s statement of the 2nd law, which motivated his definition of entropy (a non-rigorous paraphrase):

    Energy will not flow spontaneously from a low temperature object to a higher temperature object.

    The slightly more rigorous statement by Clausius himself:

    Heat can never pass from a colder to a warmer body without some other change, connected therewith, occurring at the same time.

    Clausius defined entropy according to this version of the 2nd law. It is worth noting how energy-centric and temperature-centric this statement of the 2nd law is, and it is worth noting that Clausius’s definition of entropy is energy-centric and temperature-centric as well!

    The Boltzmann/Shannon formulations of entropy are nice academic exercises in quantifying lack of information or abundance of ignorance, but as colewd’s quote of Einstein suggested, without connection of these information metrics to heat and temperature:

    S = kB ln W

    it borders on being a not very practical theory. To the extent it can be connected to heat and temperature and thus to the 2nd law (which is among the most fundamental laws), it becomes useful and insightful.

    The value of Boltzmann was that he connected (approximately) the phenomenon of heat and temperature to the statistical properties of lots of billiard balls we call molecules that are approximately Newtonian in behavior. Boltzmann and Gibbs lucked out that this collective Newtonian idealization of molecules is pretty darn close to the more accurate Quantum mechanical description.

  45. stcordova,

    In 1972, Bekenstein was the first to suggest that black holes should have a well-defined entropy. He wrote that a black hole’s entropy was proportional to its (the black hole’s) event horizon. Bekenstein also formulated the generalized second law of thermodynamics, black hole thermodynamics, for systems including black holes. Both contributions were affirmed when Stephen Hawking proposed the existence of Hawking radiation two years later. Hawking had initially opposed Bekenstein’s idea on the grounds that a black hole could not radiate energy and therefore could not have entropy.[9][10] However, in 1974, Hawking performed a lengthy calculation that convinced him that particles do indeed emit from black holes. Today this is known as Bekenstein-Hawking radiation. Bekenstein’s doctoral adviser, John Archibald Wheeler, also worked with him to develop the no-hair theorem, a reference to Wheeler’s saying that “black holes have no hair,” in the early 1970s.[11] Bekenstein was the first physicist to postulate such a theorem. His suggestion was proven to be unstable, but it was influential in the development of the field.[12][13]

    Based on his black-hole thermodynamics work, Bekenstein also demonstrated the Bekenstein bound: there is a maximum to the amount of information that can potentially be stored in a given finite region of space which has a finite amount of energy (which is similar to the holographic principle).[14]

    As you can see, there is a lot of physics and cosmology that developed based on Boltzmann’s equations. In addition to Hawking, Penrose and Susskind (string theory) developed theories based on Bekenstein’s work. Moral: we can’t just take these big overarching theories for granted because they can mislead science.

  46. On the one hand, Sal is telling us that the connection between entropy and information is “obvious” and “formally correct”.

    On the other, he agrees with Mike Elzinga that entropy has nothing to do with information and has characterized the missing information view as “deepity woo”.

    Has his account been hijacked, or is he really this confused?
