In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully Dan Styer almost comes to the rescue as he criticized ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder, just as shoe size is correlated with reading ability (a small shoe size suggests a toddler, and therefore a non-reader) without shoe size being reading ability. Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder, but it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger and Moran’s books aren’t on that list. Their books, however, did make the list of biochem books judged by some Australians to be decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates
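
For readers who want to see the formula do a little work, here is a minimal Python sketch (my own illustration, not from Planck or Boltzmann; the toy value of W and the volume-doubling example are assumptions I am adding). Since W for any real gas is astronomically large, the practical trick is to work with ratios of microstate counts:

import math

k_B = 1.380649e-23      # Boltzmann's constant, J/K (the "k" in S = k log W)
N_A = 6.02214076e23     # Avogadro's number, particles per mole

# Direct use of S = k ln W for a toy system with a countable number of microstates.
# (The "log" in Planck's formula is the natural logarithm.)
W = 10**6
print(k_B * math.log(W))            # about 1.9e-22 J/K

# For a real gas W is too large to write down, so use the ratio form:
# doubling the volume available to N ideal-gas particles multiplies W by 2^N,
# so delta-S = k ln(W2/W1) = N k ln 2.
print(N_A * k_B * math.log(2.0))    # about 5.76 J/K for one mole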

Also there is Clausius:

delta-S = Integral (dq/T)

where
delta-S = change in entropy
dq = inexact differential of q (heat)
T = absolute temperature
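
And here is a minimal sketch of the Clausius form, again my own illustration with an invented heat capacity: for a body of roughly constant heat capacity C heated from T1 to T2, dq = C dT, so the integral can be evaluated by brute-force quadrature and checked against the closed form C ln(T2/T1):

import math

def clausius_delta_S(C, T1, T2, steps=100000):
    # Numerically evaluate delta-S = Integral(dq/T) with dq = C dT (midpoint rule).
    dT = (T2 - T1) / steps
    total = 0.0
    for i in range(steps):
        T = T1 + (i + 0.5) * dT
        total += C * dT / T
    return total

C = 600.0                # heat capacity in J/K (illustrative value only)
T1, T2 = 298.0, 373.0    # kelvin

print(clausius_delta_S(C, T1, T2))   # numerical integral, about 134.7 J/K
print(C * math.log(T2 / T1))         # closed form C ln(T2/T1), same value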

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. Sal,

    That was a rather astounding amount of equivocation, bluffing, and bullshitting.

    It can all be defeated with a simple question:

    A system Z has an initial energy dispersal of D joules per cubic meter. It undergoes a process in which the entropy increases by a quantity ΔS. What is the final energy dispersal, in joules per cubic meter?

  2. Sal will be unable to answer this question without more information. Why? Because entropy is not a measure of energy dispersal — it’s a measure of missing information.

  3. Since entropy is extensive (that means the entropy of a system is the sum of the entropies of the individual particles), we can work per particle and then scale up.

    The energy dispersal can be defined by the Energy E (or U), the Volume V, and the number of particles (molecules) N of the system.

    The average energy per particle, for the initial configuration of the helium in one cubic meter from the previous comment, is:

    3741.50691 Joules / (6.022140857 x 10^23 molecules) = 6.2129 x 10^-21 J per molecule

    the volume occupied (allotted) on average to each particle is:

    1 m^3 / (6.022140857 x 10^23 molecules) = 1.6605 x 10^-24 m^3

    So the energy dispersal is shown as:

    Energy per particle = 6.2129 x 10^-21 J
    and
    Volume per particle = 1.6605 x 10^-24 m^3

    So we can see the average entropy of a single particle using this dispersal data:

    U_per_molecule = 6.2129 x 10^-21 J
    V_allotted_per_molecule = 1.6605 x 10^-24 m^3

    We can then use the Sackur-Tetrode equation with N_single_molecule = 1

    Plugging these numbers into the Sackur-Tetrode equation, we get the specific entropy per particle:

    S_per_molecule = 2.6064 x 10^-22 J/K/molecule

    Since entropy is extensive (the sum of the individual entropies per particle is the total entropy of the system), we merely multiply the specific entropy by the number of particles.

    S_initial = N_total_particles x 2.6064 x 10^-22 J/K/particle =

    6.022140857 x 10^23 particles x 2.6064 x 10^-22 J/K/particle =

    156.963 J/K

    which agrees with the results in the previous comments.

    A similar calculation can be done for the final case when the helium is expanded to the final volume of 2 cubic meters. The result is:

    S_final = 162.726 J/K

    This shows that the energy dispersal is an essential qualitative description of the entropy, and the challenge is making the qualitative description into numbers (in ways like I’ve shown).

    Thus Keiths and DNA_jock are shown again to be wrong.
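
    For anyone who wants to check these figures, here is a minimal Python sketch of the full Sackur-Tetrode equation for one mole of helium at roughly 300 K (the temperature implied by the 3741.5 J of kinetic energy). This is only my own cross-check, evaluating the whole mole at once rather than per particle; the helium molar mass and the physical constants are standard values I am supplying:

    import math

    k_B = 1.380649e-23      # Boltzmann's constant, J/K
    h   = 6.62607015e-34    # Planck's constant, J s
    N_A = 6.02214076e23     # Avogadro's number

    def sackur_tetrode(N, V, U, m):
        # Entropy of an ideal monatomic gas: N particles, volume V (m^3),
        # internal energy U (J), particle mass m (kg).
        term = (V / N) * (4.0 * math.pi * m * U / (3.0 * N * h * h)) ** 1.5
        return N * k_B * (math.log(term) + 2.5)

    m_He = 4.0026e-3 / N_A           # mass of one helium atom, kg
    N = N_A                          # one mole
    U = 1.5 * N * k_B * 300.0        # about 3741.5 J, as above

    print(sackur_tetrode(N, 1.0, U, m_He))   # about 157 J/K, matching S_initial above
    print(sackur_tetrode(N, 2.0, U, m_He))   # about 162.7 J/K, matching S_final above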

  4. Mung,

    Why is that hopeless? Surely mathematics (and statistical mechanics) has ways of dealing with such numbers.

    Sure, mathematics can deal with absurdly large numbers and infinite dimensions, but does that help us really understand the second law or entropy? Where is the experiment that can validate the mathematics of essentially infinite microstates?

  5. Keiths,

    Here is the challenge again. Show change of entropy when 500 copper pennies go from 298K to 373K using your methods of ignorance without reference to the energy dispersed from the external world into the copper pennies.

    Let the reader note Keiths’ inability to do this simple calculation when he avoids the essential data needed to solve the problem. And what is the essential data? It’s the amount of energy dispersed from the environment into the 1555 gram copper block. But Keiths can’t use it because he insists energy dispersal is a misconception. Poor Keiths.

    For all of Keiths’ bloviating, he needs to use the amount of energy dispersed from the environment into the copper block to figure out how ignorant he is, and once he figures out how ignorant he is, he uses the measure of the ignorance to calculate change in thermodynamic Entropy.

    Vindicate yourself, Keiths, show the readers a derivation with your energy-free analysis of entropy. Work the results out like you think a college chem student should work them out.

    You’re relying on too much woo (of observer dependent ignorance), and not enough glue (of rigor). That’s why your arguments don’t hold.

    Too much woo, and not enough glue, Keiths. Too much woo, and not enough glue.

  6. I’m still not a believer in the “observer-dependent” view of thermodynamic entropy, and if we’re not talking about thermodynamic entropy I think we’re equivocating when we speak of entropy.

    I’m not a believer in the “observer-dependent” view of thermodynamic entropy because I think that, as Lambert says, the probability distributions of thermodynamics to which the Shannon measure of information can be applied are objective, that the SMI calculated will be the same no matter who does the calculation because the probability distribution doesn’t change based on the observer, therefore the entropy will be the same.

    As for Sal, there’s no reason to adopt his “spreading” metaphor just because thermodynamics involves energy, which is what his latest set of mockings reduces to. He’s just being silly.

    Sal appears to have learned thermodynamics a certain way and to think that’s the only way it can be learned. He’s off in the trees with his calculations and formulas and can’t see the forest for the trees.

    I’ve identified two modern textbooks that use Shannon information, so as for this exchange in the OP:

    “Where are the order/disorder and “information” in that?”

    My answer: Nowhere!

    Well, it’s just plain wrong.

  7. Sal,

    Your desperation is showing.

    Here is my simple question again:

    A system Z has an initial energy dispersal of D joules per cubic meter. It undergoes a process in which the entropy increases by a quantity ΔS. What is the final energy dispersal, in joules per cubic meter?

    As I said:

    Sal will be unable to answer this question without more information. Why? Because entropy is not a measure of energy dispersal — it’s a measure of missing information.

  8. keiths:

    He [Damon the Demon] plugs in the number of possible microstates, which is one. (Remember, he knows the microstate exactly.) He gets an entropy of zero.

    You claim he’s wrong. Why?

    walto:

    Because the number of “possible microstates” is not a function of what Damon knows.

    Yes, it is. As I explained earlier in the thread, we’re talking about epistemic possibility, not metaphysical possibility.

    The system is in one and only one microstate at any given moment. If the W in Boltzmann’s equation represented metaphysically possible microstates, as you claim, then its value would always be one, and entropy would always be zero — for every observer, not just Damon.

    He [Lambert] also writes:

    A microstate is the quantum mechanical description of the arrangement at a given instant of all motional energies of all the molecules of system on the energy levels that are accessible to those energies.

    That is not a function of what anybody, even a Laplacian genius, knows.

    The microstate is not a function of what the observer knows. It’s a fact about the system itself. It’s the macrostate that is a function of what the observer knows, and entropy depends on the macrostate, not the microstate.

    In Damon’s case, he knows the microstate exactly, so his macrostate corresponds to one and only one microstate. For human observers, the macrostate corresponds to zillions of microstates.

    Lambert:

    It has been said that an information “entropy” equation–compared to those for thermodynamic entropy–may look like a duck but, without any empowering thermal energy, it can’t quack like a duck or walk like a duck.

    Thermodynamics involves thermal energy. One might interject “Duh!” here.

    That doesn’t make entropy a measure of energy dispersal any more than it makes temperature, or efficiency, or work such a measure.

  9. keiths: Here is my simple question again:

    A system Z has an initial energy dispersal of D joules per cubic meter. It undergoes a process in which the entropy increases by a quantity ΔS. What is the final energy dispersal, in joules per cubic meter?

    As I said:

    Sal will be unable to answer this question without more information. Why? Because entropy is not a measure of energy dispersal — it’s a measure of missing information.

    It does not follow from the fact (if it is a fact) that Sal needs additional information to answer your question that entropy is a measure of missing information. That’s a simple fallacy.

  10. walto,

    It does not follow from the fact (if it is a fact) that Sal needs additional information to answer your question that entropy is a measure of missing information.

    Of course not. That’s established by other means. But it does follow that entropy is not a measure of energy dispersal.

  11. colewd: Where is the experiment that can validate the mathematics of essentially infinite microstates?

    We’re not counting each individual microstate, and many experiments have been performed that validate the mathematics.

  12. Mung,

    I’m not a believer in the “observer-dependent” view of thermodynamic entropy because I think that, as Lambert says, the probability distributions of thermodynamics to which the Shannon measure of information can be applied are objective, that the SMI calculated will be the same no matter who does the calculation because the probability distribution doesn’t change based on the observer, therefore the entropy will be the same.

    If there were only one way to define a macrostate, you would be correct. The choice of macrostate does indeed determine the possible microstates and their probabilities.

    However, it simply isn’t true that there’s only one way to define a macrostate. In my Xavier/Yolanda/Damon example, each observer defines their macrostate differently.

    If you disagree, then explain why Xavier’s macrostate isn’t a legitimate macrostate. Or Yolanda’s, or Damon’s, depending on who you think is using an illegitimate one.

  13. In the modern microscopic interpretation of entropy in statistical mechanics, entropy is the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification. Understanding the role of thermodynamic entropy in various processes requires an understanding of how and why that information changes as the system evolves from its initial to its final condition.

    https://en.wikipedia.org/wiki/Entropy

    It is often said that entropy is an expression of the disorder, or randomness of a system, or of our lack of information about it.

    They forgot to mention “spreading” energy, like butter on bread.

  14. keiths: However, it simply isn’t true that there’s only one way to define a macrostate. In my Xavier/Yolanda/Damon example, each observer defines their macrostate differently.

    Are they macrostates of a thermodynamic system at equilibrium, and do they involve a probability distribution that is a probability distribution of thermodynamics?

    If not, we are probably not talking about the same thing.

  15. Mung, quoting Wikipedia:

    In the modern microscopic interpretation of entropy in statistical mechanics, entropy is the amount of additional information needed to specify the exact physical state of a system, given its thermodynamic specification.

    Bingo.

  16. And the observer-dependence comes from the fact that there is not just one “correct” thermodynamic specification. Yolanda’s, Xavier’s and Damon’s thermodynamic specifications — their macrostates — are all legitimate.

  17. Molecular Driving Forces
    Chapter 1. Principles of Probability
    Section 1. The Principles of Probability Are the Foundations of Entropy

    The concepts that we introduce in this chapter, probability, multiplicity, combinatorics, averages, and distribution functions, provide a foundation for describing entropy.

    My question: Why is Salvador avoiding talking about probabilities and distribution functions?

    My answer: To do so is to give away the game, for those are the concepts of information theory.

  18. Mung,

    Are they macrostates of a thermodynamic system at equilibrium…

    Yes.

    …and do they involve a probability distribution that is a probability distribution of thermodynamics?

    Yes, in the following sense. Once a “style” of macrostate has been chosen and the appropriate variables measured (or stipulated, in a theoretical case), the possible microstates and their probabilities are determined by the physics alone, not by the observer.

  19. Since Keiths is having a problem solving even a trivial thermodynamic problem, I’ll give him an even more trivial (trivialer) problem from a chemistry exam at Simon Fraser University in Canada.

    http://www.sfu.ca/~mxchen/phys3441121/Mt1Solution.pdf

    QUESTION:

    We have a 20 g cube of ice at 273 Kelvin (the melting temperature). The latent heat of melting is 333 Joules/gram. What is the entropy change (neglecting the slight change in volume from solid to liquid phase)?

    SOLUTION:

    T = 273 K

    Hmm, so how much energy is dispersed from the surroundings into the ice cube?

    Delta-q = Energy crossing thermodynamic barrier between environment and ice cube.

    delta-q = 333 J/gram x 20 grams = 6660 Joules

    What is the entropy change? Since the process is isothermal, the Clausius integral evaluates trivially to:

    delta-S = delta-q / T = 6660 J /273 K = 24.4 J/K

    So 6660 Joules of energy gets dispersed from the environment into the ice cube. We just have to scale it down by the temperature to get the quantitative change in entropy of 24.4 J/K.

    So the energy dispersal description works in understanding entropy.
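
    (For completeness, a minimal script that checks the arithmetic; the latent heat and mass are the values from the exam problem above:)

    latent_heat = 333.0     # J per gram of ice melted
    mass = 20.0             # grams
    T = 273.0               # kelvin; melting is isothermal

    q = latent_heat * mass          # heat dispersed into the ice: 6660 J
    print(q, q / T)                 # 6660.0 J and about 24.4 J/K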

    Now Keiths can answer this simple college level chem question using his observer-dependent ignorance measurements and also not reference the energy dispersal of 6660 J going into the ice cube since he says energy dispersal is a misconception.

    He needs to show his work too.

  20. More bluffing and diversion (or should I say ‘dispersion’?) from Sal. He knows he’s on the ropes.

    I repeat:

    Sal,

    Your desperation is showing.

    Here is my simple question again:

    A system Z has an initial energy dispersal of D joules per cubic meter. It undergoes a process in which the entropy increases by a quantity ΔS. What is the final energy dispersal, in joules per cubic meter?

    As I said:

    Sal will be unable to answer this question without more information. Why? Because entropy is not a measure of energy dispersal — it’s a measure of missing information.

  21. The psychology of denial is fascinating.

    Instead of making a “report” to his friends in the ID community, Sal should just link to this thread. They deserve to see what actually happened here, unfiltered by Sal’s Trumpian spin doctoring.

  22. Mung: https://en.wikipedia.org/wiki/Entropy

    They forgot to mention “spreading” energy, like butter on bread.

    That article also has this:

    The second law of thermodynamics states that an isolated system’s entropy never decreases. Such systems spontaneously evolve towards thermodynamic equilibrium, the state with maximum entropy. Non-isolated systems may lose entropy, provided their environment’s entropy increases by at least that decrement. Since entropy is a state function, the change in entropy of a system is determined by its initial and final states. This applies whether the process is reversible or irreversible. However, irreversible processes increase the combined entropy of the system and its environment.

    Can you explain the necessary connection between missing information and the expectation that in isolated systems entropy is always increasing? Why should that be the case? Or is the 2nd Law just wrong?

  23. A couple of examples of Sal’s bullshitting and equivocation:

    Now Keiths can answer this simple college level chem question using his observer-dependent ignorance measurements and also not reference the energy dispersal of 6660 J going into the ice cube since he says energy dispersal is a misconception.

    I don’t claim that “energy dispersal is a misconception”, as Sal knows perfectly well. Energy dispersal is a real and measurable phenomenon. The misconception is to regard entropy as a measure of energy dispersal.

    It isn’t, which is why Sal can’t answer my simple question above.

    Second example: Earlier he calculated some entropies, getting results in units of J/K, then wrote:

    And you end up with the right units. Dispersal of energy at various temperatures, just as Lambert said:

    Entropy change measures the dispersal of energy: how much energy is spread out in a particular process, or how widely spread out it becomes (at a specific temperature).

    Dispersal of energy is dispersal of energy in space. Sal knows this, though he can’t admit that. The units are wrong.

    Since we’re talking about dispersal in space, the correct units are J/m^3 (joules per cubic meter).

    Entropy is expressed either as J/K (joules per kelvin) if we’re using SI units, or bits (or nats) if we are using a system where temperature is defined in terms of energy.

    Joules per kelvin are not joules per cubic meter.

    Bits (and nats) are not joules per cubic meter.

    You cannot determine energy dispersal from an entropy value without additional information. Entropy is not a measure of energy dispersal, and even the units are wrong.
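
    (For the curious, converting between the two unit systems is a one-liner. This is only a sketch, reusing the 24.4 J/K melting-ice figure from earlier in the thread:)

    import math

    k_B = 1.380649e-23      # Boltzmann's constant, J/K
    S_joules_per_K = 24.4   # entropy change of the melting ice, J/K

    bits = S_joules_per_K / (k_B * math.log(2.0))   # same quantity in bits
    print(bits)             # about 2.5e24 bits of missing information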

  24. keiths: You cannot determine energy dispersal from an entropy value without additional information. Entropy is not a measure of energy dispersal, and even the units are wrong.

    So you need an additional input. It’s not ignorance–or any function of some quantity of ignorance.

  25. Keiths:

    Here is my simple question again:

    A system Z has an initial energy dispersal of D joules per cubic meter

    You’re equivocating on the meaning of dispersal, Keiths. The dispersal data must include the dispersal of energy onto the particles, not just the energy per cubic meter, because energy is dispersed onto particles.

    So energy is dispersed onto the particles, and the energy on the particles is dispersed in a volume; ergo, the energy is dispersed both in a volume and onto particles. You just focus on dispersal of energy in a volume.

    For gases, something like this describes the dispersal of energy onto particles and dispersal of particles (therefore the energy in the particles) in a volume.

    U_per_molecule = 6.2129 x 10^-21 J
    V_allotted_per_molecule = 1.6605 x 10^-24 m^3

    Then I showed that the entropy for an ideal monatomic gas with this dispersal profile is:

    S_per_molecule = 2.6064 x 10^-22 J/K/molecule

    In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

    If you bothered to study it, you’d see again that you’re equivocating on the meaning of energy dispersal. You get called on the equivocation and you make more. What’s the deal?

    Look again at the way I describe energy dispersal in that case:

    U_per_molecule = 6.2129 x 10^-21 J
    V_allotted_per_molecule = 1.6605 x 10^-24 m^3

    If I used your equivocation for energy dispersal to calculate entropy it would be:

    Keiths_dispersal = 3741.5 J / m^3

    Is that the way I stated energy dispersal for that situation? No. That’s your equivocation.

    You’re playing the same Keiths game again: interpreting the words and phrases of Lambert and other chemists and physicists according to Keiths’ equivocations, not according to the way Lambert and other chemists and physicists interpret their own words.

  26. walto,

    So you need an additional input. It’s not ignorance–or any function of some quantity of ignorance.

    It doesn’t need to be. The missing information is already represented by the entropy.

  27. Sal,

    You’re equivocating the meaning of dispersal Keiths,

    No, you are.

    Earlier in the thread you were quite happy to express energy dispersal in joules per cubic meter. For example, you wrote:

    This describes how the energy of Gas A becomes more spatially dispersed after mixing resulting in a volume kinetic energy density change from 3741.3 J/cubic meter to 1870.65 J/cubic meter. Likewise the energy of Gas B becomes more spatially dispersed after mixing resulting in a volume kinetic energy density change from 6235.5 J/cubic meter to 3117.75 J/ cubic meter.

    Now, suddenly, you’re claiming that energy dispersal has units of joules per kelvin. Why? It’s obvious. You’re equivocating in a futile attempt to rescue your untenable position. It’s blatant and quite sleazy.

    Entropy is not a measure of energy dispersal. It has the wrong units.

  28. me: So you need an additional input. It’s not ignorance–or any function of some quantity of ignorance.

    Keiths: It doesn’t need to be. The missing information is already represented by the entropy.

    And thus was the circle made complete.

  29. It isn’t circular reasoning, walto. Draw it out on paper if you need to.

    If you can’t figure it out, I’ll explain later. I’ve got to run right now.

  30. Keiths:

    Dispersal of energy is dispersal of energy in space.

    That’s the Keiths equivocation and mis-interpretation of what Lambert and others are saying.

    Dispersal of energy is dispersal onto particles AND the dispersal of energy in space through the energy bearing particles. The state of dispersal is not defined by one quantity.

    When I throw heat at an ideal gas in a container, I disperse the heat energy onto the particles, not just the space of the container. The energy per particle increases. Is that too hard for you to comprehend Keiths?

    That’s the intended interpretation by Lambert in my monatomic gas example. I managed to figure this out. You obviously have not and, since you want to save face, will insist on misinterpreting Lambert’s words rather than letting Lambert interpret himself.

    These two aspects of dispersal are accounted for in the ideal monatomic gas example I gave, and the dispersal can be quantitatively defined by:

    U_per_molecule = Energy_per_molecule = 6.2129 x 10^-21 J
    Energy_bearing_molecules_per_cubic_meter = 1 / (1.6605 x 10^-24 m^3)

    which, for the sake of fitting it into the Sackur-Tetrode equation, I stated as:

    U_per_molecule = 6.2129 x 10^-21 J
    V_allotted_per_molecule = 1.6605 x 10^-24 m^3

    With this information alone, I could calculate the entropy as:

    S_per_molecule = 2.6064 x 10^-22 J/K/molecule

    You on the other hand want me to use your definition of dispersal which only has one parameter, and this is what it looks like:

    Keiths_dispersal = 3741.5 J / m^3

    Does that look anything like the way I describe dispersal? No. That’s Keiths’ revisionism of how other people define their own phrases, words, and concepts.

  31. Now that I just called out keiths on his equivocations and his fallacious challenges, maybe he can show the readers how he answers a simple entropy question about an ice cube on a chemistry exam at Simon Fraser University using his patented methods of observer-dependent ignorance to compute entropy.

    Ok Keiths, answer the professor’s question with your patented methods of ignorance:

    http://www.sfu.ca/~mxchen/phys3441121/Mt1Solution.pdf

    QUESTION:

    We have a 20 g cube of ice at 273 Kelvin (the melting temperature). The latent heat of melting is 333 Joules/gram. What is the entropy change (neglecting the slight change in volume from solid to liquid phase)?

    The answer is 24.4 J/K, but you’ll have to show your work, otherwise you’re cheating.

    Remember also, you can’t use the energy dispersed from the environment into the ice cube to calculate your answer since you said energy dispersal is a misconception of entropy. You obviously have another way of calculating entropy. So show it.

    Or are you finally going to cave and admit the amount of energy dispersal from environment to ice cube is an essential part of answering the exam question?

  32. For the reader’s benefit, in light of Keith’s equivocations, I wish to try to add some clarifications.

    A qualitative way to understand the 2nd law is to note that heat from a hot object tends to flow naturally to a cold object unless we do something to hinder or reverse this tendency (like having an air conditioner move heat from a cold house to the hot environment on a summer day). Engineers and scientists needed the concept of entropy to help them build steam engines and air conditioners and to do chemistry.

    Energy in the form of heat spreads (disperses) from the hot object into the cold object, or from a hot environment into a cold object. Or conversely, a hot object spreads (disperses) its energy into a cold environment.

    As heat spreads into a cold object, the cold object’s entropy increases.

    There are 3 major case examples in this discussion so far.

    The simplest illustration of entropy increase is the ice cube melting. Heat from the environment spreads (disperses) into the ice cube. It is the simplest case because the process is isothermal (same temperature) at 0 degrees Celsius (273 K) and approximately isochoric (same volume of ice/water) and isobaric (same pressure).

    As in the Simon Fraser University chemistry exam for the 20 gram ice cube, 6660 Joules of energy enter the ice cube at 273 K. The solution of the Clausius integral is relatively straightforward:

    Integral (dq/T) = 6660 J / 273 K = 24.4 J/K

    Btw, 6660 Joules of energy is like the energy expended by a 100 watt light bulb running for 66.6 seconds.

    In the case of the 1555 gram copper block going from 298 K to 373 K, energy from the environment or heater spreads (disperses) into the copper block. Since the temperature increases as 45,484 Joules of energy are pumped (dispersed) into the block, the Clausius integral is modestly more difficult to calculate, but there are canned formulas to help the student in that case. The student just needs to recognize which formula to use; otherwise he has to derive it from first principles.

    The Clausius integral for that case:

    Integral (dq/T) = C ln ( T_final/T_initial)

    where
    C = Heat Capacity
    T_final = final temperature
    T_initial = initial temperature

    for the case of copper I worked out the change in entropy as:
    136.1 J/K

    See:

    In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann
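
    (A short cross-check of the copper numbers, offered only as a sketch; I am assuming a constant specific heat of about 0.39 J per gram per kelvin for copper, the value that reproduces the 45,484 J figure:)

    import math

    c_copper = 0.39          # specific heat of copper, J/(g K), assumed constant
    mass = 1555.0            # grams
    T1, T2 = 298.0, 373.0    # kelvin

    C = c_copper * mass                 # heat capacity, about 606 J/K
    print(C * (T2 - T1))                # about 45,484 J dispersed into the block
    print(C * math.log(T2 / T1))        # about 136 J/K change in entropy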

    A substantially more difficult case is the case of an ideal gas because not only can we pump heat energy into a gas, we can also spread the gas outward through expansion or mixing with another gas.

    Dispersal or spreading of energy into the gas without changing the volume will increase the entropy of the gas. However, increasing the volume without adding (kinetic) energy to the gas molecules will also change the entropy. Spreading out the gas without changing the energy of the gas molecules (as in expanding it without changing the temperature) spreads out the kinetic energy, and with some work it can be seen that its entropy increases.

    So for the case of the gas, “dispersal” of energy has two meanings: dispersal or spreading of energy into the gas, and dispersal of the gas (and its energy) over a larger volume (at the same temperature). Either of these two modes of energy dispersal will increase entropy. The complexity of this situation can be seen in the solutions I provided for entropy change here:

    In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

    In the more complex case of ideal monatomic gases, dispersal of energy among the individual particles must be accounted for, and dispersal of energy over the volume the gas occupies must be accounted for. It is not as simple as Keiths’ equivocation of defining the dispersal as energy divided by volume. He relied on equivocating on the intended meaning of the phrases used by Lambert and others (like Dr. Mike).
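
    (The isothermal doubling of the helium’s volume can also be cross-checked directly, as a sketch; n = 1 mole and R is the gas constant:)

    import math

    R = 8.314462        # gas constant, J/(mol K)
    n = 1.0             # moles of helium
    V1, V2 = 1.0, 2.0   # cubic meters

    print(n * R * math.log(V2 / V1))   # about 5.76 J/K, i.e. 162.726 - 156.963 J/K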

    I’ve provided the procedures using the qualitative notion of energy dispersal as a pedagogical aid in understanding the change in entropy in 3 cases.

    Keiths has yet to apply his observer-dependent, “ignorance is entropy” method to independently arrive at the answers I provided in the 3 cases.

    Worse, he cannot even begin to do his ignorance computation if he avoids accounting for the most essential ingredient in the analysis, namely the way energy is distributed (dispersed) in the system in question. Likewise for DNA_jock.

  33. Lol. I see Sal has been digging in my absence. A couple of quick points before bed.

    Sal claims I’ve been equivocating on the meaning of ‘energy dispersal’. Let’s see about that.

    How many ways have I used the term in this thread? One, and the units have always been joules per cubic meter. (The equilibrium energy distributions have all been uniform.)

    How many ways has Sal used the term in this thread? At least five, and the units have been

    1) joules per cubic meter,
    2) joules per cubic meter per particle,
    3) bits,
    4) joules per kelvin, and
    5) joules per kelvin per particle.

    So I’ve stuck to a single meaning while Sal has used at least five. Yet according to Sal, I’m the one who is equivocating. How does that work, exactly?

  34. Second point. I claim that entropy is a measure of the missing information regarding a system’s microstate, not a measure of energy dispersal.

    Sal inanely takes this to mean that I’m not allowed to use energy in my calculations:

    Vindicate yourself, Keiths, show the readers a derivation with your energy-free analysis of entropy. Work the results out like you think a college chem student should work them out.

    That’s ridiculous, of course. Energy is central to thermodynamics, and so of course anyone doing thermo will take energy into account. I didn’t say that energy had no place in thermo; I said that entropy isn’t a measure of energy dispersal.

    It’s as if I were asserting that a car’s range is a measure of distance, in units of miles, with Sal retorting “Oh, yeah? Then you should be able to calculate the range without taking the gas tank capacity and mpg into account.”

    It Does Not Follow.

  35. Also, this is funny.

    Sal writes:

    Keiths_dispersal = 3741.5 J / m^3

    Does that look anything like the way I describe dispersal? No.

    Um, yes:

    This describes how the energy of Gas A becomes more spatially dispersed after mixing resulting in a volume kinetic energy density change from 3741.3 J/cubic meter to 1870.65 J/cubic meter.

  36. keiths:

    Do you think the concept of degrees of freedom is similarly observer-dependent? For example, do you think that a rigid body has fewer than six degrees of freedom in three dimensions if we know which way it has to move given the laws of physics (i.e., only one degree for Damon), and, perhaps, more than six if we don’t know anything whatever–say, from the perspective of a turtle? Does a coin have, from Damon’s POV, a possibility space of one or two with respect to landing H or T?

  37. Mung,

    We’re not counting each individual microstate, and many experiments have been performed that validate the mathematics.

    Do you have a reference to any? What practical problem can you solve with the description of entropy that you cited in Wiki?

  38. colewd:
    Mung,

    Do you have a reference to any? What practical problem can you solve with the description of entropy that you cited in Wiki?

    I don’t know anything about the ID or creationism angle here (and am not too interested), but it seems to me that if you strain that contention away, the more one thinks about this issue, the more comical the dispute gets. (Which is why I think Prof. Moran simply shrugged his shoulders at the OP.)

    I think I can see now, anyhow, why keiths and Jock are reluctant to give up the word “entropy” and would prefer to have chemists make up their own term for what THEY use, say, “c-entropy”. After all, if one has got Boltzmann, and he’s the granddaddy of the concept, why should one yield the term?

    Sal’s interesting (linked-to above) piece re the relations of the various equations, as well as about 50 Wikipedia articles about the history of the term, are also informative on this issue, I think. In a word, Boltzmann made this little, elegant equation defining entropy, but it has to be converted to more complicated equations to be useful in the physical sciences (though perhaps it’s fine as is for information science). So the thing is to look at those conversions. One of them requires the use of degrees of freedom, which is why I brought it up above.

    I believe, though I’m happy to be corrected, that if degrees-of-freedom is taken as an observer-relative concept, the derivations of equations that are useful to chemists can’t be made. OTOH, if it’s taken as absolute, then, given an observer-relative (and arguably Boltzmannian) notion of entropy, the equations won’t be equivalent to Boltzmann’s original.

    So, in the end, it seems to me that this is nothing but a fight over who gets to use the word “entropy.” Lambert says in effect, “I want the word for what chemists and others have long been doing with what THEY call it–and it’s not a measure of disorder or lack of information.” Sal agrees with that perspective. But keiths, Jock and others want the word for what Boltzmann and Shannon meant by it, and if the derivations to what physical scientists use don’t work–tough nougies on them: they can get their own term.

    Anyhow, that’s how I currently see this kerfuffle. However, if there’s anything in the book that mung cited above regarding the derivability of the 2nd Law from the Boltzmann version that I can understand (which is doubtful, I’m afraid), I hope he will post it. My layman’s guess is that the claimed derivation will again require the use of some OTHER equation that’s not actually provably equivalent to Boltzmann’s, which arguably requires an observer-relative count of possible microstates, just as keiths has claimed.

    keiths asks, which is the REAL chance of such-and-such happening, if it is not that calculated by Xavier, Yolanda, or Damon. And, of course, from that angle there is no answer: disorder is relative to some concept of order. But from another angle, with coin flips there remain H and T, two possibilities–even if one knows to a moral certainty that H will come up. And that concept is necessary to get anything like what Lambert wants to have available.

    So I now want to change my vote to abstain, in the “Who gets the name?” game. I don’t care. (Again, I have no clue how this matters to arguments about evolution and design, but my sense is that it shouldn’t.)

    Thus it was spoken by the child, and thus it was verily written.

  39. stcordova: You’re equivocating on the meaning of dispersal, Keiths. The dispersal data must include the dispersal of energy onto the particles, not just the energy per cubic meter, because energy is dispersed onto particles.

    So energy is dispersed onto the particles, and the energy on the particles is dispersed in a volume; ergo, the energy is dispersed both in a volume and onto particles. You just focus on dispersal of energy in a volume.

    LoL. Hi Sal, I’m sure you’re best buddies with David Snoke. Run this by him. Ask him to come visit this thread.

  40. In thermodynamics, the internal energy of a system is the energy contained within the system, including the kinetic and potential energy as a whole. It keeps account of the gains and losses of energy of the system that are due to changes in its internal state.

    If the containing walls pass neither matter nor energy, the system is said to be isolated and its internal energy cannot change. The first law of thermodynamics may be regarded as establishing the existence of the internal energy.

    The internal energy is one of the two cardinal state functions of the state variables of a thermodynamic system.

    https://en.wikipedia.org/wiki/Internal_energy

  41. stcordova: When I throw heat at an ideal gas in a container, I disperse the heat energy onto the particles, not just the space of the container. The energy per particle increases.

    🙂

    You mean the number of particles doesn’t actually increase?

    ETA: Because if you had more particles there would be more to disperse (assuming they weren’t already dispersed, which they probably were)!

  42. keiths: So I’ve stuck to a single meaning while Sal has used at least four. Yet according to Sal, I’m the one who is equivocating. How does that work, exactly?

    You’re equivocating because you have stuck to one meaning when you should be using at least five different meanings, like Sal.

    It’s like insisting on using one meaning of entropy when in fact there really are dozens of meanings of entropy (or at least five).

  43. walto,

    I don’t know anything about the ID or creationism angle here (and am not too interested), but it seems to me that if you strain that contention away, the more one thinks about this issue, the more comical the dispute gets. (Which is why I think Prof. Moran simply shrugged his shoulders at the OP.)

    I don’t think anyone here supports the creationist angle, but I may be mistaken since I am not really sure what it is. I think sorting out the second law is very important for physics and cosmology. The contention on this blog is similar to what’s going on in the scientific community in general. I personally am confused by the subject and hope this discussion will yield more clarity. I think discussing the third law, which describes entropy at absolute zero, may help.

  44. I am in no real position to judge, but walto’s perspective sounds promising.

    One issue is whether the two views (Lambert vs. Ben-Naim) make different predictions about any other process. For that matter, how often is any “entropy” an ex post facto rationalization, after one has determined by other means what will actually happen?

    Having exposed my ignorance, let me fall silent for a while.

  45. walto:

    I think I can see now, anyhow, why keiths and Jock are reluctant to give up the word “entropy” and would prefer to have chemists make up their own term for what THEY use, say, “c-entropy”

    Emphatically no.

    The thermodynamic entropy of the chemists is the same as the thermodynamic entropy of the “missing informationists”, and it’s governed by the same equations. There’s no need for a new name.

    The problem is not that two concepts are “competing” for the same name. It’s simply that Lambert, Elzinga, Sal, and everyone else who buys into the idea that entropy is a measure of energy dispersal is mistaken.

    The deep irony is that Lambert, whose mission is to root out the “entropy is disorder” misconception, has replaced it with another misconception.

Leave a Reply