He’s baaack

Granville Sewell has posted another Second Law screed over at ENV. The guy just won’t quit. He scratches and scratches, but the itch never goes away.

The article is mostly a rehash of Sewell’s confused fulminations against the compensation argument, with an added dash of tornadoes running backwards.

Then there’s this gem:

But while Behe and his critics are engaged in a lively debate as to whether or not the Darwinian scheme for violating the second law has ever been observed to result in any non-trivial increase in genetic information…

Ah, yes. Our scheme for violating the second law. It might have succeeded if it weren’t for that meddling Sewell.

153 thoughts on “He’s baaack”

  1. keiths:

    Knowing your attention span…

    J-Mac:

    I love you too😊
    Thanks o lot!!!

    I call ’em as I see ’em.

  2. Rumraket: Designers don’t violate the 2nd law, they are manifestations of it. To arrange those objects and parts into some complex structure, to even think the plan, the design of it up, takes the conversion of available energy into less usable forms and the total entropy is increased. The brain literally runs hot (and so do your muscles), and you have to eat to keep it all running. It’s a big flesh-computer, and if you pull the plug, its temperature equilibrates with the surroundings and your thinking (and any other kind of behavior) stops.

    Exactly. I don’t know why ID/creationists focus on biology like they do when refrigerators shouldn’t work according to their screeds. When they are able to cope with how refrigerators are able to decrease entropy within the fridge, then perhaps they could step up to biology.
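
    To put numbers on the fridge (a minimal Python sketch; the heat and work figures are made up for illustration, and I’m assuming all the rejected heat is dumped into the kitchen at room temperature):

        # The fridge "paradox" in numbers: heat Q_c leaves the cold interior,
        # so the inside's entropy drops, but the kitchen gains more.
        Q_c = 100.0   # J removed from the interior (assumed)
        W = 30.0      # J of electrical work driving the compressor (assumed)
        T_c = 275.0   # K, inside the fridge
        T_h = 298.0   # K, the kitchen

        dS_inside = -Q_c / T_c          # about -0.36 J/K: a local decrease
        dS_kitchen = (Q_c + W) / T_h    # about +0.44 J/K: dumped out the back
        print(dS_inside + dS_kitchen)   # about +0.07 J/K >= 0: no violation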

  3. Rumraket: Technically he’s just flat out wrong. AFAIK statistical mechanics says computers (or any other imaginable material objects) can, in point of fact, appear on barren planets. In fact a similar question is considered a fundamental problem in physics: The spontaneous appearance of Boltzmann brains even from a state of equilibrium.

    I remember once driving with a friend through Amarillo, Texas.

    I said, “Hey, look at all those colorful Cadillacs that have formed spontaneously, sticking up out of the ground. Isn’t that interesting?”

    And he was like “Huh?”

    And I was like, “What?”

  4. T_aquaticus:

    When they are able to cope with how refrigerators are able to decrease entropy within the fridge, then perhaps they could step up to biology.

    Judging from his article, Sewell would presumably say that refrigerators violate the second law, and that they are able to do so because they are designed by an intelligence. He writes:

    Well, now I have to admit that I also have a scheme that I believe can defeat the generalized second law. My scheme is called “intelligence.” But while Behe and his critics are engaged in a lively debate as to whether or not the Darwinian scheme for violating the second law has ever been observed to result in any non-trivial increase in genetic information, we can watch my scheme create spectacular amounts of order and information every day, in every writer’s office, in every inventor’s lab, and in every R&D division of every engineering firm throughout our civilization. You can even try it yourself, at home.

    The poor guy actually thinks that the second law is being violated continually on earth.

  5. keiths:
    I call ’em as I see ’em.

    Note that J-Mac couldn’t go beyond “Knowing your attention span”.

  6. keiths,

    What a delightful example! Of course, the statement that Styer ridicules, “Entropy… is why cars rust…” is absolutely correct, just not for the reason that Wedekind thinks. As others have noted on this thread, cars rust because the entropy of the surroundings increases far more than the entropy of the car declines. It’s compensation all the way…

  7. Entropy,

    Note that J-Mac couldn’t go beyond “Knowing your attention span”.

    Ha ha. I should have anticipated that, shouldn’t I?

    Here’s how I could have made my point while retaining J-Mac’s interest:

    Knowing your quantum attention span…

    🙂

  8. DNA_Jock,

    What a delightful example! Of course, the statement that Styer ridicules, “Entropy… is why cars rust…” is absolutely correct, just not for the reason that Wedekind thinks.

    Right.

    As others have noted on this thread, cars rust because the entropy of the surroundings increases far more than the entropy of the car declines. It’s compensation all the way…

    Yes. And Sewell fails to realize that by denying compensation, he is denying the second law itself. I explained this six years ago in one of his threads at UD:

    CS3,

    I’ve mentioned this a couple of times already but people (including you) haven’t picked up on it, so let me try again.

    When Granville argues against the compensation idea, he is unwittingly arguing against the second law itself.

    It’s easy to see why. Imagine two open systems A and B that interact only with each other. Now draw a boundary around just A and B and label the contents as system C.

    Because A and B interact only with each other, and not with anything outside of C, we know that C is an isolated system (by definition). The second law tells us that the entropy cannot decrease in any isolated system (including C). We also know from thermodynamics that the entropy of C is equal to the entropy of A plus the entropy of B.

    All of us (including Granville) know that it’s possible for entropy to decrease locally, as when a puddle of water freezes. So imagine that system A is a container of water that becomes ice.

    Note:

    1. The entropy of A decreases when the water freezes.

    2. The second law tells us that the entropy of C cannot decrease.

    3. Thermodynamics tells us that the entropy of C is equal to the entropy of A plus the entropy of B.

    4. Therefore, if the entropy of A decreases, the entropy of B must increase by at least an equal amount to avoid violating the second law.

    The second law demands that compensation must happen. If you deny compensation, you deny the second law.

    Thus Granville’s paper is not only chock full of errors, it actually shoots itself in the foot by contradicting the second law!

    It’s a monumental mess that belongs nowhere near the pages of any respectable scientific publication. The BI organizers really screwed up when they accepted Granville’s paper.
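
    To put numbers on the freezing example (a minimal Python sketch; the latent heat is standard, but the 263 K surroundings are an assumption):

        # Compensation in the freezing example: 1 kg of water freezes at
        # T_melt = 273.15 K while the surroundings sit at T_surr = 263.15 K.
        L_f = 3.34e5     # latent heat of fusion of water, J/kg
        T_melt = 273.15  # K, temperature of the freezing water (system A)
        T_surr = 263.15  # K, temperature of the surroundings (system B)
        m = 1.0          # kg

        dS_A = -m * L_f / T_melt   # entropy of A decreases: about -1223 J/K
        dS_B = +m * L_f / T_surr   # entropy of B increases: about +1269 J/K
        dS_C = dS_A + dS_B         # entropy of the isolated system C

        print(dS_A, dS_B, dS_C)    # dS_C is about +46 J/K > 0: compensation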

  9. Joe Felsenstein: Classical thermodynamics of course applies to living human beings. The fact that it does not say anything about some other issue is irrelevant to that.

    Thanks for the correction.

    Nice to see you still visiting us at TSZ. If you leave here, this site will die!

  10. Keiths,

    The Ben-Naim approach is the most generalized, but as far as measurement goes, that doesn’t necessarily make it “superior”, because temperature and volume can be measured in the lab; ignorance can’t be measured so easily. That’s why Dr. Mike liked Clausius’ definition of entropy.

    Dr. Mike used the term “spreading out of energy”, and for most chemical and engineering applications it’s a good approximation of Clausius’ definition.

    So what is “superior” is in the eye of the beholder.

    Rather than using the words “information” or “ignorance”, the term “uncertainty” is a little more accurate, because it conveys what can be MEASURED in principle, not necessarily actual knowledge. IGNORANCE has connotations related to our own mental understanding, but uncertainty conveys the notion of what is measurable in principle independent of our mental faculties.

  11. phoodoo:

    Just wait till those Cadillacs get moved to a junkyard where tornadoes can get a hold of them!

  12. Fair Witness:
    phoodoo:

    Just wait till those Cadillacs get moved to a junkyard where tornadoes can get a hold of them!

    Well, Rumraket will tell you, anything is possible.

    I moved to a new town recently. Houses everywhere. I wonder if they were built. Could just be like evolution, who knows?

  13. Ha ha. I should have anticipated that, shouldn’t I?

    It happens all the time. I keep writing more than one sentence to the guy/gal and, of course, (s)he doesn’t even understand the only one (s)he has enough attention span to “read.” If that can be called “reading.”

  14. Fair Witness to phoodoo:

    Just wait till those Cadillacs get moved to a junkyard where tornadoes can get a hold of them!

    Especially if the tornadoes are running backwards, à la Sewell.

  15. Which just goes to show that even bright people can say stupid things at times.

  16. As I’ve argued with BruceS, and fmm, I think the Boltzmann Brain thing is nonsense. I know physicists take it seriously, but I’m no physicist! I can’t envisage the high-entropy state into which these things are supposed to plausibly pop – even by appeal to Very Long Time. If matter can barely move (e.g. a black hole), it’s not going to make ‘STP’ structures, ever. Likewise if it is diffuse.

  17. stcordova,

    Have I given you the benefit of my opinion on Hoyle’s unutterably stupid ‘747’ analogy on any of the other dozens of times you’ve brought it up, perchance?

  18. Sal,

    The Ben-Naim approach is the most generalized, but as far as measurement goes, that doesn’t necessarily make it “superior”, because temperature and volume can be measured in the lab; ignorance can’t be measured so easily.

    You don’t have to measure ignorance in order to use the missing information interpretation of entropy. When you measure thermodynamic variables such as temperature and pressure, you automatically get missing information. After all, temperature and pressure are just averages.

    Dr. Mike used the term “spreading out of energy” and for most chemical and engineering applications, it’s a good approximation of Clausius.

    The problem is that energy dispersal fails as a definition of entropy, because there are counterexamples. The missing information interpretation, by contrast, always works.

    IGNORANCE has connotations related to our own mental understanding, but uncertainty conveys the notion of what is measurable in principle independent of our mental faculties.

    That’s why I use the phrase “missing information”, which is synonymous with “uncertainty”.

  19. keiths

    Judging from his article, Sewell would presumably say that refrigerators violate the second law, and that they are able to do so because they are designed by an intelligence. He writes:

    The poor guy actually thinks that the second law is being violated continually on earth.

    If only that were true. Just imagine what we could do. Heck, we could solve the energy crisis since we could make machines that drive without needing any energy, and could actually produce energy as they were driven. We could condense water and produce electricity at the same time, which would be great for sub-Saharan Africa.

    The absolute idiocy of thinking an intelligence can just violate the laws of thermodynamics willy-nilly would be hilarious, if it weren’t so sad.

  20. keiths,

    The problem is that energy dispersal fails as a definition of entropy, because there are counterexamples.

    Do you have an example of a counterexample?

  21. keiths:

    The problem is that energy dispersal fails as a definition of entropy, because there are counterexamples.

    Allan:

    Do you have an example of a counterexample?

    Or a counterexample of an example? 🙂

    Here’s my favorite, from John Denker:

    As another example, consider two counter-rotating flywheels. In particular, imagine that these flywheels are annular in shape, i.e. hoops, as shown in figure 9.7, so that to a good approximation, all the mass is at the rim, and every bit of mass is moving at the same speed. Also imagine that they are stacked on the same axis. Now let the two wheels rub together, so that friction causes them to slow down and heat up. Entropy has been produced, but the energy has not become more spread-out in space. To a first approximation, the energy was everywhere to begin with and everywhere afterward, so there is no change.

    If we look more closely, we find that as the entropy increased, the energy dispersal actually decreased slightly. That is, the energy became slightly less evenly distributed in space. Under the initial conditions, the macroscopic rotational mechanical energy was evenly distributed, and the microscopic forms of energy were evenly distributed on a macroscopic scale, plus or minus small local thermal fluctuations. Afterward, all the energy is in the microscopic forms. It is still evenly distributed on a macroscopic scale, plus or minus thermal fluctuations, but the thermal fluctuations are now larger because the temperature is higher. Let’s be clear: If we ignore thermal fluctuations, the increase in entropy was accompanied by no change in the spatial distribution of energy, while if we include the fluctuations, the increase in entropy was accompanied by less even dispersal of the energy.
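
    A back-of-the-envelope version of Denker’s flywheels (a Python sketch; the hoop mass, radius, speed, and steel heat capacity are all assumed for illustration):

        import math

        # Two identical counter-rotating steel hoops, rubbed to a stop.
        m = 10.0     # mass of each hoop, kg (assumed)
        r = 0.5      # radius, m (assumed)
        w = 100.0    # angular speed of each hoop, rad/s (assumed)
        c = 450.0    # specific heat of steel, J/(kg K)
        T_i = 293.0  # initial temperature, K

        KE = 2 * 0.5 * (m * r**2) * w**2  # hoop: I = m r^2; total KE = 25 kJ
        C = 2 * m * c                     # total heat capacity, 9000 J/K
        T_f = T_i + KE / C                # all KE becomes heat: ~295.8 K

        dS = C * math.log(T_f / T_i)      # about +85 J/K, with the energy
        print(T_f, dS)                    # sitting exactly where it started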

  22. And a comment from an old thread listing six reasons why entropy cannot be a measure of energy dispersal:

    From earlier in the thread, six reasons why the dispersalists are wrong:

    1. Entropy has the wrong units. Dispersalists and informationists agree that the units of thermodynamic entropy are joules per kelvin (in the SI system) and bits or nats in any system of units where temperature is defined in terms of energy per particle. It’s just that the dispersalists fail to notice that those are the wrong units for expressing energy dispersal.

    2. You cannot figure out the change in energy dispersal from the change in entropy alone. If entropy were a measure of energy dispersal, you’d be able to do that.

    3. The exact same ΔS (change in entropy) value can correspond to different ΔD (change in dispersal) values. They aren’t the same thing. Entropy is not a measure of energy dispersal.

    4. Entropy can change when there is no change in energy dispersal at all. We’ve talked about a simple mixing case where this happens. If entropy changes in a case where energy dispersal does not change, they aren’t the same thing.

    5. Entropy change in the gas mixing case depends on the distinguishability of particles — the fact that the observer can tell ideal gas A from ideal gas B. Yet the underlying physics does not “care” about distinguishability — the motion of the particles is the same whether or not they are distinguishable. If the motion of the particles is the same, then energy dispersal is the same. (See the numeric sketch after this list.)

    The entropy change depends on distinguishability, so it cannot be a measure of energy dispersal.

    6. Entropy depends on the choice of macrostate. Energy dispersal does not.

    The way energy disperses in a system is dependent on the sequence of microstates it “visits” in the phase space. That sequence depends only on the physics of the system, not on the choice of macrostate by the observer.
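
    Here is point 5’s mixing case in numbers (a minimal sketch, assuming one mole of ideal gas on each side of the barrier, at equal volumes and temperatures):

        import math

        R = 8.314  # gas constant, J/(mol K)
        n = 1.0    # moles of gas on each side

        # Each gas doubles its available volume when the barrier is removed.
        dS_each = n * R * math.log(2)

        # Distinguishable gases (A != B): both expansions count.
        dS_distinguishable = 2 * dS_each   # about +11.5 J/K

        # Identical gases: the macrostate is unchanged, so dS = 0,
        # even though the particles move exactly the same way.
        dS_identical = 0.0

        print(dS_distinguishable, dS_identical)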

  23. And:

    If entropy is a measure of energy dispersal, he [Sal] should be able to

    1) respond to each of my six points against dispersalism, including the rather obvious problem that entropy has the wrong units for energy dispersal;

    2) explain why Xavier and Yolanda see different entropy values despite looking at the same physical system with the same physical energy distribution;

    3) explain why entropy increases in Denker’s grinding wheel example, though energy dispersal does not; and

    4) explain why entropy “cares” about the distinguishability of particles, when energy dispersal does not.

  24. keiths:

    Here’s my favorite, from John Denker:

    As another example, consider two counter-rotating flywheels. In particular, imagine that these flywheels are annular in shape, i.e. hoops, as shown in figure 9.7, so that to a good approximation, all the mass is at the rim, and every bit of mass is moving at the same speed. Also imagine that they are stacked on the same axis. Now let the two wheels rub together, so that friction causes them to slow down and heat up. Entropy has been produced, but the energy has not become more spread-out in space. To a first approximation, the energy was everywhere to begin with and everywhere afterward, so there is no change.

    That’s already wrong. First, entropy has increased as the kinetic energy of rotation is transformed into heat in the flywheels, and usable energy is lost in the exchange. Next, only the surfaces of the flywheels that are touching one another will initially heat up, but that heat will spread to the rest of the flywheel and also be radiated out. That is an increase in entropy.

  25. T_aquaticus,

    Denker’s point doesn’t depend on what happens during equilibration.

    Define Zi as the initial state of the system before the hoops are in contact, and Zf as the final state after the rotation has stopped and the heat of friction has spread throughout the hoops. Thermodynamic entropy is a state variable, meaning it does not depend on the path the system takes through phase space. Thus it’s irrelevant what happens during equilibration. Zi and Zf are all that matter for determining entropy.

    The entropy associated with Zf is obviously greater than the entropy associated with Zi, yet the energy hasn’t spread out any more than before. Entropy therefore can’t be a measure of energy dispersal, as my other points also demonstrate.

  26. Energy dispersal is in PHASE space, not volume space only! GRRR! Dr. Mike and Dr. Lambert acknowledge this.

  27. Sal,

    Energy dispersal is in PHASE space, not volume space only!

    That doesn’t even make sense. At any given moment, the system occupies only one point in phase space.

  28. keiths:
    Sal,

    That doesn’t even make sense. At any given moment, the system occupies only one point in phase space.

    In Lambert’s words:
    http://entropysite.oxy.edu/cracked_crutch.html

    Increasing amounts of energy dispersed among molecules result in increased entropy that can be interpreted as molecular occupancy of more microstates.

    The microstates are found in phase space, not volume space alone. The greater the entropy, the greater the number of microstates occupied over time for a given internal energy.

    The cold fast spinning disks have fewer microstates than the hot slow spinning disks.

  29. More from Lambert:

    The general statement about entropy in molecular thermodynamics can be: “Entropy measures the dispersal of energy among molecules in microstates. An entropy increase in a system involves energy dispersal among more microstates in the system’s final state than in its initial state.”

    If other people want to redefine what Lambert means by “dispersal”, that’s on them, but it doesn’t represent what Lambert meant by dispersal. This applies to the supposed refutation involving spinning disks.

  30. Sal,

    The microstates are found in phase space, not volume space alone.

    The system is only in one microstate at a time, meaning that it occupies a single point in phase space at any given moment. The energy — all of it — is concentrated in that single state. That’s not “dispersal”.

    Even setting that aside, you still have a problem with units. Thermodynamic entropy is measured neither in “joules per cubic meter” nor in “joules per microstate”. It’s measured in joules per kelvin.

    Entropy is simply not a measure of energy dispersal. It’s often associated with energy dispersal, but that’s not the same thing. Not by a long shot.

  31. Sal,

    The cold fast spinning disks have fewer microstates than the hot slow spinning disks.

    Right. More possible microstates are compatible with the high entropy macrostate than with the low entropy macrostate. In other words, it would take a greater amount of additional information to pin down the exact microstate in the high entropy state than in the low-entropy state.

    Entropy is a measure of that missing information.
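
    A toy illustration (the microstate counts are made up and absurdly small; real thermodynamic systems involve astronomically more):

        import math

        # Missing information: the number of yes/no questions needed, on
        # average, to pin down the exact microstate, given the macrostate.
        W_low = 2**10    # microstates compatible with the low-entropy macrostate
        W_high = 2**40   # microstates compatible with the high-entropy macrostate

        bits_low = math.log2(W_low)    # 10 bits missing
        bits_high = math.log2(W_high)  # 40 bits missing

        # Boltzmann's S = k ln W is the same quantity in different units:
        k = 1.380649e-23
        print(bits_low, bits_high, k * math.log(W_low), k * math.log(W_high))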

  32. Sal,

    If other people want to redefine what Lambert means by “dispersal”, that’s on them, but it doesn’t represent what Lambert meant by dispersal.

    It’s not a redefinition. Lambert explicitly includes dispersal in physical space:

    How much clearer it is to say simply that if molecules can move in a larger volume, this allows them to disperse their original energy more widely in that larger volume and thus their entropy increases.

  33. I don’t think the spinning discs strictly refute dispersal, though. Before contact, you have two isolated systems. On contact, you have a single system of greater volume. The energy in A ‘flows’ into B, and vice versa. Net energy doesn’t change, but it doesn’t stay where it was either. One is anticipating contact by considering the whole system from the start, but that doesn’t seem right.

    It seems equivalent to a mixing scenario. Even with two identical gases, when a barrier exists they are two closed systems. Remove the barrier and the energy in A ‘spreads out’ geometrically, precisely compensated (when the gases are the same) by the parallel spread from B into A.

    I’m not hardline on this, but the geometric component – moving energy from a localised setting into a greater volume – seems a fundamental feature in rendering it unavailable for work.

  34. Allan,

    I don’t think the spinning discs strictly refutes dispersal though. Before contact, you have two isolated systems. On contact, you have a single system of greater volume.

    How many systems you have is a function of where you draw the boundar(ies), and where you draw the boundaries is arbitrary. The second law works no matter where the boundaries are placed, and it’s certainly possible to draw a boundary around both spinning hoops so that they form a single system.

    It seems equivalent to a mixing scenario. Even with two identical gases, when a barrier exists they are two closed systems. Remove the barrier and the energy in A ‘spreads out’ geometrically, precisely compensated (when the gases are the same) by the parallel spread from B into A.

    But the entropy doesn’t increase when the gases are identical. It only increases when they are distinguishable.

    That’s further evidence that entropy is not a measure of energy dispersal, since energy disperses in both cases but entropy increases only in one.

  35. keiths:
    Sal,

    It’s not a redefinition. Lambert explicitly includes dispersal in physical space:

    Increasing physical space increases the number of microstates, but so does increasing temperature! You’re inappropriately extrapolating one special case as if it were the totality of what Lambert is describing.

    The increase in the number of microstates can come from an increase in EITHER volume OR temperature (or both), as is brutally obvious in the Sackur-Tetrode approximation for monoatomic ideal gases.
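
    A quick numeric check of that (a sketch assuming one mole of argon; SI constants; Sackur-Tetrode is only valid in the classical, dilute regime):

        import math

        k = 1.380649e-23    # Boltzmann constant, J/K
        h = 6.62607015e-34  # Planck constant, J s

        def sackur_tetrode(N, V, T, m):
            """Entropy (J/K) of N monoatomic ideal-gas atoms of mass m (kg)
            in volume V (m^3) at temperature T (K)."""
            lam = (2 * math.pi * m * k * T / h**2) ** 1.5  # 1/(thermal wavelength)^3
            return N * k * (math.log((V / N) * lam) + 2.5)

        m_ar = 6.63e-26  # mass of one argon atom, kg
        N = 6.022e23     # one mole

        S0 = sackur_tetrode(N, 0.0224, 298.0, m_ar)         # about 154 J/K
        dS_V = sackur_tetrode(N, 0.0448, 298.0, m_ar) - S0  # double V: +N k ln 2, ~ +5.8 J/K
        dS_T = sackur_tetrode(N, 0.0224, 596.0, m_ar) - S0  # double T: +(3/2) N k ln 2, ~ +8.6 J/K
        print(S0, dS_V, dS_T)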

  36. Another potential objection: If thermodynamic entropy is a measure of missing information, then why is it expressed in joules per kelvin?

    I addressed that question in the earlier thread:

    Lambert’s complaint is that the equation for thermodynamic entropy includes Boltzmann’s constant (kb), while the equation for informational entropy does not. He thinks that the informationists are therefore cheating by bringing kb into the equation:

    Arbitrarily replacing k by kB — rather than have it arise from thermodynamic necessity via Boltzmann’s probability of random molecular motion — does violence to thermodynamics. The set of conditions for the use or for the meaning of kB, of R with N, are nowhere present in information theory. Thus, conclusions drawn from the facile substitution of kB for k, without any discussion of the spontaneity of change in a thermodynamic system (compared to the hundreds of information “entropies”) and the source of that spontaneity (the ceaseless motion of atoms and molecules) are doomed to result in confusion and error. There is no justification for this attempt to inject the overt subjectivity of “disorder” from communications into thermodynamic entropy.

    That’s nonsense. The only reason kb even appears in the thermodynamic entropy equation is because of the choice of units. I explained that earlier in the thread:

    The fact that it’s [thermodynamic entropy is] usually expressed in units of joules per kelvin (J/K) is an accident of history, due to the definition of the kelvin as a base unit. Had the kelvin been defined in terms of energy, joules in the denominator would have cancelled out joules in the numerator and the clunky J/K notation would be unnecessary.

    Ben-Naim makes the same point here, in the very excerpt that you quoted, walto:

    The second point is perhaps on a deeper level. The units of entropy (J/K) are not only unnecessary for entropy, but they should not be used to express entropy at all. The involvement of energy and temperature in the original definition of entropy is a historical accident, a relic of the pre-atomistic era of thermodynamics.

    Recall that temperature was defined earlier than entropy and earlier than the kinetic theory of heat. Kelvin introduced the absolute scale of temperature in 1854. Maxwell published his paper on the molecular distribution of velocities in 1859. This has led to the identification of temperature with the mean kinetic energy of atoms or molecules in the gas. Once the identification of temperature as a measure of the average kinetic energy of the atoms had been confirmed and accepted, there was no reason to keep the old units of K. One should redefine a new absolute temperature, denoting it tentatively as T̄, defined by T̄ = kT. The new temperature T̄ would have the units of energy and there should be no need for the Boltzmann constant k. The equation for the entropy would simply be S = ln W, and entropy would be rendered dimensionless!

    Had the kinetic theory of gases preceded Carnot, Clausius and Kelvin, the change in entropy would still have been defined as energy divided by temperature. But then this ratio would have been dimensionless. This will not only simplify Boltzmann’s formula for entropy, but will also facilitate the identification of the thermodynamic entropy with Shannon’s information.

    Conclusion: Lambert is wrong. The use of kb is just an accident of history, caused by the fact that the kelvin is (unnecessarily) a base unit in the SI system.

    Physics obviously matters to thermodynamic entropy, but it does not get into the equations via kb. It gets in via the W in Boltzmann’s equation and via the probabilities in the Gibbs equation.
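
    Ben-Naim’s point in miniature (a sketch; the heat and temperature values are arbitrary):

        k = 1.380649e-23  # Boltzmann constant, J/K

        # Measure temperature in energy units (T_bar = k*T, in joules) and
        # the Clausius entropy change dS = q/T comes out dimensionless.
        q = 1000.0        # heat transferred, J (assumed)
        T = 300.0         # kelvin
        T_bar = k * T     # the same temperature, now in joules

        dS_conventional = q / T        # J/K, kelvin in the denominator
        dS_dimensionless = q / T_bar   # a pure number; the joules cancel

        print(dS_conventional, dS_dimensionless * k)  # identical values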

  37. Sal,

    You’re inappropriately extrapolating one special case as if that’s the totality of what Lambert is describing.

    No. I wrote:

    It’s not a redefinition. Lambert explicitly includes dispersal in physical space:

    How much clearer it is to say simply that if molecules can move in a larger volume, this allows them to disperse their original energy more widely in that larger volume and thus their entropy increases.

    Note that I said “includes”, not “includes only”.

    Either way it doesn’t help you, for reasons I’ve already given. The energy is not spread among the different microstates; it’s concentrated in a single microstate at any given instant. That’s not dispersal.

    And energy dispersal, whether in physical space or phase space, has the wrong units for entropy. “Joules per cubic meter” and “joules per microstate” are not the same as “joules per kelvin”.

    You can’t just ignore the units, Sal. That sort of sloppiness might fly in creationism, but not in science. Units matter.

    ETA: Corrected “joules per square meter” to “joules per cubic meter”.

  38. I disagree, but thanks anyway for the conversation.

    And FWIW, thanks for this thread. I’m sorry to disagree with my friend Granville, but, well, using the 2nd law to defend ID is not wise.

  39. Sal,

    I disagree, but thanks anyway for the conversation.

    You don’t think that units matter when discussing entropy?

  40. keiths:
    Sal,

    You don’t think that units matter when discussing entropy?

    Where’s Alan Fox when you need him? At least when he was here, I didn’t have to tangle with you as much as I do now that he’s faded away like General MacArthur.

    A good soldier never dies, he just fades away. — Douglas MacArthur

  41. My question is a reasonable one. If Lambert’s definitions lead to the wrong units for entropy, then his definitions can’t be correct.

  42. Sal,

    We had a conversation on dimensionless UNITS here:

    You argued, bizarrely, that “joules per kelvin” is dimensionless. Which wouldn’t help your case even if it were true, since Lambert’s definitions don’t lead to dimensionless quantities.

    Lambert’s definitions don’t work, Sal. Units matter.
