He’s baaack

Granville Sewell has posted another Second Law screed over at ENV. The guy just won’t quit. He scratches and scratches, but the itch never goes away.

The article is mostly a rehash of Sewell’s confused fulminations against the compensation argument, with an added dash of tornados running backwards.

Then there’s this gem:

But while Behe and his critics are engaged in a lively debate as to whether or not the Darwinian scheme for violating the second law has ever been observed to result in any non-trivial increase in genetic information…

Ah, yes. Our scheme for violating the second law. It might have succeeded if it weren’t for that meddling Sewell.

153 thoughts on “He’s baaack”

  1. “The article is mostly a rehash of Sewell’s confused fulminations against the compensation argument”

    What’s the compensation argument exactly?

  2. J-Mac,

    What’s the compensation argument exactly?

    It’s the argument that any decrease in the entropy of a system is necessarily accompanied by a compensating entropy increase in the surroundings, so that the second law of thermodynamics is not violated.

    Sewell wants to argue that evolution violates the second law, so he hates the compensation argument and tries (endlessly) to show that it’s wrong.

  3. It’s one of many. The guy can’t stop repeating himself, no matter how often he gets refuted.

  4. Here’s Sewell’s strawman version of the compensation argument:

    The fact that entropy can decrease in an open system does not mean that computers can appear on a barren planet as long as the planet receives solar energy.

    As if any of his critics were actually making that claim.

  5. Sewell attacks the compensation argument by implying that it involves flows of energy distant from the decrease of entropy. In the case of biological evolution, that is not true: proportions of genotypes change in a population, as a result of births and deaths in that population, and the compensating flow of energy is through those individuals. The Second Law is not violated by an increase of more fit individuals in a population.

    He also invokes his impressive-looking equations for movement of substances, implying that they endlessly disperse. However, his equations are simple diffusion equations with no terms allowing different substances to interact. He leaves out chemistry and all forms of physical interaction, as a close study of the terms in his equations will disclose. In a glass of salty water, the water will ultimately evaporate and the salt will crystallize at the bottom of the glass. His equations do not predict that.
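
    For anyone curious, here is a minimal numerical sketch of the kind of pure-diffusion model Joe is describing (an illustration of his point, not Sewell's actual equations; the grid size, diffusion coefficients, and initial spikes are arbitrary choices of mine). Two substances each obey a discretized dC/dt = D d²C/dx², and because nothing in the update couples them, all they can ever do is spread out.

    ```python
    # Minimal sketch of uncoupled diffusion: two substances, each obeying
    # dC/dt = D * d2C/dx2 on a 1-D periodic grid.  There is no term coupling
    # A and B, so no amount of run time produces anything but two smeared-out
    # blobs.  (Grid size, D values, and initial conditions are arbitrary.)
    import numpy as np

    nx, dx, dt, steps = 100, 1.0, 0.1, 5000
    D_A, D_B = 1.0, 0.5                      # diffusion coefficients
    A = np.zeros(nx); A[30] = 1.0            # substance A starts as a spike
    B = np.zeros(nx); B[70] = 1.0            # substance B starts as a spike

    for _ in range(steps):
        lapA = np.roll(A, 1) - 2 * A + np.roll(A, -1)
        lapB = np.roll(B, 1) - 2 * B + np.roll(B, -1)
        A += D_A * dt / dx**2 * lapA         # A's update never sees B
        B += D_B * dt / dx**2 * lapB         # B's update never sees A

    print("peak of A after diffusion:", round(A.max(), 4))   # far below the initial 1.0
    print("peak of B after diffusion:", round(B.max(), 4))   # far below the initial 1.0
    ```

    Adding chemistry would mean adding reaction terms that couple A and B; those terms are exactly what is absent here, and (per Joe) in Sewell's equations.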

  6. The problem starts the very moment in which entropy is equated with disorder. That’s not entropy.

    Then, the compensation “argument” is not an argument, it’s a fact of physics. “Local” reductions in entropy are accompanied by much higher increases in entropy in the “surroundings.” Not only that, it’s that increase in entropy that drives the “local” reduction in the first place. We decrease our “local” entropy by transforming highly available energy sources into much less available energy forms (like heat).

    Even the creationists’ preferred “solution” depends on the very things they try to deny. On the very things they imagine that intelligence can oppose (it cannot). No brain power, no thinking, no planning, no movement, no building of computers or cars or jumbo jets, can happen without those energy transformations, without that “compensation.” If energy didn’t have that tendency to flow from high-to-low availability, then work would not be possible. Entropy is not the problem; entropy is the solution. Entropy drives the whole endeavour.

  7. He’s baaack

    For a fleeting moment, I thought you were talking about Mr. Fishing Reels and word salads himself, KairosUnFocused.

  8. keiths: It’s the argument that any decrease in the entropy of a system is necessarily accompanied by a compensating entropy increase in the surroundings, so that the second law of thermodynamics is not violated.

    How exactly is the environment compensating for the loss of entropy?

  9. Entropy: The problem starts the very moment in which entropy is equated with disorder. That’s not entropy.

    I tried to explain this to him a long time ago. To be fair, many on our side don’t realize that the compensation argument is irrelevant. The fact is that complexity within a system can increase even as the entropy increases, and even if it is a closed system. If you don’t see that, just consider that while the earth is an open system, the solar system is essentially a closed system, and life arose within it.

  10. J-Mac,

    How exactly is the environment compensating for the loss of entropy?

    The system “exports” entropy to its environment. If the amount exported is large enough, the entropy of the system will decrease.

    The exact way in which this happens depends on the nature of the system and its surroundings.
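
    A concrete toy illustration of what that “exporting” can look like, using round textbook numbers for water (my example, not keiths’): when a kilogram of water freezes, its entropy drops, but the latent heat it dumps into colder surroundings raises their entropy by more, so the total still goes up.

    ```python
    # Rough numbers for 1 kg of water freezing at 0 C into surroundings at -10 C.
    # (Latent heat of fusion ~334 kJ/kg; temperatures in kelvin.)
    Q = 334_000.0          # J released by the freezing water
    T_system = 273.15      # K, the water/ice at its freezing point
    T_surr = 263.15        # K, the colder surroundings receiving the heat

    dS_system = -Q / T_system          # entropy of the water decreases
    dS_surroundings = +Q / T_surr      # entropy of the surroundings increases
    dS_total = dS_system + dS_surroundings

    print(f"dS_system       = {dS_system:8.1f} J/K")        # about -1222.8 J/K
    print(f"dS_surroundings = {dS_surroundings:8.1f} J/K")  # about +1269.3 J/K
    print(f"dS_total        = {dS_total:8.1f} J/K  (>= 0, second law satisfied)")
    ```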

  11. keiths: Here’s Sewell’s strawman version of the compensation argument:

    The fact that entropy can decrease in an open system does not mean that computers can appear on a barren planet as long as the planet receives solar energy.

    As if any of his critics were actually making that claim.

    Technically, he’s just flat-out wrong. AFAIK, statistical mechanics says computers (or any other imaginable material objects) can, in point of fact, appear on barren planets. In fact, a similar question is considered a fundamental problem in physics: the spontaneous appearance of Boltzmann brains, even from a state of equilibrium.

  12. A comment I made at UD in a discussion of a paper of Sewell’s entitled Entropy, Evolution and Open Systems:

    Timaeus,

    Scientific papers are judged by their contents. The contents of Granville’s paper are awful. Based on those contents, and using Granville’s own words, I have shown that Granville:

    1. Mistakenly asserts that “the increase in order which has occurred on Earth seems to violate the underlying principle behind the second law of thermodynamics, in a spectacular way.”

    2. Titles his paper Entropy, Evolution and Open Systems without realizing that the second law is actually irrelevant to his improbability argument, since it is not violated by evolution.

    3. Misunderstands the compensation argument and incorrectly rejects it.

    4. Fails to understand that the compensation argument is a direct consequence of the second law, and that by rejecting it he is rejecting the second law itself!

    5. Fails to realize that if the compensation argument were invalid, as he claims, then plants would violate the second law whenever their entropy decreased.

    6. Asserts, with no evidence, that physics alone cannot explain the appearance of complex artifacts on Earth.

    7. Offers, as evidence for the above, a thought experiment involving a simulation he can neither run nor analyze.

    8. Declares, despite being unable to run or analyze the simulation, that he is “certain” of the outcome, and that it supports his thesis.

    9. Confuses negentropy with complexity, as Lizzie explained.

    10. Conflates entropy with disorder, as Lizzie explained.

    Granville was unable to defend his paper, so he bailed out of the thread.

  13. Entropy: Even the creationists’ preferred “solution” depends on the very things they try to deny. On the very things they imagine that intelligence can oppose (it cannot). No brain power, no thinking, no planning, no movement, no building of computers or cars or jumbo jets, can happen without those energy transformations, without that “compensation.” If energy didn’t have that tendency to flow from high-to-low availability, then work would not be possible. Entropy is not the problem; entropy is the solution. Entropy drives the whole endeavour.

    This. If only people could understand this. Designers don’t violate the 2nd law; they are manifestations of it. To arrange those objects and parts into some complex structure, even to think up the plan, the design of it, takes the conversion of available energy into less usable forms, and the total entropy is increased. The brain literally runs hot (and so do your muscles), and you have to eat to keep it all running. It’s a big flesh-computer, and if you pull the plug, its temperature equilibrates with the surroundings and your thinking (and any other kind of behavior) stops.

  14. J-Mac:
    stcordova,

    Sal,
    You disagree with Sewell, right?

    Yes.

    I ask a simple question: what has more entropy, a frozen dead rat or a living human? Answer: a living human. A 2nd-year chemistry, physics, or mechanical engineering student should be able to deduce that by looking at standard molar entropy tables.
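
    A crude back-of-envelope version of Sal’s comparison, treating both bodies as lumps of pure water and using rough molar entropies (about 41 J/(mol·K) for ice near 0 °C and about 70 J/(mol·K) for liquid water near body temperature; the body masses are guesses of mine). The only point is that the much larger, warmer body carries far more thermodynamic entropy.

    ```python
    # Back-of-envelope: treat a frozen rat and a living human as lumps of H2O
    # and compare total entropy using rough molar entropies.
    # Assumed values (approximate): S_ice ~ 41 J/(mol K) near 0 C,
    # S_liquid_water ~ 70 J/(mol K) near body temperature; masses are guesses.
    M_H2O = 18.0                      # g/mol

    def entropy_estimate(mass_g, molar_entropy):
        """Total entropy in J/K, treating the body as pure water."""
        return mass_g / M_H2O * molar_entropy

    S_frozen_rat   = entropy_estimate(  300.0, 41.0)   # ~0.3 kg of "ice"
    S_living_human = entropy_estimate(70000.0, 70.0)   # ~70 kg of warm "water"

    print(f"frozen rat   : ~{S_frozen_rat:8.0f} J/K")    # on the order of 7e2 J/K
    print(f"living human : ~{S_living_human:8.0f} J/K")  # on the order of 3e5 J/K
    ```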

  15. keiths:
    J-Mac,

    The system “exports” entropy to its environment. If the amount exported is large enough, the entropy of the system will decrease.

    The exact way in which this happens depends on the nature of the system and its surroundings.

    So how exactly would the system compensate for the loss (while up to small error) of information in this classical translation of the information leading to protein folds?

    DNA > mRNA > amino acid chain

  16. stcordova,

    As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

    My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

    What happened to the information?

  17. J-Mac: So how exactly would the system compensate for the loss (while up to small error) of information in this classical translation of the information leading to protein folds?

    DNA > mRNA > amino acid chain

    You seem to be confusing sequence information with thermodynamic entropy. That’s about the most sense I can make of what you’re asking.

  18. Rumraket: You seem to be confusing sequence information with thermodynamic entropy. That’s about the most sense I can make of what you’re asking.

    Are they separate?

  19. J-Mac:

    stcordova,

    As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

    My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

    What happened to the information?

    Pay no attention to Sal, J-Mac. While he’s right about entropy not being a measure of disorder, he’s wrong about it not being a measure of information.

    It is a measure of information; specifically, the gap between the information specifying the macrostate and the information needed to specify the exact microstate.

    I explain this in that very long comment thread.
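
    A toy illustration of that gap, with coins standing in for molecules (my example, and plain information-theoretic bits rather than thermodynamic entropy): take the macrostate to be “250 heads out of 500 flips” and the microstate to be the exact sequence. The entropy of the macrostate is the information still missing about which exact sequence you have.

    ```python
    # Toy example of "entropy = missing information":
    # macrostate = "250 heads out of 500 flips", microstate = the exact sequence.
    # The entropy of the macrostate is log2 of the number of compatible microstates.
    from math import comb, log2

    n_flips, n_heads = 500, 250
    microstates = comb(n_flips, n_heads)   # sequences consistent with the macrostate

    missing_bits = log2(microstates)       # info still needed to pin down the exact sequence
    full_bits = n_flips                    # info needed with no macrostate knowledge at all

    print(f"compatible microstates : {microstates:.3e}")
    print(f"missing information    : {missing_bits:.1f} bits")   # about 495 bits
    print(f"with no knowledge      : {full_bits} bits")
    ```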

  20. J-Mac: Are they separate?

    I don’t know what you mean by separate. When DNA is transcribed to RNA, and then translated into an amino acid sequence, those are chemical reactions, which means chemical bonds are broken and others are established, which means work is being done. Work takes energy. Chemical bonds being broken and formed results in energy radiating out into the surroundings, usually in large part as heat.

  21. J-Mac: So how exactly would the system compensate for the loss (while up to small error) of information in this classical translation of the information leading to protein folds?

    DNA > mRNA > amino acid chain

    Not really sure what “information loss” you are talking about here, but the answer in any case would be that the system ‘compensates’ by making the surrounding environment slightly warmer.
    How much warmer?
    Well, if your system lost the total amount of information in the human genome every second, it would need to export (at STP) enough energy to melt 1.1 micrograms of ice per year.
    IOW, a minuscule heat flow is sufficient to account for whatever decreases in informational entropy you might care to imagine.
    Goading kairosfocus with this awkward fact used to be a pastime of mine.

    [Another way to think about it: in terms of entropy, DNA replication produces a lot more thermal entropy than ‘information’. If it didn’t, it wouldn’t happen.]
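
    For the curious, here is a rough check of that figure under assumptions of mine (not necessarily DNA_Jock’s exact ones): about 2 bits per base, a 3.2-gigabase genome, one genome’s worth of information lost per second, conversion at k·ln 2 joules per kelvin per bit, ice at 0 °C, and a latent heat of fusion of 334 J/g. It lands at roughly a microgram or two of ice per year, the same ballpark as the 1.1 µg quoted.

    ```python
    # Rough check of the "melt ~1 microgram of ice per year" figure.
    # Assumptions: ~2 bits/base, 3.2e9 bases, one genome of information per second,
    # k*ln(2) J/K per bit, ice at 273.15 K, latent heat of fusion 334 J/g.
    from math import log

    k_B = 1.380649e-23                     # J/K
    bits_per_second = 2 * 3.2e9            # "losing a human genome every second"
    seconds_per_year = 3.156e7

    bits_per_year = bits_per_second * seconds_per_year
    dS = bits_per_year * k_B * log(2)      # thermodynamic entropy equivalent, J/K

    T_ice = 273.15                         # K
    L_fusion = 334.0                       # J/g
    heat_to_export = T_ice * dS            # heat that carries away this much entropy
    ice_melted_g = heat_to_export / L_fusion

    print(f"entropy per year : {dS:.2e} J/K")
    print(f"heat to export   : {heat_to_export:.2e} J")
    print(f"ice melted       : {ice_melted_g * 1e6:.2f} micrograms")   # ~1-2 micrograms
    ```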

  22. Seems like the relevant discipline for discussing ID would be chemistry.

    All the pseudo mathematical arguments need to be discussed as chemistry.

    For example, how much more energy is required to produce a beneficial mutation, as opposed to a detrimental one.

  23. Just out of curiosity (and because I didn’t come in at the beginning), is Sewell attempting to support a position that evolution cannot happen without supernatural input? That is, is Sewell making a physical argument, or a religious argument?

  24. stcordova: For a fleeting moment, I thought you were talking about Mr. Fishing Reels and word salads himself, KairosUnFocused.

    No. He’s back over at UD trying to argue that the designer cannot be a material being.

  25. Flint,

    Just out of curiosity (and because I didn’t come in at the beginning), is Sewell attempting to support a position that evolution cannot happen without supernatural input? That is, is Sewell making a physical argument, or a religious argument?

    He doesn’t explicitly mention the supernatural in this particular article, but he claims that what has happened on earth requires violations of “the generalized second law”, and that such violations can only happen via intelligence, which he seems to think is non-physical:

    Well, now I have to admit that I also have a scheme that I believe can defeat the generalized second law. My scheme is called “intelligence.” But while Behe and his critics are engaged in a lively debate as to whether or not the Darwinian scheme for violating the second law has ever been observed to result in any non-trivial increase in genetic information, we can watch my scheme create spectacular amounts of order and information every day, in every writer’s office, in every inventor’s lab, and in every R&D division of every engineering firm throughout our civilization. You can even try it yourself, at home.

    ETA: It reminds me of DaveScot’s claim that he violated the second law every time he typed.

  26. Flint:
    Just out of curiosity (and because I didn’t come in at the beginning), is Sewell attempting to support a position that evolution cannot happen without supernatural input? That is, is Sewell making a physical argument, or a religious argument?

    All ID proponents are supporting a religious argument.

  27. In earlier writings (including this paper) Sewell claims that the four fundamental forces are insufficient to explain the eventual appearance of complex artifacts on earth:

    I imagine visiting the Earth when it was young and returning now to find highways with automobiles on them, airports with jet airplanes, and tall buildings full of complicated equipment, such as televisions, telephones and computers. Then I imagine the construction of a gigantic computer model which starts with the initial conditions on Earth 4 billion years ago and tries to simulate the effects that the four known forces of physics would have on every atom and every subatomic particle on our planet. If we ran such a simulation out to the present day, would it predict that the basic forces of Nature would reorganize the basic particles of Nature into libraries full of encyclopedias, science texts and novels, nuclear power plants, aircraft carriers with supersonic jets parked on deck, and computers connected to laser printers, CRTs and keyboards? If we graphically displayed the positions of the atoms at the end of the simulation, would we find that cars and trucks had formed, or that supercomputers had arisen? Certainly we would not, and I do not believe that adding sunlight to the model would help much.

    “Certainly we would not”, he says, without providing a lick of evidence.

  28. Acartia: All ID proponents are supporting a religious argument.

    OK. I guess I don’t understand why it wouldn’t just be easier to say that reality is how his god makes his will actual, rather than trying to deny reality to find his god. Seems like Sewell is going to a lot of unnecessary trouble to avoid saying evolution happens because his god WANTS it to happen, and Designed the universe accordingly.

  29. keiths:
    (quoting Sewell)
    If we graphically displayed the positions of the atoms at the end of the simulation, would we find that cars and trucks had formed, or that supercomputers had arisen? Certainly we would not, and I do not believe that adding sunlight to the model would help much.

    Sewell just goes from particles to supercomputers. Here’s what he fails to mention:

    1. Particles aggregating into atoms.
    2. Atoms bonding together to form chemicals.
    3. Chemicals aggregating into minerals.
    4. Those gravitationally attracting into dust clouds.
    5. A sun forming.
    6. Planets forming.
    7. Nuclear fusion igniting in the sun.
    8. Lots of geology.

    Now even Sewell’s admirers should admit that those can happen by known physics, chemistry, and astronomy. In spite of Sewell trying as hard as he can to make such things sound impossible.

    After that, the issues are the origin of life and evolution, with humans originating and developing civilization. We’re all used to arguing about that. After we have civilization, the supercomputers are a done deal.

    Sewell makes it sound as if he has some new argument involving the impossibility of particles accidentally aggregating into supercomputers. But he doesn’t. It’s the same old stuff.

  30. J-mac:

    Are they separate?

    Yes for the most part.

    There may be some adjustment made for the Landauer Principle, but the number of thermodynamic information bits in DNA is astronomically bigger than the information bits in the sequence.

    500 fair coins can be viewed as having 500 bits of information in the heads/tails configuration, but they have an astronomically large number of thermodynamic bits if we just take standard molar entropy numbers — like 10^25 bits!!!!

    See the calculation here (I made a small math mistake which I didn’t correct, but the numbers are in the ballpark):

    2LOT and ID entropy calculations (editorial corrections welcome)

    Dr. Sewell is conflating textbook thermodynamics with configurations of matter that don’t really contribute to thermodynamic entropy (unless one ties in the TRIVIAL contribution using Landauer’s Principle, and even then the result is dubious).
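
    A quick sanity check on Sal’s orders of magnitude, under assumptions of mine (500 copper coins of about 2.5 g each, and copper’s standard molar entropy of about 33.2 J/(mol·K)): the heads/tails configuration carries 500 bits, while the copper itself carries something like 10^25 to 10^26 “thermodynamic bits” once you convert J/K into bits by dividing by k·ln 2.

    ```python
    # Configurational bits vs. "thermodynamic bits" for 500 coins.
    # Assumptions (mine): copper coins of ~2.5 g each; standard molar entropy
    # of copper ~33.2 J/(mol K); bits = (J/K) / (k_B * ln 2).
    from math import log

    k_B = 1.380649e-23        # J/K
    n_coins = 500
    coin_mass_g = 2.5         # assumed
    M_Cu = 63.55              # g/mol
    S_molar_Cu = 33.2         # J/(mol K), standard molar entropy of copper

    config_bits = n_coins * 1.0                   # one heads/tails bit per coin
    moles = n_coins * coin_mass_g / M_Cu
    S_thermo = moles * S_molar_Cu                 # J/K
    thermo_bits = S_thermo / (k_B * log(2))

    print(f"configurational bits : {config_bits:.0f}")
    print(f"thermodynamic entropy: {S_thermo:.0f} J/K")
    print(f"thermodynamic bits   : {thermo_bits:.1e}")   # on the order of 1e25-1e26
    ```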

  31. Flint,

    OK. I guess I don’t understand why it wouldn’t just be easier to say that reality is how his [Sewell’s] god makes his will actual, rather than trying to deny reality to find his god.

    I think it’s because a non-interventionist God who winds up the world, sits back, and watches things unfold doesn’t jibe well with Christian theology. The Christian God is a god who not only continually fiddles with the world, he enters it himself.

  32. keiths: The Christian God is a god who not only continually fiddles with the world, he enters it himself.

    This is why I find it so amusing when colewd refuses to speculate on if the “designer” acts in “real time” or just once at the beginning.

  33. keiths:
    Flint,

    I think it’s because a non-interventionist God who winds up the world, sits back, and watches things unfold doesn’t jibe well with Christian theology.The Christian God is a god who not only continually fiddles with the world, he enters it himself.

    Ah, thanks. I think I see it now — such a non-interventionist god would be indistinguishable from no god at all, which would imply a little too much reality for perhaps any theistic faith.

  34. stcordova: Yes for the most part.

    I asked this question deliberately ☺

    stcordova: There may be some adjustment made for the Landauer Principle, but the number of thermodynamic information bits in DNA is astronomically bigger than the information bits in the sequence.


    I didn’t want to quibble about that. It doesn’t really matter in regard to a different point I’m trying to make.

  35. stcordova: Dr. Sewell is conflating textbook thermodynamics with configurations of matter that don’t really contribute to thermodynamic entropy

    Can you expand on that, please?
    What do you really mean by the configuration of matter?

  36. J-Mac,

    Knowing your attention span, I went looking for a short video explaining why entropy is a measure of missing information, not disorder. The following video explains it well:

    Entropy is NOT About Disorder

    CAUTION: There is one major mistake in the video, which is that he multiplies by Boltzmann’s constant when computing the dice entropy. Dice entropy is not thermodynamic entropy, so Boltzmann’s constant should not be used.
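
    A tiny illustration of why the constant doesn’t belong there (my example): the “entropy” of N fair dice is just the missing information about which faces are showing, a dimensionless count of bits. Multiplying by Boltzmann’s constant would only make sense if the microstates were thermodynamic ones measured in J/K.

    ```python
    # Dice "entropy" is just missing information about which faces are showing:
    # S = log2(6**N) bits for N fair dice.  No Boltzmann constant involved,
    # because nothing here is a thermodynamic microstate measured in J/K.
    from math import log2

    N = 10
    microstates = 6 ** N
    entropy_bits = N * log2(6)           # same as log2(microstates)

    print(f"{N} dice -> {microstates} equally likely configurations")
    print(f"missing information = {entropy_bits:.2f} bits")   # about 25.85 bits
    ```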

  37. J-mac,

    Entropy can be defined equivalently in a number of ways, and two of the ways have NO relation to information, and the 3rd way does, but is almost never used in a lab setting.

    Clausius (indirect) definition:

    delta-S = delta-Entropy = Integral( dQ_rev / T )

    Boltzmann-Planck definition:

    Entropy = S = k_B log W

    where W is the number of microstates, which is agony to explain!

    Information form (GAG!):

    S = uncertainty, ignorance… hard to explain

    Most physics, engineering, and chemistry students use the Clausius definition, since it’s measurable in a lab setting.

    Keiths likes “Entropy is a measure of ignorance,” but ignorance is a lot harder to measure than temperature. 🙂

    But in case you’re curious, I have an Excel spreadsheet which you can modify to see how entropy changes based on the temperature and volume of a monatomic gas trapped in a container. It’s not very exciting; it’s for physics nerds. It leverages the Sackur-Tetrode equation.

    This link shows two ways to calculate such Sackur-Tetrode entropy: the old-fashioned way, and the Keiths way.

    https://en.wikipedia.org/wiki/Sackur%E2%80%93Tetrode_equation

    Dr. Mike hated both!
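
    Since the spreadsheet isn’t shown, here is a small Python sketch of the same calculation (my implementation, not Sal’s spreadsheet), using the Sackur-Tetrode formula in its thermal-de-Broglie-wavelength form. For one mole of helium at 298 K and 1 atm it comes out close to the tabulated standard molar entropy of about 126 J/(mol·K).

    ```python
    # Sackur-Tetrode entropy of a monatomic ideal gas:
    #   S = N k_B [ ln( V / (N * lambda^3) ) + 5/2 ],  lambda = h / sqrt(2 pi m k_B T)
    # Sketch only -- not Sal's spreadsheet.  Constants in SI units.
    from math import pi, log, sqrt

    k_B = 1.380649e-23       # J/K
    h = 6.62607015e-34       # J s
    N_A = 6.02214076e23      # 1/mol

    def sackur_tetrode(N, V, T, m):
        """Entropy (J/K) of N atoms of mass m (kg) in volume V (m^3) at temperature T (K)."""
        lam = h / sqrt(2 * pi * m * k_B * T)       # thermal de Broglie wavelength
        return N * k_B * (log(V / (N * lam**3)) + 2.5)

    # One mole of helium at 298.15 K and 1 atm (V ~ 24.5 liters).
    m_He = 4.0026e-3 / N_A                         # kg per atom
    V = 0.02445                                    # m^3
    S = sackur_tetrode(N_A, V, 298.15, m_He)

    print(f"S = {S:.1f} J/(mol K)")   # ~126 J/(mol K), close to the tabulated 126.15
    ```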

  38. J-Mac: Do you view entropy = information, or not?

    You wrote: “Shannon entropy (or measure of information).”


    Depends on your definition of information. As far as Designed Information, or information that makes machines, absolutely “NO”!

  39. stcordova: Depends on your definition of information. As far as Designed Information, or information that makes machines, absolutely “NO”!

    Note the creation of a new idiosyncratic definition of “information”.
    Note the circularity. “Designed” information is information that was designed.

  40. stcordova: Depends on your definition of information. As far as Designed Information, or information that makes machines, absolutely “NO”!

    You must be familiar with the law of conservation of quantum information?!

  41. stcordova: Entropy can be defined equivalently in a number of ways, and two of the ways have NO relation to information, and the 3rd way does, but is almost never used in a lab setting.

    I’m talking life systems and DNA…

  42. J-Mac: I’m talking life systems and DNA…

    Classical Thermodynamics doesn’t apply; that’s why a living human has more thermodynamic entropy than a dead rat, even though a living human being is a functioning, information-filled creature compared to a dead rat.

  43. stcordova: Classical Thermodynamics doesn’t apply; that’s why a living human has more thermodynamic entropy than a dead rat, even though a living human being is a functioning, information-filled creature compared to a dead rat.

    Classical thermodynamics of course applies to living human beings. The fact that it does not say anything about some other issue is irrelevant to that.

  44. Sal,

    The missing information interpretation works for all kinds of entropy, including thermodynamic entropy. That (among other things) makes it superior to the interpretations you prefer, including the energy dispersal interpretation.

    And when you measure a thermodynamic variable like temperature, you are automatically missing some information about the microstate of the system. Temperature, after all, is a measure of the average kinetic energy of the molecules in the system. You lose any information about the kinetic energy of specific individual molecules.
