A Second Look at the Second Law…

…is the title of Granville Sewell’s manuscript that almost got published in Applied Mathematics Letters last year. It was withdrawn at the last minute by the editor, but you can still download the manuscript from Sewell’s web page. The purpose of this thread is to discuss the technical merits of Sewell’s arguments.

In a nutshell, creationists have long argued that biological evolution is associated with increase of information. Because information in Shannon’s sense is, quite literally, the opposite of entropy, the reduction of entropy by evolution appears to contradict the second law of thermodynamics. There is a loophole in this argument: entropy can decrease if a system exchanges heat with the outside world and there is a compensating increase in entropy elsewhere. Back-of-the-envelope estimates show that the increase of information associated with biological evolution is tiny compared to the overall entropy budget of the Earth.
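To get a feel for the numbers, here is a rough back-of-the-envelope sketch in Python (the genome size, solar power, and temperatures below are assumed, order-of-magnitude figures, not measurements):

```python
# Compare the "informational entropy" of a genome with the entropy the
# Earth exports every second by absorbing sunlight and re-radiating it
# at a much lower temperature. All inputs are rough assumptions.
import math

k_B = 1.38e-23                        # Boltzmann constant, J/K

# Information side: ~3e9 base pairs at 2 bits each
genome_bits = 3e9 * 2
S_info = genome_bits * k_B * math.log(2)           # ~6e-14 J/K

# Thermal side: solar power absorbed by the Earth, re-emitted to space
P_absorbed = 1.2e17                   # W (order of magnitude)
T_sun, T_earth = 5800.0, 255.0        # K, effective emission temperatures
dS_dt = P_absorbed / T_earth - P_absorbed / T_sun  # ~4.5e14 J/K per second

print(S_info, dS_dt, dS_dt / S_info)  # ratio ~1e28, and that is per second
```

Even on these crude assumptions, the entropy exported by the Earth in a single second exceeds the Shannon information of a genome, expressed in the same units, by roughly 28 orders of magnitude.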

Sewell’s comeback boils down to two points:

  1. The compensation argument makes no sense. An extremely improbable event is not made less improbable because of “compensating” events occurring elsewhere.
  2. Informational entropy and thermal entropy are unrelated quantities, so adding them makes no sense. The 2nd law must be applied to each of them separately.

Thus, Sewell concludes, biological evolution contradicts the 2nd law of thermodynamics.

Sewell has made these points, with supporting examples, both in this manuscript and elsewhere. This thread is intended for a discussion of the technical merits of Sewell’s arguments. Philosophical and metaphysical meanderings will be swiftly removed to the sandbox.

49 thoughts on “A Second Look at the Second Law…”

  1. How do you calculate the probability of an evolutionary event? Lenski asserted that in his lab-bound biosphere, every possible point mutation occurred, making the probability of finding any particular mutation one.

  2. Speaking of words that have different definitions, especially in different contexts, entropy is one of them. Some people have said that entropy has nothing to do with order/disorder or information, but Wikipedia and other sources say otherwise. I’m no expert on entropy or thermodynamics and can only roughly follow the debates about them, but I think I understand enough to see that Sewell and other IDists are wrong. I can also see why there is a lot of confusion and disagreement about what entropy actually is and how it applies to life or evolution, and I think both ‘sides’ need to be clearer about what the hell they’re talking about.

    No wonder there are so many arguments about it.

  3. Informational entropy and thermal entropy are unrelated quantities, so adding them makes no sense. The 2nd law must be applied to each of them separately.

    A couple of questions for Sewell:

    If informational entropy and thermal entropy are unrelated quantities then what reason do we have for thinking that a law of thermodynamics applies to information (however it is defined) at all?

    If the so-called information content of biological structures is encoded in the physical arrangement of certain types of molecules then how does that violate 2LoT? Is Sewell claiming that energy was not spent forming that arrangement and that the waste heat from that process was not dumped into the environment thereby increasing its entropy?

  4. Creodont: Some people have said that entropy has nothing to do with order/disorder or information, but Wikipedia and other sources say otherwise. I’m no expert on entropy or thermodynamics and can only roughly follow the debates about them, but I think I understand enough to see that Sewell and other IDists are wrong. I can also see why there is a lot of confusion and disagreement about what entropy actually is and how it applies to life or evolution, and I think both ‘sides’ need to be clearer about what the hell they’re talking about.

    I hope this thread will provide a side benefit of clarifying the confusion about entropy. Entropy is a well-defined technical term. It is easy to compute for simple physical models. We will rely on such simple examples here.

  5. Seversky: If informational entropy and thermal entropy are unrelated quantities then what reason do we have for thinking that a law of thermodynamics applies to information (however it is defined) at all?

    I think Sewell means the second law generalized to these other kinds of entropy. We will discuss his “X-entropy” to see whether this idea has any technical merits.

    Seversky: If the so-called information content of biological structures is encoded in the physical arrangement of certain types of molecules then how does that violate 2LoT? Is Sewell claiming that energy was not spent forming that arrangement and that the waste heat from that process was not dumped into the environment thereby increasing its entropy?

    It would be nice if Sewell participated in this discussion. I doubt that he will come because he has not responded to a previous invitation from Liz, but who knows? Other than that, we have Sewell’s AML manuscript as well as his previously published papers (here) and a post at ENV where he presents his case. There is a lot to go on.

  6. We will begin with the first point Sewell makes. At ENV, he wrote:

    Of course the whole idea of compensation, whether by distant or nearby events, makes no sense logically: an extremely improbable event is not rendered less improbable simply by the occurrence of “compensating” events elsewhere. According to this reasoning, the second law does not prevent scrap metal from reorganizing itself into a computer in one room, as long as two computers in the next room are rusting into scrap metal—and the door is open. (Or the thermal entropy in the next room is increasing, though I am not sure how fast it has to increase to compensate computer construction!)

    For the moment, let us set aside the issue of different types of entropy (we will return to it later) and focus on the first sentence of the above quote. Does the idea of compensation make sense in the context of thermal entropy? You bet it does! You can perform an experiment illustrating that without leaving your kitchen.

    Pour a glass of warm water and put an ice cube in it. The water will be cooling down and its entropy will be decreasing. The cooling could not have occurred spontaneously: the decrease in entropy is too large to allow this process to occur on its own. What allows this highly improbable event to occur is a compensation of the type Sewell says never happens. The ice cube is warming up. Although the water, considered in isolation, is entering a vastly less probable state, the entire system (water and ice) is going to a more probable state when the temperatures of water and ice move toward an equilibrium.

    To use some math: as the water cools, it transfers an amount of heat Q to the ice cube. The entropy of the water goes down by Q/T_w. Giving off 1 J of heat at room temperature T_w = 300 K reduces the water entropy by S_w = Q/T_w = 1/300 J/K. That means the number of microstates available to the water goes down by a factor exp(S_w/k_B), where k_B = 1.38 × 10^{-23} J/K is the Boltzmann constant. That’s exp(S_w/k_B) = exp(2.42×10^20). This is an enormously large reduction of the available phase space. There is no way this can happen spontaneously. So what makes it possible?

    It’s the compensation happening in ice. Having received Q = 1 J of heat at a lower temperature T_i = 265 K, the ice cube increases its entropy by S_i = Q/T_i = 1/265 J/K. That means that the number of microstates available to it goes up by exp(S_i/k_B) = exp(2.73×10^20). The increase in the number of microstates of ice is also enormously large. It is vastly larger than the reduction factor for the number of microstates available to water. The combined system, water + ice, gets access to more microstates than before. Their number is up by exp((S_i−S_w)/k_B) = exp(0.32×10^20).
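    For anyone who wants to check the arithmetic, here is a minimal Python sketch using the same round numbers assumed above (Q = 1 J, T_w = 300 K, T_i = 265 K). It prints only the exponents, since the microstate factors themselves are far too large to evaluate directly.

    ```python
    # Entropy bookkeeping for the water + ice example.
    k_B = 1.38e-23                     # Boltzmann constant, J/K
    Q, T_w, T_i = 1.0, 300.0, 265.0    # heat transferred (J), water and ice temperatures (K)

    S_w = Q / T_w                      # entropy lost by the water, J/K
    S_i = Q / T_i                      # entropy gained by the ice, J/K

    print(S_w / k_B)                   # ~2.42e20: exponent of the reduction factor for water
    print(S_i / k_B)                   # ~2.73e20: exponent of the increase factor for ice
    print((S_i - S_w) / k_B)           # ~0.32e20: net exponent for water + ice together
    ```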

    So compensation certainly works as far as thermal entropy is concerned. It would be extremely improbable for a glass of water at room temperature to spontaneously cool down by even a fraction of a degree. The 2nd law prohibits that. However, if the corresponding entropy decrease is compensated by an equal or larger entropy increase in a cube of ice then the process is allowed to occur.

    In this example, we computed the probabilities using thermodynamics. Water and ice are fairly complicated physical systems, so it is not possible to compute the numbers of microstates directly in a statistical-mechanical approach. I might take up another example of atomic or nuclear magnetic moments in a magnetic field, where statistical mechanics is quite simple.

  7. olegt: Pour a glass of warm water and put an ice cube in it.

    An even simpler example, without the phase change: consider a volume of air. In half of the volume the air is cool; in the other half it is warm. Given the vast number of air molecules, having all the warm molecules on one side is extraordinarily unlikely to arise by chance: the distribution of air molecules is highly ordered. Yet this is a common natural situation, arising, for example, from uneven solar heating.

  8. Here is a simple physical model in which we can take a closer look at entropy and even compute it by using combinatorics. In addition to being a great pedagogical aid, this model describes some real physical systems. We will again see that compensation works, providing another counterexample to Sewell’s point.

    Many atomic nuclei have magnetic moments. In an applied magnetic field, a nuclear magnet points either along the field (up = U), in which case its energy is low, or against the field (down = D), a state of high energy. We will denote the energy difference ε. In a system with N nuclear magnets, the state of lowest energy is achieved when all magnets line up with the field.

    Our system will initially be closed in the sense that the magnets can only exchange energy among themselves. If one magnet flips from U to D then another must flop from D to U so that the energy of the system stays the same. As the dipoles flip and flop, the total numbers of U and D dipoles, N_u and N_d, stay unchanged.

    Let’s say we have a smallish system with N = 100 dipoles. The state of lowest energy is unique, with all dipoles in the U state. This phase space contains just 1 state. Next up in energy are states with one misaligned dipole (energy ε). There are 100 possible states in this phase space. Microstates with 2 misaligned dipoles (energy 2ε) are more numerous: there are 100×99/2 = 4950 ways to place the two D dipoles. The largest phase space has 50 U and 50 D dipoles (energy 50ε); there are 100!/(50! × 50!) ≈ 10^29 such states. Working with such large numbers is inconvenient, so we switch to their logarithms. The lowest-energy state has entropy ln 1 = 0, the first excited state has S = ln 100 ≈ 4.6, the state with energy 2ε has S = ln 4950 ≈ 8.5, and the state with energy 50ε has S = ln[100!/(50! × 50!)] ≈ 66.8.
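    These numbers are easy to verify. A minimal Python sketch using exact binomial coefficients:

    ```python
    # S = ln(number of microstates) for N = 100 dipoles with n_d misaligned (D) dipoles.
    from math import comb, log

    N = 100
    for n_d in (0, 1, 2, 50):
        omega = comb(N, n_d)           # number of ways to place the n_d D-dipoles
        print(n_d, omega, log(omega))
    # 0  ->  1 state,          S = 0
    # 1  ->  100 states,       S ≈ 4.6
    # 2  ->  4950 states,      S ≈ 8.5
    # 50 ->  ~1.01e29 states,  S ≈ 66.8
    ```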

    Suppose our system begins in a state of highest entropy, 50 U and 50 D. As we let the magnets flip and flop, what are the chances that they will end up in U states? That’s actually impossible by construction: the energy of our system must stay the same. In this case, energy conservation alone prevents the dipoles from lining up with the field.

    Let’s modify the problem a bit and add a thermal reservoir, which is just another system with a much larger number of dipoles, say 10,000 or even a million. The system and the reservoir are allowed to exchange energy, so in principle it is conceivable that all of the dipoles in the system will end up in the U state. Whether that has a realistic chance of happening depends on the state of the reservoir.

    If the reservoir also has equal numbers of U and D moments then the answer is no. When the system is in thermal contact with this reservoir, all of its 2^100 microstates are equally likely to appear. The chance of landing in a particular state with 100 Us is 1/2^100, or about 10^{-30}. Not gonna happen.

    However, if the reservoir is in a state of lowest energy, all U, then the situation is quite different. When the system and the reservoir equilibrate, the 50 misaligned dipoles will be shared between the system and the reservoir. If the reservoir is much larger than the system, say 100 times or more, then it is likely that all of the misaligned dipoles will end up in the reservoir and the system will have all of its dipoles up.
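    To see how likely, here is a small sketch under the standard microcanonical assumption that, once equilibrated, every placement of the 50 misaligned dipoles among the system-plus-reservoir sites is equally probable:

    ```python
    # Probability that all 50 misaligned dipoles end up in the reservoir,
    # and the average number left behind in the 100-dipole system.
    from math import comb

    N_sys, N_down = 100, 50
    for N_res in (10_000, 1_000_000):
        p_all_up = comb(N_res, N_down) / comb(N_res + N_sys, N_down)
        mean_down = N_down * N_sys / (N_res + N_sys)
        print(N_res, round(p_all_up, 3), round(mean_down, 3))
    # N_res = 10,000    -> p ≈ 0.61,  ~0.5 misaligned dipoles left in the system
    # N_res = 1,000,000 -> p ≈ 0.995, ~0.005 left in the system
    ```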

    So compensation works again. Although the entropy of the system must go down in order for all of its dipoles to line up, the decrease in its entropy is more than compensated by an increase of entropy in the reservoir. The calculations are simple enough to be checked by anyone familiar with combinatorics.

  9. I would offer a few suggestions based on decades of experience with the kind of misconceptions that often get introduced into physics despite the best intentions of instructors. Much of this information has been researched and catalogued by the Physics Education Research community within the American Association of Physics Teachers since the 1960s.

    There has been a lot of study, going back to at least the 1960s, of the problems caused by equating disorder with entropy. There is never any need to do so, and it always leads to confusion. I have a shelf full of the best thermodynamics and statistical mechanics textbooks ever written, and most of them are still in print and still used after 50 years. Not one of them equates disorder with entropy; EVER.

    Not one ever makes any connection of entropy with information. Not one ever mentions anything about “compensation.” These other words simply are not necessary; and they add nothing but confusion. If one has inadvertently picked up the habit of using these words in connection with entropy and the second law of thermodynamics, one needs to break the habit.

    There is no such thing as “compensation.” When the entropy of a system decreases – for example, when atoms or molecules condense into a crystalline array – the overall entropy of system plus environment increases because photons are created and leave the system. Photons are bosons; and when they are created in the process of something condensing and cooling, they ARE the new energy microstates that carry energy out of the system.

    Further, once a system of atoms and molecules condenses to the point where the atoms and molecules are strongly interacting among themselves, not only do photons carry energy away, but so do phonons passing through the forming structure to the boundaries of the structure and into the surrounding environment. Particles can also carry energy away.

    There is no such thing as “compensation.” The word is never mentioned in any of the best textbooks. Energy states are created in the form of photons, phonons, or other particles. Photons, phonons, and particles; think of radiation, conduction, and convection in microscopic terms. Think Feynman diagrams. Energy and momentum are conserved but spread around in new photon, phonon, or particle states. Where they go depends on the system and its environment.

    “Disorder” has been another bugaboo misconception; but many systems can have changing entropy and yet there is no change in the “order” of anything in the system. We saw an example of a two-state system on another thread and on this thread. In another example, adding energy to a crystalline system consisting of a collection of Einstein oscillators doesn’t change the order of anything. The mean position of the atoms and molecules remains constant for small changes in energy even though entropy changes. Those mean positions begin to change as the oscillations cease to be harmonic and the system begins to dissociate.

    This battle to get terms like “disorder” and “compensation” out of the popularizations and textbooks for non-majors has been going on since the 1970s and 1980s after a series of bad popularizations and the scientific creationists started spreading these memes. It has been an annoying and uphill battle ever since. Frank L. Lambert, Emeritus Professor of Chemistry at Occidental College, has been battling to get these errors out of chemistry textbooks for at least a couple of decades now.

    There has been progress; and the Physics Education Research community has been making recommendations that have been finding their way into the more recent textbooks for non-majors and in the newer textbooks for physics majors.

    The best textbooks over the last fifty years have never made those mistakes. The newer textbooks are beginning to address these misconceptions directly.

  10. Part of Sewell’s problem is that he misunderstands the compensation argument, as this quote (from his ENV post) reveals:

    If you want to show that evolution does not violate the second law, you cannot simply say, sure, evolution is astronomically improbable, but the Earth is an open system, so there is no problem as long as something (anything!) is happening outside the Earth that, if reversed, would be even more improbable.

    No one is claiming that a bucket of lukewarm water can spontaneously change to half ice, half steam as long as there is a compensatory entropy increase in the Andromeda Galaxy. The entropy increase has to be caused by an interaction between the system and its surroundings, as in olegt’s two examples.

    Sewell almost gets it when he writes:

    if an increase in order is extremely improbable when a system is isolated, it is still extremely improbable when the system is open, unless something is entering (or leaving) which makes it not extremely improbable.

    It’s just that he doesn’t seem to understand that energy entering and leaving the system in the form of radiation is enough to allow for local decreases of entropy on earth, with no violation of the second law.

  11. The fundamental problem is that intelligence = magic. Intelligence has no limits to its capabilities. Intelligence is non-physical.

  12. Mike Elzinga: I have a shelf full of the best thermodynamics and statistical mechanics textbooks ever written, and most of them are still in print and still used after 50 years. Not one of them equates disorder with entropy; EVER.

    Agreed.

    Not one ever makes any connection of entropy with information. Not one ever mentions anything about “compensation.” These other words simply are not necessary; and they add nothing but confusion. If one has inadvertently picked up the habit of using these words in connection with entropy and the second law of thermodynamics, one needs to break the habit.

    Stat mech textbooks are written from a physicist’s perspective. They may not be the best source for relatively new cross-disciplinary topics such as this one. Here is an excerpt from Charles Bennett’s article Notes on Landauer’s principle, reversible computation, and Maxwell’s Demon, arXiv:physics/0210005.

    Landauer noted that the logical state often evolves irreversibly, with two or more distinct logical states having a single logical successor. Therefore, because Hamiltonian/unitary dynamics conserves (fine-grained) entropy, the entropy decrease of the IBDF* during a logically irreversible operation must be compensated by an equal or greater entropy increase in the NIBDF and environment. This is Landauer’s principle.

    I think you should not fixate on unfamiliar terminology but instead get to the point.

    *IBDF = information-bearing degrees of freedom.
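    To put a number on the connection Bennett describes: Landauer’s bound says that erasing one bit costs at least k_B ln 2 of entropy, that is, at least k_B T ln 2 of heat dumped into the environment. A quick sketch at room temperature:

    ```python
    # Landauer bound per erased bit (room temperature assumed).
    import math

    k_B = 1.38e-23                    # J/K
    T = 300.0                         # K

    s_per_bit = k_B * math.log(2)     # ~9.6e-24 J/K of entropy per erased bit
    q_per_bit = T * s_per_bit         # ~2.9e-21 J of heat per erased bit
    print(s_per_bit, q_per_bit)
    ```

    Tiny per bit, but not zero: the information-bearing degrees of freedom are not exempt from the thermodynamic bookkeeping.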

  13. keiths: No one is claiming that a bucket of lukewarm water can spontaneously change to half ice, half steam as long as there is a compensatory entropy increase in the Andromeda Galaxy. The entropy increase has to be caused by an interaction between the system and its surroundings, as in olegt’s two examples.

    In fairness to Sewell, that is not his misunderstanding. In the AML article he quotes Asimov, who suggests that the reduction of entropy on Earth is compensated by an increase of entropy in the Sun. That is complete nonsense. Furthermore, Sewell is aware of the local variants of the compensation argument. In the AML article he writes:

    Some other authors appear to feel a little silly suggesting that increases in entropy anywhere in the universe could compensate for decreases on Earth, so they are careful to explain that this ‘‘compensation’’ only works locally; for example in Order and Chaos [4], the authors write:

    In a certain sense the development of civilization may appear contradictory to the second law. . . . Even though society can effect local reductions in entropy, the general and universal trend of entropy increase easily swamps the anomalous but important efforts of civilized man. Each localized, man-made or machine-made entropy decrease is accompanied by a greater increase in entropy of the surroundings, thereby maintaining the required increase in total entropy.

    Let’s not caricature Sewell’s position.

  14. olegt: Stat mech textbooks are written from a physicist’s perspective. They may not be the best source for relatively new cross-disciplinary topics such as this one. Here is an excerpt from Charles Bennett’s article Notes on Landauer’s principle, reversible computation, and Maxwell’s Demon, arXiv:physics/0210005.

    Landauer is part of my generation; I am quite familiar with his work.

    Landauer didn’t make the mistake of disconnecting the state of an array of bits from the amount of energy required to flip them from one state to another. Recall that early core memory involved arrays of millions of little magnetic toroidal rings with read/write wires threading them. It took energy both to write and read them.

    I remember these machines well. When they entered an intense computational stage, you could hear the core memory whistle as millions of rings switched during read/write operations. There was no question that energy was being consumed in both the writing and the reading of these cores.

    So there was a one-to-one correspondence between bits and energy; but the primary point to always remember in thermodynamics and statistical mechanics is that it is always about energy. A bit, whether read or written, involves the transfer of discrete units of energy.

    That remains true of computers today, even though core memory is a thing of the past. Any transition from low to high or from high to low is an electromagnetic pulse that comes from a source and gets dumped into a sink.

    Contrary to Landauer and to physics generally, ID/creationists seem to think “information” is free and that it moves atoms and molecules around without expending energy or being detectable.

  15. Oleg,

    I haven’t caricatured Sewell’s position. He genuinely misunderstands the compensation argument. Read his words again:

    If you want to show that evolution does not violate the second law, you cannot simply say, sure, evolution is astronomically improbable, but the Earth is an open system, so there is no problem as long as something (anything!) is happening outside the Earth that, if reversed, would be even more improbable.

    And from the AML article:

    Of course the whole idea of compensation, whether by distant or nearby events, makes no sense logically: an extremely improbable event is not rendered less improbable simply by the occurrence of ‘‘compensating’’ events elsewhere. According to this reasoning, the second law does not prevent scrap metal from reorganizing itself into a computer in one room, as long as two computers in the next room are rusting into scrap metal—and the door is open.

    Sewell’s version of the compensation argument is a strawman.

  16. keiths: I haven’t caricatured Sewell’s position. He genuinely misunderstands the compensation argument. Read his words again:

    That is precisely what we are doing in this thread. Since we don’t have Sewell himself to talk to, we are reading his articles and discussing the arguments contained in there.

    Sewell’s version of the compensation argument is a strawman.

    Indeed, there are parts of the article whose sole purpose is to ridicule what he perceives to be the position of his opponents. I am not interested at all in discussing these things. If there is interest in doing just that, by all means let’s open a new thread and have at it. As I wrote in the opening post, this thread is intended for a discussion of the technical merits of Sewell’s arguments.

  17. I have argued several times (in posts at Panda’s Thumb) that Sewell’s argument, if accepted, would prove that plants can’t grow. A single seed becomes a plant that has multiple seeds. That is an increase in the local concentration of energy. Sewell, implicitly arguing about the biosphere, says that such an increase is not consistent with the 2LOT:

    … if all we see entering is radiation and meteorite fragments, it seems clear that what is entering through the boundary cannot explain the increase in order observed here.

    The “radiation” entering the growing plant is of course solar radiation, which explains where the energy content of the plant comes from. Sewell is right that any old compensation is not good enough, but in this case the energy coming from the sun is actually entering the plant. So by implying that the 2LOT is inconsistent with the energy concentration in an evolving biosphere, Sewell has simply misapplied the 2LOT by ignoring an important energy flow into life, a flow that also goes into each growing green plant.

    So am I wrong? Have I misunderstood him?

  18. Joe,

    That’s right. By Sewell’s argument, the second law is violated by plant growth, by solar, wind, and hydroelectric power, and by fossil fuels, all of which are ways (direct and indirect) by which incoming solar radiation can be used to create local decreases in entropy (by running a refrigerator, say).

    His choices seem to be:
    1) to admit that his argument is flawed; or
    2) to maintain that all of the above are exempt from the second law, perhaps because they (in his view) are all designed.

    I’m sure even Sewell realizes how ridiculous it would be to argue that a solar-powered refrigerator violates the second law.

  19. Oleg,

    Indeed, there are parts of the article whose sole purpose is to ridicule what he perceives to be the position of his opponents. I am not interested at all in discussing these things.

    He’s not just ridiculing the compensation argument, he’s trying to refute it.

    As I wrote in the opening post, this thread is intended for a discussion of the technical merits of Sewell’s arguments.

    Which is what both of us are doing — I, by pointing out that Sewell is battling a strawman version of the compensation argument, and you, by giving two examples of how the compensation argument actually works.

  20. keiths:
    Joe,

    That’s right. By Sewell’s argument, the second law is violated by plant growth, by solar, wind, and hydroelectric power, and by fossil fuels, all of which are ways (direct and indirect) by which incoming solar radiation can be used to create local decreases in entropy (by running a refrigerator, say).

    His choices seem to be: 1) to admit that his argument is flawed; or 2) to maintain that all of the above are exempt from the second law, perhaps because they (in his view) are all designed.

    I’m sure even Sewell realizes how ridiculous it would be to argue that a solar-powered refrigerator violates the second law.

    I wouldn’t be so sure. He may not be evaluating the 2LOT, but the C2LOT, the Creationist Second Law of Thermodynamics, according to which once something is designed by intelligence, it is allowed to violate the 2LOT. But of course he is telling his audience that he is evaluating the 2LOT, not the (imaginary) C2LOT.

    As for where Sewell is making technical mistakes, I would point to two:

    1. His X-entropies, even if the equations for them are correct, isolate the concentration of one quantity from all others, not allowing chemical or nuclear reactions to create or destroy the substance. For example, we could make an X-entropy for carbon dioxide. We could have equations for the changes in concentration of CO2, but these would not have terms for the creation of CO2 by respiration or by some geochemical processes, and they would not have terms for the destruction of CO2 by photosynthesis. So the equations can be correct but their application to the real world wrong. (A schematic illustration of this point appears at the end of this comment.)

    2. Leaving aside the issue of X-entropies and just looking at the energy flows, Sewell wants to argue that his math shows that evolution cannot make organisms more complex and energy-rich. Here he gets very handwavy and vague, and that is telling. In fact he is ignoring the role of solar radiation in powering the processes in the biosphere. He just says that “all we see entering [the biosphere] is radiation” and expects his readers to dismiss the idea that this radiation could be important. In short, even if his equations are all correct, he has misapplied them by ignoring a major fact explained in science classes. Not just in college science classes, not just in secondary school science classes, but in middle-school science classes.

    To my mind, evaluating his equations is not the main issue, given this outrageous mistake he made in applying them.
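    Here is the schematic illustration of point 1 promised above. It is a toy one-dimensional model, not Sewell’s own equations: pure diffusion conserves the total amount of a substance, while a source/sink term (respiration, photosynthesis, geochemical processes) changes it, and it is exactly such terms that the X-entropy equations leave out.

    ```python
    # Toy 1-D illustration: diffusion alone conserves the total amount of a
    # substance; adding a source/sink term changes it. (Parameters are
    # arbitrary; this is only a schematic sketch.)
    import numpy as np

    D, dt, dx, steps = 0.1, 0.01, 1.0, 1000
    c0 = np.zeros(50)
    c0[25] = 1.0                                       # initial blob of "CO2"

    def step(c, source=0.0):
        lap = np.roll(c, 1) - 2 * c + np.roll(c, -1)   # periodic Laplacian
        return c + dt * (D * lap / dx**2 + source)

    c_pure, c_src = c0.copy(), c0.copy()
    for _ in range(steps):
        c_pure = step(c_pure)                          # no creation or destruction
        c_src = step(c_src, source=0.001)              # uniform source term
    print(c_pure.sum(), c_src.sum())                   # ~1.0 vs ~1.5
    ```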

  21. Joe Felsenstein: I have argued several times (in posts at Panda’s Thumb) that Sewell’s argument, if accepted, would prove that plants can’t grow. A single seed becomes a plant that has multiple seeds. That is an increase in the local concentration of energy. Sewell, implicitly arguing about the biosphere, says that such an increase is not consistent with the 2LOT

    Joe, I think you are referring to this post at Panda’s Thumb. In it, you extend Sewell’s argument, then show that the extended argument leads to a contradiction. In this thread, I would like to engage his original arguments directly. As far as I can see, they boil down to points 1 and 2 listed above.

  22. If we take a string of letters that codes for some protein and look at the fundamental processes that occur, what we see is that there is replication of that string, changes in the copies of that string, and selective pruning of the copies of that string. We’ll pick something nice and simple, say thermal tolerance, so that some strings are more likely to survive at elevated temperatures than others. After a number of generations in a single contained environment, there will be one or more new strings that are more capable of surviving in that environment than the parent. That is evolution in its most basic state, and it underpins all of evolution: fundamentally, in every evolutionary change, this is what is happening. Sewell et al. intentionally reach for higher-level concepts in order to obfuscate the issue, but they have nothing that demonstrates any violation of any law of physics, either experimentally or conceptually.

  23. olegt:
    … In this thread, I would like to engage his original arguments directly. As far as I can see, they boil down to points 1 and 2 listed above.

    Point 2 of your post was:

    Informational entropy and thermal entropy are unrelated quantities, so adding them makes no sense. The 2nd law must be applied to each of them separately.

    I have just read Sewell’s AML not-quite-article again. I don’t see point 2 there. Where do you see him making that argument?

  24. Again, to refute Sewell just demonstrate that blind and undirected chemical processes can produce a living organism from non-living matter.

    Nothing else will do.

  25. From that October 31, 2011 Sewell “defense” of his rejected paper we read:

    According to Styer and Bunn, the Boltzmann formula, which relates the thermal entropy of an ideal gas state to the number of possible microstates, and thus to the probability of the state, can be used to compute the change in thermal entropy associated with any change in probability: not just the probability of an ideal gas state, but the probability of anything. This is very much like finding a Texas State Lottery sheet that lists the probabilities of winning each monetary award and saying, aha, now we know how to convert the probability of anything into its dollar equivalent.

    Styer and Bunn said no such thing; they would never make such a stupid assertion as that the Boltzmann formula can be used to convert the probability of anything into a thermal entropy. The concept of entropy has an entire context and a whole set of interrelated concepts within which it is used properly. Boltzmann’s formula doesn’t just apply to an ideal gas. Sewell has no idea what it means or how it is used in thermodynamics and statistical mechanics. This is Sewell misconstruing concepts he knows nothing about. And he is doing it to mock things he has never understood.

    The larger question is why he would do this publicly, right out in the open, without ever learning what the concepts of entropy and the second law are all about. He continues to double down and repeat his kvetching over on UD without ever checking to see whether what he is claiming makes any sense.

    If he really understood entropy and the second law, and if he really had anything of importance to say about either of them, he would have submitted his “paper” to Physical Review Letters because it would be so important that it would change everything anyone knows about physics.

    But we know why he didn’t do this. Either he knows deep down that what he is claiming is pure crap, or he is so delusional that he really thinks he has outflanked the entire physics community. Or maybe it is just pure wickedness in feeding a bunch of poor rubes a bunch of phony arguments so that they will make complete fools of themselves as they march forth boldly and impale themselves on swords and spears they don’t even know exist.

  26. I don’t think the 2nd law should be used to argue against the increase in information. When disk drives are being written with information, they heat up and thus increase in temperature and entropy.

    In creating an informed structure, like, say, a disk drive with more information, or a biological organism, the symbolic system reduces “configurational entropy” while possibly increasing thermal entropy.

    I don’t think Dr. Sewell’s formulation is correct. See my public position here:

    http://idea-gmu.blogspot.com/2006/03/bad-arguments-from-2nd-law-of.html

  27. ID arguments would have a better foundation if they began with ideas borrowed from statistical mechanics rather than using the 2nd law. The 2nd law is the wrong theorem to begin working with. Since the 2nd law can be derived from statistical mechanics, the principles of statistical mechanics are a more fundamental set of axioms.

    But even then, the body of literature for statistical mechanics, as it stands, won’t fit well with ID conceptions, since quantities of interest to design arguments (such as symbols) are not of great interest to the traditional field of statistical mechanics. However, some of the math from that discipline can be reused, since Shannon borrowed some of it in his theories.

  28. I think this thread is a very good idea, though it may be a bit late, since Sewell’s ideas have not changed much since the original Mathematical Intelligencer (MI) articles in 2000/1. As a general comment, I am not sure that the two points in the starting entry from olegt are that separate in Sewell’s presentation, and this produces problems in trying to deal with the X-entropy idea. Mike Elzinga’s notes are excellent!

    My own thoughts on this have appeared in an MI article, “Is There Any Conflict Between Evolution and the Second Law of Thermodynamics?” It is only available on-line at the moment, published 24th February 2012 as a “Viewpoint” (DOI:10.1007/s00283-012-9277-0). In summary, Sewell’s description of his X-entropies, and his thought experiment in the MI article, lead to the conclusion that various well-known and well-understood physical effects, such as thermoelectric potential differences (the Seebeck effect) and thermally generated concentration changes (the Soret effect), should not occur. I also speculate on how this mistaken idea of X-entropies may have arisen.

  29. bobl,

    Hi Bob,

    Thanks for your very nice and pertinent comment! Your viewpoint article in The Mathematical Intelligencer does a great job dissecting Sewell’s paper.

    I might add that Sewell’s “X-entropies” seem to be well-understood thermodynamic quantities. For example, his “carbon entropy” is simply the chemical potential.

  30. olegt:
    bobl,

    Hi Bob,

    Thanks for your very nice and pertinent comment! Your viewpoint article in The Mathematical Intelligencer does a great job dissecting Sewell’s paper.

    I might add that Sewell’s “X-entropies” seem to be well-understood thermodynamic quantities. For example, his “carbon entropy” is simply the chemical potential.

    Hi Oleg,
    Thank you for the first paragraph.

    However, whatever the X-entropies are, they are definitely NOT chemical potentials! The chemical potential is temperature dependent, and this is exactly what Sewell will not allow for an X-entropy; if his carbon entropy were temperature dependent his MI thought experiment would not work for him. The description of my argon version of this experiment could be re-written in terms of chemical potential instead of pressure, but that might make it less easy to follow…..

  31. Postscript to the above: the Mathematical Intelligencer article referred to in my first post has now appeared in print, Volume 34, Issue 1 (Spring), p. 29.

  32. I saw Sewell’s latest rebuttal last night. It is not entirely new. It recycles portions of his previous ENV article “More Philosophical than Scientific”: Parsing a Rationalization that I previously mentioned.

    In that earlier article he ranted about different kinds of entropy being independent and unable to compensate for each other’s decrease. That point was mentioned only in passing in the AML piece, but an expert could see this crackpottery from miles away. Bob Lloyd surely did, and gave Sewell a well-deserved spanking in the MI letter.

    Sewell’s reaction? Furious back-pedaling and dog-ate-my-homework excuses. 🙂 I’ll explain that later, when WP stops acting up.

  33. I wish they would stop confusing censorship with quality control. Sewell’s paper has not been censored. It is actually directly linked to in Bob Lloyd’s article, in its “In Press” format, and so any MI readers who are interested can access it directly.

  34. Sewell, in that latest “defense” of himself, clearly doesn’t understand the history of the conflation of disorder with entropy. As a result, he doubles down on his assertion that his “X-entropies” are contributing to disorder.

    It appears that, in Sewell’s mind, simply adding all the disorders together is equivalent to increasing entropy. That seems to be the basis of his claim that his “X-entropies” are independent of each other; hence, this appears to be the reason he throws anything, including the kitchen sink, into his equation.

    But entropy is not disorder; it never has been, and never will be. Sewell quotes Lloyd, who says,

    It is difficult to know where this mistaken idea, that entropy can be separated into independent components, has come from. One possibility is that this comes from assuming a precise equivalence between entropy, to which the formalisms of thermodynamics apply, and disorder, which is too ill-defined for thermodynamics to be applied.

    This goes right over Sewell’s head. Sewell still has absolutely no comprehension of the meaning of entropy, nor does he have the historical perspective on the problem physicists have been battling as a result of the memes propagated by Henry Morris, Duane Gish, and the other creationists beginning in the 1970s. There have also been somewhat similar misconceptions propagated during that same time by a few well-meaning authors of physics texts for non-majors; I have a couple of these on my bookshelves, and the physics community was aware of the problem at the time.

    Unfortunately, the Creationists of that time were extremely effective in getting others, especially the general public, to adopt Creationist misconceptions and memes, including the meme that entropy is disorder. Creationists got the attention of the news media and were able to obtain multi-page spreads in local newspapers that willingly propagated their message.

    Sewell is mindlessly continuing the tradition of spreading the memes without checking the real physics textbooks on thermodynamics and statistical mechanics. Textbooks written back then are still classics, are still in print, and are still used in courses for physics majors. None of these textbooks makes the mistake of equating entropy with disorder. Yet Sewell attempts to defend himself with endless pseudophilosophy about the meanings of words and the meanings of meanings of words.

    No matter how ID/creationists try to rationalize it, the notion that entropy equals disorder is meaningless. Entropy is NOT disorder. Entropy also has nothing to do with “information.” That conflation is also meaningless.

    Sewell’s defensive remarks reveal many other serious misconceptions he still harbors in his imagination; and these are the typical shibboleths that identify most ID/creationists. If Sewell had anything serious to say about the second law of thermodynamics and entropy, the most logical place for him to have sent his paper would be Physical Review Letters.

  35. Perhaps one of you guys would like to start a new thread on this? Otherwise I will sticky this one for a bit (although we have a few stickied threads right now, so a new one would be better).

  36. The entropy of [a] system is the average heat capacity of the system averaged over its absolute temperature … Entropy should not and does not depend on our perception of order in the system. The amount of heat a system holds for a given temperature does not change depending on our perception of order. Entropy, like pressure and temperature is an independent thermodynamic property of the system that does not depend on our observation.

    Entropy Is Not Disorder, Steve Donaldson, January 4th 2011
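    For reference, the standard relation the quote is presumably summarizing is S(T) = ∫_0^T C_V(T′)/T′ dT′ for a system heated at constant volume from absolute zero: the entropy is fixed by measurable heat capacities and temperatures, and has nothing to do with anyone’s perception of order.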

  37. MI’s publisher Springer may get some foretaste of what will happen when they decline to publish Biological Information: New Perspectives

  38. This reply comes very late, but, just for the record, here’s the following:

    If the body is subject to no interactions other than changes in external conditions, it is said to be thermally isolated. It must be emphasized that, although a thermally isolated body does not interact directly with any other bodies, it is not in general a closed system, and its energy may vary with time. . . .

    This leads to the result that the law of increase of entropy is valid not only for closed systems but also for a thermally isolated body, since here we regard the external field as a completely specified function of co-ordinates and time, and in particular neglect the reaction of the body on the field. That is, the field is a purely mechanical and not a statistical object, whose entropy can in this sense be taken as zero…

    Let us suppose that a body is thermally isolated, and is subject to external conditions which vary sufficiently slowly. Such a process is said to be adiabatic. We shall show that, in an adiabatic process, the entropy of the body remains unchanged, i.e. the process is reversible.
    [Landau and Lifshitz, Statistical Physics, 1969 edition, p. 37]

    A membraned sacule close to a thermal vent is thermally isolated. Any temperature changes that occur, occur slowly and, on average, would likely be almost zero; that is, the process is adiabatic and, hence, “reversible”.

    We can thus conclude that the ‘entropy’ of the sacule will, at most, not change.  
    (Unless, of course, you believe that a tornado blowing through a junkyard can build a Boeing 747.)

    This is the problem that evolution faces.  Common sense tells us this much, right? 

  39. Indeed! I don’t know if Oleg still lurks here or whether anyone else wants to pick up on your point. Anyway, welcome, Lino. Your first comment was held in moderation and, as you have posted again, I am assuming it is a duplicate. I’ll delete it, but not permanently, in case you’d prefer the first version to go through. Let me know.

    Added in edit:

    Could you explain what you mean by “membraned sacule”? Google gives me “saccule” as “bed of sensory cells in the inner ear”?

    Regarding thermal vents, the hot water spilling into the surrounding very cold water produces turbulence at the margin. I would have thought that, far from being stable, temperatures there could momentarily swing over a huge range, maybe from 3 to 400 °C.

  40. This is the problem that evolution faces.  Common sense tells us this much, right? 

    What, exactly, is the “problem” you see?  

  41. Lino di Ischia,

    I think there are two problems with your point. One is that evolution is not about the origin of life, but about the behaviour of reproducing, interacting populations once it has started. So it’s not a problem that ‘evolution’ faces, though it is a problem for OoL chemists.

    The other issue is that thermodynamics is not all about heat. Heat is often the means by which energy leaves a system that is following its thermodynamic gradient, but I have never encountered anyone suggesting that heat is the source of energy for living systems. The likeliest source of primitive biological energy IMO is the tendency of electrons to follow a gradient of electronegativity. It is the basis of ‘spontaneous’ chemistry – the classic exothermic reaction. Energy is conserved, so if the resulting state has less energy than the starting state, the difference must go somewhere: hence the heat.

  42. Just for the record Lino, presumably you are not the first person to notice this “problem that evolution faces”. 

    Can you link to a more formal description of the problem couched in biological terms?  

  43. Hello Lino,

    You are confused about adiabaticity and thermal isolation. In an adiabatic process, a physical object (e.g., a gas contained in a cylinder) exchanges no heat with the environment and its mechanical variables (e.g., volume) are changed slowly. Again, two conditions are required for adiabaticity: thermal insulation (lack of heat exchange) and slowness of mechanical changes.

    The slowness of mechanical changes alone does not mean that a process is adiabatic. A slow isothermal expansion of a volume of gas is not an adiabatic process. Its temperature is kept constant at the expense of bringing in heat from the outside.

    In your example, a saccule is not thermally insulated from the environment. It digests food and dumps heat into the environment, as all organisms do. In the process, it converts low-entropy chemical energy into high-entropy heat, thus producing an enormous amount of entropy, which dwarfs any amount of information you can reasonably imagine.
