Entropy and Disorder from a creationist book endorsed by a Nobel Laureate

Here is an online creationist/ID book.
From http://www.ccel.us/gange.app.html#App

“I was particularly pleased with Dr. Gange’s refusal of the idea of materialism, and the convincing arguments supporting that refusal. In fact, the book will be a welcome response to materialism. Good luck, for a good book!”

Eugene Wigner, Nobel Laureate in Physics

The book had an appendix on thermodynamics.

“We noted earlier that entropy can be correlated-but not identified-with disorder. And we said, moreover, that this correlation is valid in only three cases-ideal gases, isotope mixtures, and crystals near zero degrees Kelvin. The truth of the matter is illustrated by considering the two chemically inert gases, helium, and argon.(7) In our mind’s eye we imagine two balloons, one filled with helium and the other with argon. First, we lower the temperature of both balloons to zero degrees Kelvin. This makes all the gas molecules stop moving in either balloon. Next, we get the molecules moving by heating both balloons to 300 degrees Kelvin (room temperature). Were we to mathematically calculate the increase in entropy, we would find that it was 20 percent higher in the argon balloon than in the helium balloon (154 v. 127 joules per mole per degree Kelvin). But since helium molecules are ten times lighter than argon molecules, they are moving three times faster and thus are more disordered. Here, then, is an example where higher entropy is accompanied by lower disorder, thereby demonstrating that we cannot identify one with the other. In the particular example cited, the greater argon entropy comes from the closer quantum translational energy levels identified with its greater molecular mass as described by the Sackur-Tetrode equation.

Let’s look at another example. Were we to continue dissolving salt in an isolated glass of water, we’d reach supersaturation, a point where the water could not dissolve any more salt. Under certain conditions, the dissolved salt can be made to separate from the water. When crystallization happens the entropy always increases. However, the temperature can go up or down, depending on the kind of salt used and the thermochemistry of the solution.(8) This means that the motion of the molecules, and therefore the disorder, can go up or down, whereas the entropy always goes up. A less obvious example is the spontaneous freezing of supercooled water.(9) Again we see that the entropy must increase, whereas the disorder can go up or down.”
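As a sanity check on those numbers, the Sackur-Tetrode equation is straightforward to evaluate. Here is a quick sketch (my own calculation, assuming 1 mole of gas at 300 K and 1 atm, conditions the quote does not spell out); it gives roughly 126 J/(mol K) for helium and 155 J/(mol K) for argon, close to the quoted 127 and 154:

import math

h  = 6.62607e-34   # Planck constant, J*s
kB = 1.380649e-23  # Boltzmann constant, J/K
NA = 6.02214e23    # Avogadro's number, 1/mol
R  = kB * NA       # gas constant, J/(mol*K)
u  = 1.66054e-27   # atomic mass unit, kg

def sackur_tetrode(mass_kg, T=300.0, P=101325.0):
    # Molar entropy of a monatomic ideal gas: S = R * [ln(V/(N*lambda^3)) + 5/2]
    v_per_molecule = kB * T / P                           # volume per molecule (ideal gas law)
    lam = h / math.sqrt(2 * math.pi * mass_kg * kB * T)   # thermal de Broglie wavelength
    return R * (math.log(v_per_molecule / lam**3) + 2.5)

print("helium:", round(sackur_tetrode(4.0026 * u)), "J/(mol K)")   # ~126
print("argon :", round(sackur_tetrode(39.948 * u)), "J/(mol K)")   # ~155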

36 thoughts on “Entropy and Disorder from a creationist book endorsed by a Nobel Laureate”

  1. Here’s a link to what appears to be the table of contents of the complete book. Sal’s link is only to the appendix.

    The quote from Wigner appears to be endorsing only what the book author says about materialism. And there is no link or citation provided to allow us to check that Wigner review.

  2. Hi, Sal!

    Can you explain two things to me:

    Firstly, what are you understanding by the term “materialism” here?
    Secondly, how do you think the author is defining “disorder”?

  3. It’s easy to demonstrate that ‘entropy’ and ‘disorder’ are not the same thing. All you need is a deck of cards. If ‘disorder’ and ‘entropy’ genuinely are the same thing, anything that makes entropy go up will also increase disorder; anything that makes entropy go down will also decrease disorder.

    Now, let’s say that a “baseline deck” is one that’s been sorted into suits, in Ace-to-King order within each suit.

    Get yourself a deck of cards, and sort that deck into the ‘baseline’ configuration. Now shuffle the baseline deck.
    Is the shuffled deck’s thermodynamic entropy greater than that of the baseline deck, or less than that of the baseline deck, or the same as that of the baseline deck?
    Is the shuffled deck’s disorder greater than that of the baseline deck, or less than that of the baseline deck, or the same as that of the baseline deck?

    Now leave the deck in your refrigerator for a couple of hours.
    Is the cold deck’s thermodynamic entropy greater than that of the room-temperature deck, or less than that of the room-temperature deck, or the same as that of the room-temperature deck?
    Is the cold deck’s disorder greater than that of the room-temperature deck, or less than that of the room-temperature deck, or the same as that of the room-temperature deck?

  4. Hi Dr. Liddle,

    Wigner probably was not a materialist; he was famous for his view that consciousness is outside the physical world, but important to it. I’m not saying Wigner is necessarily right, but Wiki summarizes Wigner’s sentiments:

    Wigner designed the experiment to illustrate his belief that consciousness is necessary to the quantum mechanical measurement process. If a material device is substituted for the conscious friend, the linearity of the wave function implies that the state of the system is in a linear sum of possible states. It is simply a larger indeterminate system.

    However, a conscious observer (according to his reasoning) must be in either one state or the other, hence conscious observations are different, hence consciousness is not material. Wigner discusses this scenario in “Remarks on the mind-body question”, one in his collection of essays, Symmetries and Reflections, 1967. The idea has become known as the consciousness causes collapse interpretation.

    http://en.wikipedia.org/wiki/Wigner%27s_friend

    Wigner also made a passing remark about Darwin:

    “it is hard to believe that our reasoning power was brought, by Darwin’s process of natural selection, to the perfection which it seems to possess.”

    http://www.dartmouth.edu/~matc/MathDrama/reading/Wigner.html

    Wiki’s bio mentions of Wigner:

    He became interested in the Vedanta philosophy of Hinduism, particularly its ideas of the universe as an all pervading consciousness. In his collection of essays Symmetries and Reflections – Scientific Essays, he commented “It was not possible to formulate the laws (of quantum theory) in a fully consistent way without reference to consciousness.”

    So it is not surprising Wigner endorsed Gange’s book. It would seem that materialism, as used here, is the rejection of the spiritual world (which is outside of ordinary physical law), and Wigner is suggesting agency outside of physical law.

    With respect to disorder, it seems Gange was referring to the more chaotic (higher-velocity) movement of helium in one case. In the case of crystallization, order comes about through the separation of the crystal from the fluid: the process sorts the salt out of the water and localizes it.

    In the case of salt crystals, we call them ordered because, spatially speaking, they obey a geometric law. We tend to call things that can be described by a simple geometric law “ordered”. Hence here is a case where entropy goes up but so does order, the opposite of what most creationists will say.

    I have said entropy is not the same as disorder, nor does increasing entropy imply increasing disorder. I finally found one good exposition in Gange’s book to support my claim (in addition to what has been stated at TSZ on the matter).

    I’m in the minority of creationists in my view, and it is one of the few times I agreed with many of those at TSZ. I posted it since I thought you might enjoy hearing a dissenting view of the claim “entropy is a measure of disorder”. I found this reference only last month, so I thought I’d share it. I haven’t visited TSZ since August or September, but when I heard Barry was visiting recently, I thought I’d drop in too.

    Happy Thanksgiving (an American holiday), by the way….

    We can’t blame creationists, however, for saying “entropy is a measure of disorder”; you see it in places far outside creationist literature, starting with (gasp) Boltzmann.

    “In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.” (Final paragraph of #87, p. 443.)

    Boltzmann

    A professor remarks:

    That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

    http://entropysite.oxy.edu/boltzmann.html

  5. I’m not blaming anyone for saying that entropy is a measure of disorder, but if you are going to say that disorder ISN’T correlated with entropy, then you need to say how you are defining disorder!

    And happy thanksgiving to you too! (Whenever it is – it’s not a UK thing, we not having made the journey to the New World – we do like turkeys and cranberry though).

  6. On the subject of “materialism” – it’s not at all clear to me what it means.

    I don’t even know whether I am one. It literally depends on what the meaning of “is” is.

    Disorder is the opposite of the ordered state. An ordered state is one for which we can give a more algorithmically compressible (Kolmogorov) description of the locations of the particles or components in the system than we can for a disordered state.

    But really, in a sense, the definition of disorder is not my problem. 🙂

    The burden is on those who insist “entropy is a measure of disorder” to define order and disorder. I leave it to them. I view entropy as the log of the number of microstates; the notion of disorder is not in the famous equation:

    S = k log W
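    To make that concrete, here is a toy calculation of S = k log W for coin macrostates (a sketch of my own, not anything from Gange; the coins just stand in for any two-state components):

    import math

    kB = 1.380649e-23  # Boltzmann constant, J/K

    def boltzmann_entropy(n_coins, n_heads):
        # W = number of microstates with exactly n_heads coins showing heads
        W = math.comb(n_coins, n_heads)
        return kB * math.log(W)

    print(boltzmann_entropy(500, 500))  # all heads: W = 1, so S = 0
    print(boltzmann_entropy(500, 250))  # half heads: W is enormous, S is maximal

    Nothing in that calculation appeals to a notion of “disorder”; only the count of microstates enters.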

  8. On the subject of “materialism” – it’s not at all clear to me what it means.

    The quote was from Wigner, so in a sense his writings would be the best reference as to what Wigner meant, not me.

    But if we view the physical world as composed of matter, energy, space, etc., it would seem Wigner does not believe conscious minds fundamentally originate from these. Actually, the suggestion is almost the opposite: that consciousness and mind are more fundamental than matter, energy, space, etc. Many do not agree with this, but I’m merely stating what it appears Wigner believes. That sentiment was surprisingly echoed by Richard Conn Henry:

    The Quantum Enigma of Consciousness and the Identity of the Designer

    I’m not going to defend the view. I’m obviously sympathetic to it, but the “proof” of who is right is above my pay grade. Suffice to say, I have my opinions, I accept them by faith, and it seems to me the questions might not be formally answerable in this lifetime.

  9. stcordova: Disorder is the opposite of the ordered state.

    The trouble is that neither “order” nor “disorder” has a clear meaning.

    Kolmogorov complexity is not well defined, except asymptotically (as the size of the “ordered” system approaches infinity). And even then, it’s a measure for formal abstract structure, not for physical things.

    I would not blame Boltzmann. He had a specific meaning for “order” that he was using. The problem arises when we assume order has a meaning independent of any particular way of ordering.

    I can’t blame the creationists for the confusion, because there’s a lot of similar confusion coming from AI and probably from parts of philosophy.

  10. Neil Rickert,

    Neil Rickert: The trouble is that neither “order” nor “disorder” has a clear meaning.

    That’s not entirely correct. In certain physical contexts, order and disorder have well-defined meanings. Matter can be in qualitatively different phases, e.g., water and ice or paramagnet and ferromagnet. A phase transition resulting in a lowering of the symmetry is known as an ordering transition.

    For example, water (a liquid) is highly symmetric: rotating it through any angle makes no difference. In contrast, crystalline ice has special axes, so that only a rotation through 60 degrees returns it to the same state. The symmetries of rotation (through arbitrary angles) are said to be broken spontaneously in ice. A broken symmetry is what physicists usually call order. Ice is less symmetric than water and thus more ordered.

    A more ordered (less symmetric) phase usually has less entropy than a less ordered (more symmetric) phase. Ice has lower entropy than water. A ferromagnet has lower entropy than a paramagnet. However, this is not always the case. There are some situations where a solid turns into a liquid when the temperature is lowered. In these cases, the solid has a higher entropy than the liquid. Here is a graduate student’s essay about such re-entrant transitions.

    So order and disorder can be defined in a relative, albeit qualitative, sense. However, disorder (so defined) does not always mean higher entropy. There are known exceptions to this.

  11. Lizzie: I’m not blaming anyone for saying that entropy is a measure of disorder

    Given that disorder is not defined, and that entropy is not defined as a measure of disorder, anyone claiming that entropy is a measure of disorder should at least be blamed for being ignorant.

  12. Salvador:

    But really, in a sense, the definition of disorder is not my problem.

    It’s your argument. If the definition of disorder is relevant to your argument then it is your problem.

    Salvador:

    But really, in a sense, the definition of disorder is not my problem.

    In what sense is the definition of disorder not your problem?

  13. Hi Sal. When you reference external work would you mind linking to the original source without changing any of the words or context? Thanks!

    Mung: Given that disorder is not defined, and that entropy is not defined as a measure of disorder, anyone claiming that entropy is a measure of disorder should at least be blamed for being ignorant.

    Well, it’s usually stated as an attempt to explain what “entropy” means in some kind of lay language.

    I think it’s a very poor description, myself, because there are many meanings of the word “disorder” that don’t mean “has a lot of entropy”. For instance, a tornado has much less entropy than still air.

    As for the rest of the book, I did not read all of it, but some parts seemed to be using different metrics than those I’m familiar with (such as the number of bits in the blueprint of a star). The number of bits has a different definition than Shannon’s.

    The reason I learned about the book was that I was explaining to creationists that the 2nd law has little relevance to ID if we go from the Kelvin-Planck or Clausius formulation. They were horrified to hear me disagree with Duane Gish!

    In the discussion, another creationist came to my defense and said, “Sal is right” and pointed to the Appendix in Gange’s book. It was helpful that the dissent from Gish’s claims actually appeared in a creationist book.

    From Gange’s chapter on Entropy:

    Not only is entropy wrongly identified with disorder; the error has caused some people to introduce a nonsensical thing called “negentropy.”15 This idea assumes that negative entropy exists, and that it can be identified with order. It says that the onset of life was accompanied by a change in negentropy that just balanced the increase in entropy, and that this change explains the order found in life.

    However, the idea of negentropy is quite wrong because it is defective in several basic ways.16 Nevertheless, over the past twenty years a number of people have used this erroneous concept in an attempt to justify the creation of life by natural processes.17 To understand how they do this, picture water in an ice cube tray in a refrigerator. Let the tray represent the earth, the refrigerator represent the sun, and the transition of water into ice, the creation of life. The order that arises when water becomes ice is said to be balanced by the disorder that occurs when the liquid refrigerant changes into a gas (molecules are in greater disarray in a gas than a liquid). The spontaneous creation of life is then justified by saying that the increased order in life corresponds to negentropy that is offset by the greater increase in entropy (disorder).

    But life is complex, not ordered; and the basic natural process that changes water into ice is counterproductive to the creation of life because it results in a loss of complexity. The reason that ice is ordered and not complex is that ice is made up of millions of tiny atomic units that are identical to each other. This means that if we describe one of them, we will have described all of them. When water changes into ice, we actually need less information to describe the ice because its molecules no longer behave in an independent manner. It is less complex because we need less information to describe it. But unlike ice, life is vastly complex because we require staggering quantities of information to describe even the simplest of cells. Negentropy is thus an erroneous concept.

    ….
    The important truth for our discussion is that life isn’t ordered; it is complex. We saw that clearly in chapter 4 in our consideration of the materialistic world view. An increase in organization of a structure — from simple dust particles to the oriental rug to the vacuum cleaner to the house (to repeat an earlier metaphor) — requires the systematic increase of information, but information is not produced by natural processes in the magnitude necessary to explain the origin of life.

    Let’s summarize what’s been said thus far:

    1. The Second Law requires entropy to increase.
    2. Entropy cannot be identified with disorder.
    3. Negentropy cannot be identified with order.
    4. Ordered molecules (ice) present less information.
    5. Living cells are not ordered — they are complex.

    That seems correct. I’ve heard the term negentropy float around in the origins debate, but I’ve never seen it in physics textbooks.

    I dropped out of the discussion just as it got interesting, when there was talk of the generalized 2nd law of thermodynamics. Gange talks about the generalized 2nd law, information, and Maxwell’s demon. The generalized 2nd law was stated in the appendix. It relates information and the 2nd law.

    I’ve seen treatments relating information with thermodynamics. For example, the Landauer principle, which says (from wiki):

    Landauer’s principle asserts that there is a minimum possible amount of energy required to change one bit of information, known as the Landauer limit:
    kT ln 2,

    http://en.wikipedia.org/wiki/Landauer's_principle
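    Just to put a number on that limit, here is a quick back-of-the-envelope sketch (my own, assuming room temperature, T = 300 K):

    import math

    kB = 1.380649e-23  # Boltzmann constant, J/K
    T = 300.0          # room temperature, K

    E_bit = kB * T * math.log(2)
    print(f"Landauer limit at {T:.0f} K: {E_bit:.2e} J per bit")  # about 2.9e-21 J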

    There seem to be various definitions of what the generalized 2nd law is. I got a passing mention of Bekenstein’s version of the generalized 2nd law in class, but that does not seem to be the generalized 2nd law that Gange is talking about.

    Here is Gange’s treatment in the appendix of the generalized 2nd law:
    http://www.ccel.us/gange.app.html#pp

    So at least on the surface Gange’s book seems to have a correct conception of thermodynamics.

    Where his discussion seemed less coherent was in connecting thermodynamic evolution in the universe with the evolution of information. That seemed like hand-waving, in contrast with his excellent discussion of the 2nd law itself, even the generalized 2nd law.

  16. Didn’t Salvador already admit his ignorance regarding physics? Why should we believe he has anything more intelligent to say about thermodynamics in general, and entropy specifically?

    Elizabeth:

    Well, it’s usually stated as an attempt to explain what “entropy” means in some kind of lay language.

    I think it’s a very poor description, myself, because there are many meanings of the word “disorder” that don’t mean “has a lot of entropy”.

    Yes, I pointed out Sal’s confusion over at UD and now I am “banned” from his threads there for being a “troll.”

    But Sal has four degrees. Surely one of them must be relevant!

  17. Mung,

    Consider a living rat initially at room temperature that is taken down to near absolute zero. The previously living rat is now dead and frozen.

    Is the entropy lower or higher for the rat in the frozen-dead state than it was in the living-warm state?

    Did the process of lowering entropy kill the rat?

  18. That’s a puzzling response, if it related to the ‘disorder’ question (or, indeed, to any other!). Imagine boiling a rat … Imagine a rock falling off a cliff onto a rat … Imagine chucking a shoe at a rat … Energy transfer (and hence entropy change) either into or out of the rat can be responsible for its demise if its main energy transfer process (which is chemical and electrical, not thermal) is interrupted.

  19. Sal is going to town at UD about 500 coins and “the chance hypothesis.”

    If you came across a table on which was set 500 coins (no tossing involved) and all 500 coins displayed the “heads” side of the coin, would you reject “chance” as a hypothesis to explain this particular configuration of coins on a table?

    In reply, Mark Frank and Nick Matzke have correctly noted that there is more than one “chance hypothesis” and that

    probability calculations depend on the model that you assume for the process generating the outcomes.

    To sharpen it up, here is a chance hypothesis known as the microcanonical ensemble in statistical physics. All microstates with the same energy are equally probable. In order to evaluate whether an all-heads microstate is compatible with this chance hypothesis, one must specify what the energy of the system is and give rules for calculating it (i.e., specify a Hamiltonian). Let’s take the Ising model in two dimensions, where two adjacent coins have a lower energy if they are both heads or both tails. At high energy (or, equivalently, high temperature), we will tend to find equal numbers of heads and tails. At low energy (low temperature), the coins will be predominantly heads or predominantly tails.

    The above is a concrete chance hypothesis that is compatible with an uneven distribution of heads and tails. Nick and Mark are totally right.
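    For anyone who wants to play with this, here is a minimal simulation sketch (my own illustration, not something from the thread). It samples the 2D Ising “coins” at fixed temperature with the Metropolis rule rather than at fixed energy, which gives the same qualitative picture as the microcanonical ensemble:

    import math, random

    def magnetization(L=16, T=5.0, sweeps=3000, seed=1):
        # Mean "coin" value per site (+1 = heads, -1 = tails) after Metropolis sampling.
        rng = random.Random(seed)
        s = [[rng.choice((-1, 1)) for _ in range(L)] for _ in range(L)]
        for _ in range(sweeps * L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            # energy change if the coin at (i, j) is flipped (periodic boundaries, coupling J = 1)
            nb = s[(i + 1) % L][j] + s[(i - 1) % L][j] + s[i][(j + 1) % L] + s[i][(j - 1) % L]
            dE = 2 * s[i][j] * nb
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                s[i][j] = -s[i][j]
        return sum(map(sum, s)) / (L * L)

    # Above the critical temperature (about 2.27): heads and tails in roughly equal numbers.
    print("T = 5.0:", magnetization(T=5.0))
    # Well below it: heads or tails dominate (spontaneous symmetry breaking). A quench from a
    # random start occasionally sticks in a striped state; rerun with another seed if so.
    print("T = 1.0:", magnetization(T=1.0))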

  20. olegt,

    In the language of statistical mechanics, the high-temperature state (equal numbers of heads and tails) is known as the paramagnetic state and the low-temperature state (heads dominate or tails dominate) as the ferromagnetic state. Spontaneously broken symmetry is another technical term.

  21. Richardthughes:

    Has Sal ever got anything involving math right?

    It is strange to watch the crowd over at UD struggling with probability and statistics at the level taught in high school Advanced Placement Statistics. I have known bright 9th grade students who get this stuff easily.

    It appears that Cordova is attempting to derive an ID/creationism version of “statistical mechanics” from the “law of large numbers” in order to “clarify” Granville Sewell’s attempts and beef up ID/creationist “physics”.

    Let’s see what he comes up with on his own. He doesn’t need any input from us; he just has to impress the crowd over at UD, which he appears to be doing just fine at the moment.

  22. NickMatzke_SZ:

    I think he’s an engineer, I hope he’s not building bridges!

    Hee hee.

    I wouldn’t want anyone like him running loose in any of my laboratories; he would wreck the place in an instant.

  23. Mike Elzinga:
    Let’s see what he comes up with on his own. He doesn’t need any input from us; he just has to impress the crowd over at UD, which he appears to be doing just fine at the moment.

    It’s like a Dunning-Kruger convention over there now with Sal and Bully leading the parade.

  24. I think what Sal is up to is that he is assuming that if we see Complex Specified Information, in the form of enough Heads, that it can only be produced by Design.

    But of course there are two problems with that:
    1. The form of it he’s using is not the re-described Dembski version. It is the version that can be achieved, bit by bit, by natural selection. He isn’t ruling that out.
    2. The more recent Dembski description is that you’re only allowed to call it CSI if you also know that it cannot be achieved by natural forces such as natural selection. (In other words, only if you’ve already drawn the desired conclusion, ahead of time). And Sal is not enforcing that condition.

    So even if you see 500 Heads, and even if under some null model such as i.i.d. fair coins they are extremely improbable … so what?

    Of course that’s for the accumulation of CSI by natural selection (or not). They’ll probably run off to the Origin Of Life so as to get away from natural selection.

  25. Joe Felsenstein,

    They’ll probably run off to the Origin Of Life so as to get away from natural selection.

    … and cue the naive notion that racemic mixtures of chiral acids present a ‘coin-tossing’ problem to a peptide assembly mechanism (a curious imagined mechanism that can distinguish one acid from another, but not if they have the same molecular weight, polarity etc – as if the mechanism itself was weighing the molecules, or assessing some other gross, non-positional parameter like charge or hydrophobicity).

  26. Joe Felsenstein:
    I think what Sal is up to is that he is assuming that if we see Complex Specified Information, in the form of enough Heads, that it can only be produced by Design.

    But of course there are two problems with that:
    1. The form of it he’s using is not the re-described Dembski version. It is the version that can be achieved, bit by bit, by natural selection. He isn’t ruling that out.
    2. The more recent Dembski description is that you’re only allowed to call it CSI if you also know that it cannot be achieved by natural forces such as natural selection. (In other words, only if you’ve already drawn the desired conclusion, ahead of time). And Sal is not enforcing that condition.

    So even if you see 500 Heads, and even if under some null model such as i.i.d. fair coins they are extremely improbable … so what?

    Of course that’s for the accumulation of CSI by natural selection (or not). They’ll probably run off to the Origin Of Life so as to get away from natural selection.

    We’ll get what we always do: an old argument that never worked and three new sciencey-sounding acronyms. They all want to put their own letters on ‘tornado in a junkyard’. ID now has more acronyms than calculations.

  27. Sorry, I should have clarified that the thread where Sal is talking about 500 heads is actually a thread started by Barry. Sal was doing a lot of commenting in it.

    They apparently think that if an evolutionary biologist admits that 500 coins coming up Heads is a sign of some Design then this validates ID.

    However coins do not have natural selection. If it is 250 nucleotides coming up in a pattern which has a high fitness, then we have to ask whether this could be the result of natural selection. That is not possible with coins.

    If someone says “sure, that pattern of coins indicates something nonrandom”, they will presumably cry “Gotcha!” without explaining to anyone what their argument actually is.

  28. Joe Felsenstein: However coins do not have natural selection.

    They do if we add interaction between coins, as in the Ising model. Arrange the coins on a square lattice and let adjacent coins interact so that the energy of a pair is lowered if they are both heads or both tails, higher if one is heads and the other tails.

    We now specify the chance hypothesis as a uniform distribution subject to a fixed energy. (This is known as the microcanonical ensemble in statistical mechanics.) For energies exceeding a critical value, the numbers of heads and tails will be the same. However, if the energy is below the critical value, then either heads prevail or tails prevail (spontaneous symmetry breaking).

    This is a well-studied model in statistical physics, perhaps the most celebrated phase transition. A “chance hypothesis” can lead either to heads and tails represented equally or to a spontaneous imbalance.

  29. I am not sure that the Ising model is an example of natural selection.

    Alternatively, we could set up a Rube Goldberg machine that sequences DNA and then lays out the coins in a pattern that reflects the DNA sequence achieved by natural selection.

  30. Joe Felsenstein,

    It’s not the same as natural selection, but it can work in a similar way.

    Suppose we start with a completely random state and lower the temperature. States where spins are randomly oriented will be discriminated against, and the system will move towards lower energy until it reaches thermal equilibrium. The selected states will exhibit magnetic order, with either heads or tails prevailing.

  31. Joe Felsenstein: So even if you see 500 Heads, and even if under some null model such as i.i.d. fair coins they are extremely improbable … so what?

    Several years ago I wrote two versions of Dawkins WEASEL program to get some handle on just how powerful natural selection is relative to the interactions among the constituents of an evolving system.

    In one version of the program, I allowed ALL positions of the string to vary randomly for each offspring in each generation. In the other program, only one randomly selected position for each offspring varied in each generation

    In both programs I could dial in varying amounts of “stickiness” (called “latching” in the context of WEASEL) so that, once any given position in the string was a “hit”, it could have a given probability of NOT changing in the next generation. In other words, I could dial in the amount of latching.

    I was expecting that the program that allowed ALL positions to mutate in each offspring in each generation would not converge; however, it does, even though it usually takes a very large number of generations.

    I did a further mathematical analysis of each program to find out what was happening and I discovered that, with no latching and all positions allowed to mutate in each offspring in each generation, the program does indeed produce a “fittest” but the dispersion in the final population remains extremely large. As I dial in more “stickiness” the dispersion in the population diminishes; and even with very small amounts of “stickiness” the dispersion in the population at the end of the process becomes fairly small. Of course, if the stickiness becomes permanent, the entire population at the end consists of individuals that are pretty much all alike.

    The two versions of the program converge differently; and if the “stickiness” is chosen properly, one can get an exponential decay curve (straight line on a log-linear plot) of the difference between the offspring and target.

    When the WEASEL program is reinterpreted as a “gas” of particles condensing into a potential well, the “stickiness” represents the shedding of energy in, say, the form of electromagnetic radiation; i.e., the system is gradually “cooling” and settling into its lowest potential energy.

    The surprising part – to me, anyway – was that selection gets the job done even with no stickiness (energy loss); and it can do it even if ALL positions can vary in all offspring in each generation. Selection is very powerful in finding the “fittest” offspring; and it is the amount of “stickiness” that determines how similar the members of the final population are.
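    For readers who want to experiment, here is a rough sketch of the all-positions-mutate version with a dial-in “stickiness” parameter. It is my own reconstruction of the idea described above, not the original program, and the parameter values are guesses:

    import random, string

    TARGET = "METHINKS IT IS LIKE A WEASEL"
    ALPHABET = string.ascii_uppercase + " "

    def offspring(parent, mut_rate, stickiness, rng):
        # Copy the parent; positions that already match the target are protected
        # ("latched") with probability `stickiness`, otherwise every position may mutate.
        child = []
        for p, t in zip(parent, TARGET):
            if p == t and rng.random() < stickiness:
                child.append(p)
            elif rng.random() < mut_rate:
                child.append(rng.choice(ALPHABET))
            else:
                child.append(p)
        return "".join(child)

    def weasel(pop_size=100, mut_rate=0.05, stickiness=0.0, seed=0):
        rng = random.Random(seed)
        parent = "".join(rng.choice(ALPHABET) for _ in TARGET)
        generation = 0
        while parent != TARGET:
            generation += 1
            kids = [offspring(parent, mut_rate, stickiness, rng) for _ in range(pop_size)]
            # selection: keep the child with the most positions matching the target
            parent = max(kids, key=lambda k: sum(a == b for a, b in zip(k, TARGET)))
        return generation

    print("no latching:  ", weasel(stickiness=0.0), "generations")
    print("soft latching:", weasel(stickiness=0.9), "generations")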
