2nd Law of Thermodynamics — an argument Creationists and ID Proponents should NOT use

ID proponents and creationists should not use the 2nd Law of Thermodynamics to support ID. Appropriate for Independence Day in the USA is my declaration of independence from, and disavowal of, 2nd Law arguments in support of ID and creation theory. Any student of statistical mechanics and thermodynamics will likely find Granville Sewell’s argument and similar arguments inconsistent with the textbook understanding of these subjects, and wrong on many levels. With regret for my dissent from my colleagues (like my colleague Granville Sewell) and friends in the ID and creationist communities, I offer this essay. I do so because to say nothing would be a disservice to the ID and creationist community of which I am a part.

I’ve said it before, and I’ll say it again: I don’t think Granville Sewell’s 2nd law arguments are correct. An author of the founding book of ID, Mystery of Life’s Origin, agrees with me:

“Strictly speaking, the earth is an open system, and thus the Second Law of Thermodynamics cannot be used to preclude a naturalistic origin of life.”

Walter Bradley
Thermodynamics and the Origin of Life

To begin, it must be noted that there are several versions of the 2nd Law. The versions are a consequence of the evolution and usage of theories of thermodynamics, from classical thermodynamics to modern statistical mechanics. Here are textbook definitions of the 2nd Law of Thermodynamics, starting with the more straightforward version, the “Clausius Postulate”:

No cyclic process is possible whose sole outcome is transfer of heat from a cooler body to a hotter body

and the more modern but equivalent “Kelvin-Planck Postulate”:

No cyclic process is possible whose sole outcome is extraction of heat from a single source maintained at constant temperature and its complete conversion into mechanical work

How then can such statements be distorted into defending Intelligent Design? I argue ID does not follow from these postulates and ID proponents and creationists do not serve their cause well by making appeals to the 2nd law.

I will give illustrations first from classical thermodynamics and then from the more modern versions of statistical thermodynamics.

The notion of “entropy” was inspired by the 2nd law. In classical thermodynamics, the notion of order wasn’t even mentioned as part of the definition of entropy. I also note, some physicists dislike the usage of the term “order” to describe entropy:

Let us dispense with at least one popular myth: “Entropy is disorder” is a common enough assertion, but commonality does not make it right. Entropy is not “disorder”, although the two can be related to one another. For a good lesson on the traps and pitfalls of trying to assert what entropy is, see Insight into entropy by Daniel F. Styer, American Journal of Physics 68(12): 1090-1096 (December 2000). Styer uses liquid crystals to illustrate examples of increased entropy accompanying increased “order”, quite impossible in the entropy is disorder worldview. And also keep in mind that “order” is a subjective term, and as such it is subject to the whims of interpretation. This too mitigates against the idea that entropy and “disorder” are always the same, a fact well illustrated by Canadian physicist Doug Craigen, in his online essay “Entropy, God and Evolution”.

What is Entropy? by Tim Thompson

From classical thermodynamics, consider the heating and cooling of a brick. If you heat the brick it gains entropy, and if you let it cool it loses entropy. Thus entropy can spontaneously be reduced in local objects even if entropy in the universe is increasing.

Consider the hot brick with a heat capacity of C. When the brick cools from its initial hot temperature TH to its final cold temperature TM, its change in entropy Delta-S is:

Delta-S = C ln( TM / TH )

Since the hot temperature TH is higher than the final cold temperature TM, the ratio TM/TH is less than one, its logarithm is negative, and Delta-S is NEGATIVE: a spontaneous reduction of entropy in the hot brick results!

The following weblink shows the rather simple calculation of how a cold brick, when put in contact with a hot brick, spontaneously reduces the entropy of the hot brick even though the joint entropy of the two bricks increases. See: Massachusetts Institute of Technology: Calculation of Entropy Change in Some Basic Processes
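The two-brick calculation can be sketched in a few lines of Python. This is a minimal sketch with illustrative heat capacities and temperatures of my own choosing, not the values from the MIT note; for two identical bricks the final common temperature is simply the average of the two starting temperatures:

```python
import math

# Illustrative values (assumptions): two identical bricks
C = 1000.0            # heat capacity of each brick, J/K (assumed constant)
TH = 500.0            # initial temperature of the hot brick, K
TC = 300.0            # initial temperature of the cold brick, K
TM = (TH + TC) / 2.0  # final common temperature for equal heat capacities

# Delta-S = C ln(T_final / T_initial) for a body of constant heat capacity
dS_hot = C * math.log(TM / TH)    # negative: the hot brick LOSES entropy
dS_cold = C * math.log(TM / TC)   # positive: the cold brick gains more
dS_total = dS_hot + dS_cold       # positive, as the 2nd law requires

print(f"hot brick:  {dS_hot:+.1f} J/K")
print(f"cold brick: {dS_cold:+.1f} J/K")
print(f"total:      {dS_total:+.1f} J/K")
```

With these numbers the hot brick’s entropy spontaneously drops by about 223 J/K while the joint entropy of the pair still rises by about 64 J/K, which is exactly the point of the MIT example.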

So it is true that even if universal entropy is increasing on average, local reductions of entropy spontaneously happen all the time.

Now one may argue that I have used only notions of thermal entropy, not the larger notion of entropy as defined by later advances in statistical mechanics and information theory. But even granting that, I’ve provided a counterexample to claims that entropy cannot spontaneously be reduced. Any 1st-semester student of thermodynamics will make the calculation I just made, and thus it ought to be obvious to him that nature is rich with examples of entropy being spontaneously reduced!

But to humor those who want a more statistical flavor to entropy rather than classical notions of entropy, I will provide examples. But first a little history. The discipline of classical thermodynamics was driven in part by the desire to understand the conversion of heat into mechanical work. Steam engines were quite the topic of interest….

Later, there was a desire to describe thermodynamics in terms of classical (Newtonian-Lagrangian-Hamiltonian) mechanics, whereby heat and entropy are merely statistical properties of large numbers of moving particles. Thus the goal was to demonstrate that thermodynamics was merely an extension of Newtonian mechanics on large sets of particles. This sort of worked when Josiah Gibbs published his landmark treatise Elementary Principles of Statistical Mechanics in 1902, but then it had to be amended in light of quantum mechanics.

The development of statistical mechanics led to the extension of entropy to include statistical properties of particles. This has possibly led to confusion over what entropy really means. Boltzmann tied the classical notions of entropy (in terms of heat and temperature) to the statistical properties of particles. The relation was formally stated by Planck for the first time, but the equation is nevertheless known as “Boltzmann’s entropy formula”:

S = k ln W

where “S” is the entropy, “k” is Boltzmann’s constant, and “W” (sometimes written as omega) is the number of microstates (a microstate is, roughly, a complete specification of the positions and momenta of all the particles in classical mechanics; its meaning is more nuanced in quantum mechanics). So one can see that the notion of “entropy” has evolved in the physics literature over time….
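To put concrete numbers on the formula, here is a toy system of my own (not from the text above): N independent two-state particles have W = 2^N microstates, so S = k ln W reduces to N k ln 2.

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K (exact in the 2019 SI)

def boltzmann_entropy(W: int) -> float:
    """Boltzmann's entropy formula: S = k ln W."""
    return k_B * math.log(W)

N = 100          # number of two-state particles (illustrative)
W = 2 ** N       # each particle doubles the microstate count
S = boltzmann_entropy(W)

print(S)  # equals N * k_B * ln 2, about 9.57e-22 J/K
```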

To give a flavor for why this extension of entropy is important, I’ll give an illustration of colored marbles that illustrates increase in the statistical notion of entropy even when no heat is involved (as in classical thermodynamics). Consider a box with a partition in the middle. On the left side are all blue marbles, on the right side are all red marbles. Now, in a sense one can clearly see the arrangement is highly ordered since marbles of the same color are segregated. Now suppose we remove the partition and shake the box up such that the red and blue marbles mix. The process has caused the “entropy” of the system to increase, and only with some difficulty can the original ordering be restored. Notice, we can do this little exercise with no reference to temperature and heat such as done in classical thermodynamics. It was for situations like this that the notion of entropy had to be extended to go beyond notions of heat and temperature. And in such cases, the term “thermodynamics” seems a little forced even though entropy is involved. No such problem exists if we simply generalize this to the larger notion of statistical mechanics which encompasses parts of classical thermodynamics.
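The jump in entropy when the partition is removed can be counted directly. A minimal sketch (the marble count N is my own illustrative choice): counting a microstate as a color pattern over the 2N positions, the segregated arrangement is a single pattern, while the mixed box allows “2N choose N” of them, so ln W rises on mixing.

```python
from math import comb, log

N = 10  # marbles of each color (illustrative)

# Microstates counted as distinct color patterns over the 2N positions
W_separated = 1           # all blue on the left, all red on the right
W_mixed = comb(2 * N, N)  # any N of the 2N positions may hold a blue marble

# Entropy in units of Boltzmann's constant: S/k = ln W
S_separated = log(W_separated)  # ln 1 = 0
S_mixed = log(W_mixed)

print(W_mixed)            # 184756 patterns for N = 10
print(round(S_mixed, 2))  # ln 184756, about 12.13
```

Note that no heat or temperature appears anywhere in this count, which is the point of the illustration.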

The marble illustration is analogous to the mixing of two distinguishable gases (like Carbon Dioxide and Nitrogen). As in the marble illustration, no heat is involved, but entropy increases. Though it is not necessary to go into the exact meaning of the equation, for the sake of completeness I post it here. Notice there is no heat term “Q” for this sort of entropy increase:

Delta-S_mix = -nR SUM_i ( x_i ln x_i )

where R is the gas constant, n the total number of moles, x_i the mole fraction of component i, and Delta-S_mix the change in entropy due to mixing.
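For concreteness, the mixing entropy can be evaluated for an equimolar mixture such as one mole of Carbon Dioxide and one mole of Nitrogen (the amounts here are illustrative; equal moles of two gases give Delta-S_mix = 2R ln 2):

```python
import math

R = 8.314  # gas constant, J/(mol*K)

def delta_S_mix(moles):
    """Entropy of mixing ideal gases: Delta-S = -n R sum(x_i ln x_i)."""
    n = sum(moles)
    return -n * R * sum((m / n) * math.log(m / n) for m in moles)

# One mole each of two distinguishable gases (e.g. CO2 and N2)
dS = delta_S_mix([1.0, 1.0])
print(round(dS, 2))  # 2 R ln 2, about 11.53 J/K
```

The result is positive, as it must be: mixing distinct gases always increases entropy, with no heat flow required.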

But here is an important question: can mixed gases, unlike mixed marbles, spontaneously separate into localized compartments? That is, if mixed red and blue marbles won’t spontaneously order themselves back into compartments of all blue and all red (and thus reduce entropy), why should we expect gases to do so? This would seem impossible for marbles (short of a computer or intelligent agent doing the sorting), but it is a piece of cake for nature even though there are zillions of gas particles mixed together. The solution is simple. If the mixed gases are cooled below -78.5 Celsius (the sublimation point of Carbon Dioxide at atmospheric pressure) but kept above -195.8 Celsius (the boiling point of Nitrogen), the Carbon Dioxide will condense out as dry ice but the Nitrogen will not, and thus the two species spontaneously separate: order spontaneously re-emerges and the entropy of the local system spontaneously decreases!

Conclusion: ID proponents and creationists should not use the 2nd Law to defend their claims. If ID-friendly dabblers in basic thermodynamics will raise such objections as I’ve raised, how much more will professionals in physics, chemistry, and information theory? If ID proponents and creationists want to argue that the 2nd Law supports their claims but have no background in these topics, I would strongly recommend further study of statistical mechanics and thermodynamics before they take sides on the issue. I think more scientific education will cast doubt on evolutionism, but I don’t think more education will make one think that 2nd Law arguments are good arguments in favor of ID and the creation hypothesis.

190 thoughts on “2nd Law of Thermodynamics — an argument Creationists and ID Proponents should NOT use”

  1. The hypothesis of magical causation of historical events is anti-scientific in the same sense that it would be anti-scientific for a detective to propose that a murder was committed by ghosts.

    That is not exactly correct because murders can be committed by humans.  We are aware of mechanisms which can do this so there is no need to resort to miraculous explanations.    

    In contrast, the emergence of Turing machines of life from dead chemicals has never been demonstrated experimentally, and there are a few non-ID scientists who believe it never will be. Even Dawkins thinks this is a unique event in the entire universe. Whether magic was the cause or not, it is still a unique event, but at some point unique events might be indistinguishable from miracles. Whether miracles are caused by God or purely physical agencies may be outside the realm of science, but asserting that an event is unique and not in accordance with observed mechanisms is a scientific claim.

    It would even be scientific to demonstrate the correlation and similarity of living systems to intelligently designed objects like computers and make statements about the probability of such things emerging spontaneously.  Physicist Yockey (Oppenheimer’s student) did just that.  He is widely viewed as a father of bioinformatics…..

    We can estimate what the mechanism had to do, and perhaps how quickly the mechanism had to put together the parts before they got destroyed by chemical and physical processes like hydrolysis, etc. We can estimate how much it would have to act like a computer manufacturer to achieve these ends. If such a mechanism looks like an intelligent designer, so what? You can assume the mechanism is mindless, isn’t God, but has a comparable tool set to make life. The emergence of life can still be fully “naturalistic” in principle even though it “accidentally” looks like the product of design or God.

    “a scientist may view design and its appeal to a designer as simply a fruitful device for understanding the world, not attaching any significance to questions like whether a theory of design is in some ultimate sense true or whether the designer actually exists.”  Bill Dembski 

    So one does not have to invoke magical powers as explanations for life, maybe just mechanisms that have the ability to make computer-like machines from dead chemicals and which only appear to behave like intelligent designers but in fact are not. I personally don’t believe that, but it doesn’t strike me as unscientific to take inventory of what the mechanisms of creating the first life had to achieve. If the necessary tool set of the mechanisms looks like an intelligent designer, so what? If it makes you feel better, you can assume it’s as mindless and mechanical as your computer and that the mechanism that made life only behaves like a conscious intelligence. That’s what Hoyle (an atheist/agnostic) did….

  2. stcordova,

    I’m sorry to hear that this post has/will negatively impact your friendships, but I do appreciate your contribution, especially your conduct in the comments section. The ID supporters I’ve encountered here would do well to learn from your approach.

  3. You remain blind to science history. All the physics and chemistry we know was once explained by animistic spirits.

    The problem with ID is not that it is wrong, neither right nor wrong. It exists in the realm of useless conjectures that cannot be investigated. 

    The naturalistic explanation for the origin of life could be wrong, but it gives rise to research. 

    That is the difference between science and anti-science. 

    What is a “dead” chemical? 

  4. Forensics as I understand it is a process of fitting known mechanisms to a set of observations. The theory of evolution works the same way – we observe the results, we identify (and test) the mechanisms, we make the theory consistent with all KNOWN mechanisms.

    Now, OOL’s mechanism, if unknown, does not allow forensic analysis. There’s no way to fit unknown mechanisms to a hypothetical event. We can ask if it’s possible to discover mechanisms that would make OOL possible, and we can search for the mechanisms. But to reject it altogether, in favor of magic, out of pure ignorance IS pure ignorance. 

  5. You yourself barely understand enough science and thermodynamics

    That’s true, I barely understand anything. Why do you think I’m here? I have access to you and Olegt, who were kind enough to give me needed information. :-)

    “you will not be able to hold out against your ‘cohorts’ over at UD.”

    But I’m not alone; you guys helped me, and as you can see, no one at UD was able to stand in the way of truth. See, we ARE able to cooperate on occasion, despite our mutual hostility to each other. Now WHY would I do this? It is a disservice to misteach physics and math. And ultimately it is a disservice to ID proponents and creationists to misteach math and science to them and their children.

    That is why I’m willing to admit when I don’t understand something in math and physics.  How could I live with myself knowing I passed on something sub par?

    What grates even more is the fact that ID/creationists, with their sensitive egos and lust for celebrity, want to throw their misconceptions and misrepresentations into the learning paths of young students who have the potential to push those frontiers of science in the future. The public school systems in the US are generally a poorly financed and poorly supported mess, with teachers becoming political scapegoats in nearly every major political election.

    ID/creationism, as you well know, contributes to this mess; and those illegitimate organizations like the DI, AiG, and the ICR are a big part of that mess in many states. Those organizations have not earned a place in the science curriculum, yet they try to bypass every quality control and peer review check to get directly to other people’s children. I consider that dishonest, verging on criminal.

    First off, I am not involved in the public school science teaching debates.  I may have had interest in the topic some years ago, so you may find me taking sides in the past.  But I have since chosen not to be involved.

    I have since lost interest, not because the issue isn’t important, but the high schools in the USA have a lot deeper problems than the ID issue.

    Students are flunking out of high school, and those that get through have awful reading, writing, math and science skills.  Evolution is only a fraction of what will be taught even to biology majors in college, but high school students barely have enough to get to college, so we have bigger problems than the ID issue.

    And to express my motivations: I AM favorable and enthusiastic toward the notion of teaching ID and creation theory in Religion and Philosophy classes in universities, and I want to help prepare and vet materials for such classes. These classes exist. One of the UD contributors, Rob Sheldon, taught such a class at the University of Alabama. It is a good thing for a Physics PhD to teach a philosophy class and give philosophy and religion majors some exposure to a scientist. Allen MacNeill taught a course at Cornell….

    With respect to your colleagues, here is my interaction with Eugenie Scott on the idea. She gave it her blessing:


    SALVADOR CORDOVA: “Do you oppose the offering of courses on Intelligent Design and/or Creationism in the Philosophy and Religion Departments of secular universities?”

    EUGENIE SCOTT: “No. They are quite appropriate for such courses. In general, in American universities, Religion departments offer scholarly analysis of religion, rather than devotional study, for which one would seek a seminary. Certainly the c/e controversy is a public controversy that bears studying as a public controversy (that’s why I wrote my book, after all!) Whether ID is a valid scientific or philosophical or theological approach can also be determined at the university level, and certainly is more appropriately determined there than by the local school board.

    I think ID is more likely to be taught in philosophy and religion departments than in science departments, because a course on ID as science would have to be pretty short. What do you say after you say, “we can detect design that is the result of intelligence?” Even assuming the concepts of IC and CSI are valid, what good are they? How do they help us understand the natural world, especially biological phenomena (which is the claim.) Pretty soon a college level course in ID would devolve into “evidence against evolution” (EAE), trotting out the tired examples most of which we first heard from Henry Morris decades ago. And that is a waste of time.”

    In her criticism of ID, she actually gave good pointers on what NOT to teach to make the course meaningful. Maybe we won’t achieve your desired aims of improving math and science substantially in high school, but maybe students of religion and philosophy, or students taking religion and philosophy courses in college, will get more exposure to good science through such a debate. A favorite topic I would be interested in exposing students to is the misconception of thermodynamics and how it is misused by creationists! Why would I do this? I hate seeing Christian children taught bad science. It will give them a disadvantage in the world, and the bright minds will not go far.

    Now, how in the world can a YECist child learn science? He will have to learn where mainstream thought disagrees with his religious notions. He might have the recourse that someday his unsupported notions might somehow find resolution, and then move on. In fact, science moves forward even though we have many unresolved and conflicting theories, and the exploration of the universe is a means of resolving those conflicts. If a student is willing to be honest with the facts, he will be willing to explore nature. If all his paradoxes and conflicts were resolved and his knowledge were perfect, there would be no need to explore nature. It is the very fact that he may want to resolve those conflicts that will lead him to explore nature more deeply, and the exploration of nature is the heart of science…..

    That is the path I, as a financial engineer, have taken up as a side hobby. I am sympathetic to YEC, but I can also articulate how mainstream physics and observational evidence are inconsistent with YECism, and I have pointed out that on evidential grounds it is almost without hope (but not, formally speaking, impossible). I think that is a fair statement. I articulated some of the problems myself here:


    YEC students need to know what they are up against if they want to challenge mainstream physics.  They might be more sympathetic to learn the issues if it comes from someone like me versus you! 

    They might be willing to try to understand mainstream cosmology and relativity and electromagnetism and chemistry if a YEC sympathizer is willing to be straight with them on the facts and challenges, but is not simultaneously hostile to their religious beliefs. If YEC can be overturned or strengthened by YECists learning more science, maybe that is less relevant than students learning science in the first place.

    In a venue like a religion course, where it is fair game to have a lively debate, students might be more willing to learn the issues versus having them crammed down their throat by someone openly hostile to their religious views. The question is how to structure a religion course where the instructor can be somewhat impartial in grading. It can be done, and as I have time you will see how I would lay out such a course.

    Allen MacNeill, a biology professor at Cornell, laid out an example, and it is on his model that I’m willing to build and augment. There are other models out there. I would probably want to emphasize use of computers, since students need access to materials and most philosophy and religion professors are usually not the best teachers of basic science. It would be more appropriate for students to learn relevant topics from specialists in the field, like mathematicians and physicists and chemists and biologists and geologists.

    Who would be qualified to write materials for the ID/Creation side?

    Evolutionary Biology:  Richard Sternberg

    Chemistry and Computer Science:  Donald Johnson

    Biochemistry:  Michael Behe

    Biology and Genetics:  John Sanford, Allen MacNeill

    Evolutionary Algorithms:  Robert Marks  

    Physics:  Any good secular professor, even one critical of YEC, in fact that would be better!

    Notice, I didn’t mention any ICR or AiG people. You of course will have your champions to list for the Evolution anti-YEC side. The anti-YECists will have an abundance of qualified people, but I think you will find it challenging to find OOL and evolutionary types who can challenge someone of Sternberg and Sanford’s caliber. But we won’t know until we allow such a debate to move forward, will we? A college religion and philosophy class or a college debate class would be an appropriate venue for such a debate.

  6. stcordova,

    I’m sorry to hear that this post has/will negatively impact your friendships, but I do appreciate your contribution, especially your conduct in the comments section. The ID supporters I’ve encountered here would do well to learn from your approach.

    Thank you for the kind words. 

  7. I don’t think ID advocates would like the results of a real curriculum devoted to philosophy of science. 

    I say this because I have observed for the last ten years that ID advocates don’t allow open discussion on their sites. Either comments are closed or opponents are banned.

    The greatest danger to religion is open discussion.

  8. The greatest danger to religion is open discussion.

    So then you would be open to a college venue where students can have access to criticisms and ideas from ALL sides of the issue, since you think the open flow of information will somehow destroy ID and creationist ideas. I obviously think the opposite, but we can agree an open debate is a good thing (and not some hour-long venue or blog, but a setting where essays and books can be studied).

    Because students have varying levels of scientific knowledge, it would be hard to administer such a course in the traditional fashion.  Science and Engineering majors will come to the table of such a venue with a different knowledge base than Theater and Music majors.   Worse, most religion professors would probably not feel comfortable discussing physics and chemistry or probability theory.

    But solutions are available in the modern computer era, where essays can be written and tests prepared by various authors and administered through computers. It will not be as good as traditional tests, since computer-administered tests will have multiple-choice answers, but this is still better than nothing. Furthermore, this is the testing method used by the Federal Aviation Administration to test pilots. They actually give the pilots a several-hundred-page book with all the test answers! Clearly, if they can pass such a test, they will at least have had to read the material. The wording of the tests does not have to measure agreement with an idea, but merely understanding of what the idea says, like: “Evolution is a theory which states…..”

    The purpose of such tests is to ensure the students have read the material.  It is more difficult to determine comprehension and understanding in this way, but this is a start.  The professor of course can supplement computerized tests with his own tests to test comprehension and understanding vs. pure reading, but the challenge is that it might fall out of his specialization.

    A very simple test can still go a long way. For example, list a bunch of theories and match them to the individual most recognized for articulating each theory. They will at least learn some basics.

    Ask the students what geological strata dinosaurs will be found in. This is basic rote-memory learning that is valuable.

    Ask the student which creationists use the 2nd Law to defend creationism, and which don’t.

    So a variety of learning modules are available to the student, appropriate to his interest and knowledge base. The instructor can then decide which modules are required reading (like, say, Eugenie Scott’s book, or Don Johnson’s or Illustra Media’s books), and then allow students the ability to study modules of their own choosing. You might be surprised, but Caroline Crocker (a ballet student before becoming a biologist) has an innovative way to use dance to illustrate protein synthesis. :-)

    Rote exam questions can be given just to test whether they’ve at least read or watched the subject matter in question.

    One will say this isn’t deep learning. OK, fine, but then how is this much different from what is learned by history majors? At least students have access to the debate, which is a better situation than they have now.

    One may argue, “students can just learn this on their own”. True, but students can learn lots of history on their own too; they don’t need to pay $300,000 to do so. But the college credit is given in recognition of their time spent studying. They can study the philosophy of Sartre, or they can at least learn a little about the scientific debate over origins, learn a little basic science on the way, and maybe gain a little more appreciation for science in the process…..

    How can so much info be crammed into one course? Why should it be only one course? If students want 3 or 4 semesters of this good stuff, more power to them!

  9. Salvador,

    Thank you for explaining your point of view.  It appears that your support of ID is based on an argument from personal incredulity.  I would suggest that, if you were to investigate the topics you mentioned about the evolution of four chambered hearts or echo location, you would find considerable peer reviewed work that answers your initial questions and provides a roadmap for further investigation.

    When you say “If evolutionary theory were as well established as electromagnetism . . . .” you demonstrate your lack of knowledge of the field.  Modern evolutionary theory is at least as well established as electromagnetism, by any reasonable measurement criteria.  Thousands of scientists across dozens of disciplines are actively increasing our knowledge daily, supporting and extending the theories that make up what we know of evolution.

    The only reason ID proponents and other creationists use political means to attack the teaching of evolution is because it conflicts with their religious beliefs.  There is no scientific argument.

  10. Salvador,

    Before ID could be taught in any classroom, it would need to be defined.  ID has no hypothesis, it explains no observations, and it makes no testable predictions.  Concepts like Behe’s irreducible complexity and Dembski’s CSI fall to pieces under the most cursory of examination.  Every other argument in favor of ID has recognizable antecedents in the creationist movement.

    As it stands now, ID is nothing more than an argument from incredulity, a movement made up of people standing on the sidelines of science trying to poke holes in modern evolutionary theory without the necessary background to understand it.  There is nothing to teach.  The only possible academic department for an ID class is political science.

  11. stcordova: So then you would be open to a college venue where students can have access to criticisms and ideas from ALL sides of the issue since you think open flow of information will somehow destroy ID and creationist ideas.

    That would not bother me, as long as ID is not being misrepresented as science.  Put it in a philosophy class, or a “controversial topic” class, but not in a science class.

    Thinking in terms of design is common.  There’s no point in trying to block it.  But let’s keep it honest and not pretend it is science.

  12. Cordova says: In her criticism of ID, she actually gave good pointers on what NOT to teach to make the course meaningful. Maybe we won’t achieve your desired aims of improving math and science substantially in high school, but maybe students of religion and philosophy, or students taking religion and philosophy courses in college, will get more exposure to good science through such a debate. A favorite topic I would be interested in exposing students to is the misconception of thermodynamics and how it is misused by creationists! Why would I do this? I hate seeing Christian children taught bad science. It will give them a disadvantage in the world, and the bright minds will not go far.

    Unfortunately, you and the other members of the ID/creationist community – especially YECs – are not qualified to teach such subjects. Every one of you has hundreds of serious misconceptions about all areas of science that you don’t even know about. For example you still haven’t passed that little concept test and worked through the implications; and that is just a tiny sample of concepts in thermodynamics and statistical mechanics alone.

    Then there is the rest of physics, chemistry, geology, and biology. Every one of these subjects has been mangled beyond recognition by ID/creationists. The process of mangling these concepts has been a formal program in the ID/creationist community for decades. And I suspect you are at least vaguely aware of the fact that this remains an ongoing program of bending and breaking basic science concepts in order to make them comport with sectarian dogma.

    So if you don’t think ID/creationist pseudoscience should be taught in science classes, why would you misinform philosophy students, seminary students, and other non-majors in science?

    You are supposed to know that the US Constitution guarantees that sectarians can worship as they please in their churches. You are even allowed to build the tenets of your beliefs on the pillars of pseudoscience. But that freedom also applies to other religions who don’t happen to agree with or like pseudoscience and who would never consider making it a part of their religious beliefs. And many of those other religions have no problem with any of modern science. Yet your religion tends to despise those other religions or any other Christians who also have no problems with science.

    That same Constitution also says that proselytizing sectarians such as yourself do not have the right to use the institutions of secular government to foist your sectarian dogmas onto others. You are fed and protected by a secular society and by people who don’t hold your beliefs. You have no entitlement to behave like ungrateful parasites constantly pouting about not being allowed to proselytize in the secular classrooms where others have to get the educations they need for their careers.

    I also happen to think the ID/creationism could be a topic in some type of college-level general education course about science and intellectual history in the West. However I would include it as a classic example of a pseudoscience; and I would be sure that students had access to the well-documented history of its sectarian-motivated socio/political tactics that attempt to foist this onto secular society while avoiding the vetting processes that occur in real science. ID/creationism shares most of the same characteristics one finds in other pseudosciences; so the comparisons are instructive.

    On the other hand, there are already some very good courses about science and intellectual history that include the influences of religion throughout history. There is nothing that ID/creationism can add to these courses except as simply another example of human reactions to the upsetting threats to their world views.

    But I’m not alone, you guys helped me, and as you can see, no one at UD was able to stand in the way of truth.

    That remains to be seen.

  13. The greatest danger to religion is open discussion.

    Indeed! The free exchange of ideas ought to be sacrosanct. I fully support anyone’s right to believe what seems best to them. But I vehemently oppose any attempt by religious groups to impose their view over the wider community. Religious claims are no different from advertising claims: bogus claims should be exposed, and people who make bogus religious claims should be as liable as bogus advertisers.

  14. The interesting thing about the design inference is that forensic sciences infer design because design has been observed, and we know the capabilities of the designers. This is true even of structures made by birds and insects.

    The ID movement is the only forensic “science” to forbid speculation about the actual history of artifacts and the characteristics and capabilities of their putative maker.

  15. That’s a really good point. Religion is selling its product in the same sense and using the same techniques as commercial advertisers. 

    Of course we do allow commercials in schools, particularly on television shows. So perhaps that’s not the best example.

    Teachers do, however, comment on the truthfulness of commercials and sometimes do a bit of teaching on critical thinking. 

  16. As fascinating as all these offshoot topics are …  Sal, did Granville Sewell ever clarify exactly what version of the Second Law is the one he thinks does contradict evolution?  In his reply to you at Uncommon Descent he said, as you quoted him:

    Obviously the origin and evolution of life do not violate the second law as stated in the early formulations you quote.

    OK, if we can set those aside, what “later formulations” of the Second Law does he think do the job? His “X-entropy” arguments have been torn into fairly small shreds in threads at Panda’s Thumb and here.

  17. Here is the answer to your question, Joe. I have highlighted the formulations of the second law Sewell finds to his liking. 

    Obviously the origin and evolution of life do not violate the second law as stated in the early formulations you quote, but there are many formulations of this law, some more general than others. For example, Kenneth Ford in “Classical and Modern Physics” writes “There are a variety of ways in which the second law of thermodynamics can be stated, and we have encountered two of them so far: (1) For an isolated system, the direction of spontaneous change is from an arrangement of lesser probability to an arrangement of greater probability. (2) For an isolated system, the direction of spontaneous change is from order to disorder.” The early formulations are just applications of this more general principle to thermal entropy. Even many adamant opponents of ID recognize that the second law can be applied much more generally than you apply it, for example Isaac Asimov, in the Smithsonian Magazine, wrote “we have to work hard to straighten a room, but left to itself, it becomes a mess again very quickly and very easily… How difficult to maintain houses and machinery, and our own bodies in perfect working order; how easy to let them deteriorate. In fact, all we have to do is nothing, and everything deteriorates, collapses, breaks down, wears out—all by itself—and that is what the second law is all about.”

  18. This may deserve its own thread.  Granville Sewell just posted this on UD:

    As you keep reminding us, Cornelius, most of the strongest arguments for evolution are of the form “God wouldn’t have done things this way”.

    Apparently we can add evolutionary theory to thermodynamics on the list of Things Granville Sewell Doesn’t Understand.

  19. Behe knows what a nasty bugger God is, what with His fascination with disease and parasites. Almost as varied as beetles.

  20. I wouldn’t be at all surprised if Sewell were taught as a child that addition is commutative, associative AND distributive because God chose to do things that way.

  21. I saw that comment at UD, and considered responding.  I decided that it wasn’t worth the effort.  But it was worth a laugh.

    Of course, Sewell does not understand evolution.  It seems to be quite impossible for people to understand evolution, if their religion requires that they not understand it.

  22. Thanks, Oleg.   I think I saw that and then forgot it.  He is pretty vague about how to compute this “order” and how it contradicts evolution.  I suppose it is the X-entropy arguments, but he’s pretty inexplicit.

    I suppose the first example I would use against the X-entropy stuff if asked is that elements actually interact in what is called “chemistry”.   And by the magic of chemistry, carbon from carbon dioxide is concentrated in photosynthesis into carbohydrate compounds, and ends up more concentrated than it started.   So much for X-entropy of carbon tending to increase.   (I know, the X-entropy arguments have already been shredded; this is just an attempt at being more dramatic).

    So he doesn’t have an argument that we can see.  If he has a proof, I suppose he’ll reveal it someday.

  23. “Heat Death of the Sun” was the title of a short story published in the Isaac Asimov Science Fiction Magazine. I’m nerdy enough to remember it. It was by a female writer and was about housekeeping.

    So in the hands of creationists, fictional metaphors — and humorous ones at that — become laws of nature. 

  24. That’s the author, and here’s the story.



    In Gibbs’ Universe order is least probable, chaos most probable. But while the Universe as a whole, if indeed there is a whole Universe, tends to run down, there are local enclaves whose direction seems opposed to that of the Universe at large and in which there is a limited and temporary tendency for organization to increase. Life finds its home in some of these enclaves.

    (50) Sarah Boyle imagines, in her mind’s eye, cleaning, and ordering the great world, even the Universe. Filling the great spaces of Space with a marvellous sweet smelling, deep cleansing foam. Deodorizing rank caves and volcanoes. Scrubbing rocks.

    (54) She begins to cry. She goes to the refrigerator and takes out a carton of eggs, white eggs, extra large. She throws them one by one onto the kitchen floor which is patterned with strawberries in squares. They break beautifully. There is a Secret Society of Dentists, all moustached, with Special Code and Magic Rings. She begins to cry. She takes up three bunny dishes and throws them against the refrigerator; they shatter, and then the floor is covered with shards, chunks of partial bunnies, an ear, an eye here, a paw; Stockton, California, Acton, California, Chico, California, Redding, California Glen Ellen, California, Cadix, California, Angels Camp, California, Half Moon Bay. The total ENTROPY of the Universe therefore is increasing, tending towards a maximum, corresponding to complete disorder of the particles in it.

    Apparently this story was rather famous and might have inspired Asimov to make his ill-advised comment, leading ultimately to the Sewell quote-mine. The story is from 1967. When was Asimov’s comment made?

  25. Apparently Asimov’s article was June 1970, so it could very well be that Sewell’s metaphor for entropy originated with this short story. If it didn’t originate there, it still might be the popularizer.

  26. I think what is more serious is that Granville Sewell doesn’t understand intelligence.  In fact, I don’t see many IDists showing much interest in intelligence, oddly.

    But he does seem to think that intelligence violates the 2LoT.

  27. I find it revealing that they don’t show any interest in the process of design and how it would be different from evolution.

  28. :-)

    Given the way you phrased it, your comment lends itself to a double interpretation.

    Over the years of watching the ID/creationist’s war on education, one can certainly get the impression that they think intelligent people must be violating something; and they don’t like it.

  29. Joe,

    Apologies for the long delay in responding.  I stopped visiting skeptical zone for  a few weeks because:

    1.  my confrontation with Granville was a sad day in the ID community; I needed a break from it

    2.  I wanted to learn more about statistical mechanics before I said any more.  It was clear from my interaction with the physicists here that I didn’t know much about the topic (yet as a matter of conscience I had to offer dissent from Granville over what seemed obvious problems with his thesis).

    I have studied more since my July 4th posting, but there is only so much one can learn in a few short weeks of incidental (not professional) study.

    I hope to work on Mike Elzinga’s test on thermodynamics and will post on it later, as it is a good exercise for anyone discussing these topics.

    Any way, with respect to your question:  “As fascinating as all these offshoot topics are … Sal, did Granville Sewell ever clarify exactly what version of the Second Law is the one he thinks does contradict evolution?”

    The only one I’m aware of is the one Oleg cited.  What is disconcerting is that for most definitions of entropy, I’ve yet to find “disorder” EVER used in upper-level texts to describe entropy.  Uncertainty or unpredictability are perhaps more appropriate terms, but there is a subtle but distinct difference between “disorder” and “uncertainty”.

    The book on Statistical Mechanics that I’m working from is:

    I certainly don’t understand much of what is in the book, and that is why I look forward to discussing and learning more about entropy.   
    I worked through examples of calculating entropy through:

    1. classical thermodynamics using heat capacity and temperature
    2. micro-canonical ensemble (like Mike Elzinga’s example)
    3. canonical ensemble and partition function
    4. grand canonical ensemble and grand partition function

    There was never a mention of disorder in these formulations. 
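    Item 2 above (the microcanonical count) can be made concrete in a few lines. This is a rough sketch of my own, not from the class text; the two-state setup and the function name are my choices:

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann's constant in J/K (exact SI value)

def microcanonical_entropy(N, n_excited):
    """S = k_B * ln(Omega) for N independent two-state units with
    n_excited units in the upper energy level.  Omega is a pure count
    of the microstates sharing that total energy; no notion of
    "disorder" enters the calculation anywhere."""
    omega = comb(N, n_excited)
    return k_B * log(omega)

# A single microstate when nothing is excited, so S = 0:
s_ground = microcanonical_entropy(100, 0)
# The count, and hence S, is largest at half-filling:
s_half = microcanonical_entropy(100, 50)
```

    The entropy here is literally the logarithm of a count; “uncertainty about which microstate” is a gloss on that count, not an extra ingredient.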

    Hence, I stand by my original claim: the 2nd law is not an argument against evolution, and neither are the main formulations of the 2nd law (such as the Clausius postulate or the Kelvin-Planck postulate), which lead to the standard notions of entropy as seen in textbooks like Pathria and Beale or Fermi and Zemansky.


    There was only one generalization of the 2nd law that I’m aware of and that is in cosmology:




    It surely has nothing to do with evolution, and that generalization wouldn’t help Granville’s argument, imho.        


  30. Uncertainty or unpredictability are perhaps more appropriate terms …


    First learn to count and then what to count. Counting and summing have nothing to do with order or uncertainty or unpredictability.

    Look at the links Patrick provided and follow the discussions.

  31. Uncertainty or unpredictability are perhaps more appropriate terms …


    For thermal entropy, the number of microstates consistent with a given energy implies that there is less certainty in knowing exactly which microstate the system is actually in.

    From wiki:

    The interpretation of entropy in statistical mechanics is the measure of uncertainty

    Obviously I agree with you about counting microstates (and I certainly could use practice doing that as you can see from my errors above).

  32. Don’t try to quote mine every possible source that just might possibly agree with what you want entropy to be about. You will never understand the concept if you never learn how to calculate it.

    There are only a limited number of ways to calculate entropy and none of them has anything to do with disorder, uncertainty or unpredictability.  There is nothing in any of the mathematical formulas that tell you anything about disorder, uncertainty, or unpredictability.  You will find that out if you learn to calculate entropy and then try to ask WHAT is “disordered” or “uncertain” or “unpredictable.”  It’s not what you apparently think it is.

    That is why there are these little concept tests that instructors offer students to encourage thought and discussion.  You aren’t going to learn what entropy is by scouring the internet and studying philosophy.  There is too much useless crap to sort through.

    Look at the links Patrick provided; forget about Wikipedia.  I suspect the textbooks are beyond your ability at the moment.  What you need to learn is far more basic.

    Sit down and think through that concept test.  The answers are in those links.  The two-state system is important, especially when it is compared with the Einstein oscillator model of a solid. Learn the definition of temperature also.
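    The Einstein-oscillator exercise suggested here might be sketched roughly like this. This is my own construction, not the actual concept test; it assumes the standard Einstein-solid multiplicity Ω = C(q + N − 1, q) and estimates temperature from a finite difference of the entropy:

```python
from math import comb, log

def omega_einstein(N, q):
    """Multiplicity of an Einstein solid: N oscillators sharing q
    identical energy quanta, Omega = C(q + N - 1, q)."""
    return comb(q + N - 1, q)

def entropy(N, q):
    """S = ln(Omega) in natural units (k_B = 1)."""
    return log(omega_einstein(N, q))

def temperature(N, q, eps=1.0):
    """Finite-difference estimate of T from 1/T = dS/dE at fixed N:
    add one quantum of energy eps and see how much S changes."""
    dS = entropy(N, q + 1) - entropy(N, q)
    return eps / dS

# Adding energy raises the temperature, as it must:
t_cold = temperature(50, 10)
t_hot = temperature(50, 100)
```

    Tabulating S and T against q, and graphing them, is exactly the kind of exercise being recommended.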

  33. OK Mike, we’ll never agree on some things, but I will do as you suggest.  But fwiw, that was not a quote mine.  The “uncertainty function” was what von Neumann called the quantity in Boltzmann’s H-theorem.  Shannon incorporated it into his definition of information.

    From Wiki on H-theorem:

    H is a forerunner of Shannon’s information entropy. The article on Shannon’s information entropy contains a good explanation of the discrete counterpart of the quantity H, known as the information entropy or information uncertainty

    and again from Wiki on entropy:

    The most general interpretation of entropy is as a measure of our uncertainty about a system. The equilibrium state of a system maximizes the entropy because we have lost all information about the initial conditions except for the conserved variables; maximizing the entropy maximizes our ignorance about the details of the system.[24] This uncertainty is not of the everyday subjective kind, but rather the uncertainty inherent to the experimental method and interpretative model.

    That was hardly a quote mine.   

    I’ll review Patrick’s links.  I had not reviewed them till now because I was busy with Pathria and Beale.

  34. When you copy/paste stuff you don’t have the background to understand, interpret, or vet in order to support an argument you are making, it is essentially quote mining.

    You are not yet at a level of basic understanding where you can understand how these words are being used or misused. You certainly have no feeling for the history of these concepts and where those other forms of “entropy” came from and why.  So don’t do it; break the habit.  You can’t fool an expert; and I don’t play those kinds of games.  You either know it or you don’t.

    Work through the concept test as it is posted; but instead of doing the test for just the few steps asked for, do it for all steps from zero energy up to fully populated states.  It is not hard if you have a scientific calculator, a graphing calculator, or Excel.  If you have things like Maple, Mathematica, or MathCad, you can use these as well.  Calculate the temperatures as well.

    Make some graphs of your results.  When you get this right, then we can discuss it. (Hint: there are answers in those links; think them through.).

  35. concept test amendment 1




    Not sure I understand what this means. What’s “thermal entropy” and what’s
    “total entropy”? Can you give an example where the two notions are distinct?

    The material science book by Gaskill makes this distinction. I provided a link to the chapters available online in his book. He teaches at Purdue. If the terminology is erroneous, I would welcome a correction, but that seems to be a common engineering convention in material science. Walter Bradley (also a materials scientist like Gaskill) uses exactly the same convention that Gaskill uses.

    Here is my understanding, and please correct me if this is wrong. I have serious issue with Mike’s statement:

    the entropy of a system has nothing to do with its spatial configuration

    whereas you said (and which I fully agree):

    So counting microstates is done with consideration of the particle’s physical location. By definition. There is no point in pinning Mike on this. This is one of the basic points people learn at the beginning of a stat mech course.

    My own thinking to explain Gaskill’s distinction and reconcile entropy being defined by spatial configuration follows below.

    From the classical approximation (which is corrected by quantum mechanics), there is a phase space, as you said, defined by momentum and position.

    In classical mechanics, the state of a particle is defined by position and momentum, whereas in quantum mechanics it is different. However, Gibbs was able to make a good approximation using classical mechanics. I’ll start with the classical approximation before going into the quantum mechanical description, since that helps give a conceptual picture. Even your pendulum example takes advantage of the classical definition of phase space.

    For a single particle the description of the microstate is given by position and momentum. But because in classical mechanics the position is continuously defined rather than quantized, Gibbs sliced up the position by defining it in terms of slices or small bins. The slices had length of some arbitrary value (let us call it h, but not the h of Planck’s constant in this case) and the bins had volume of h^3. Thus it became possible to count a finite number of microstates.

    So if we have a particle in a volume, the number of small bins the particle can be found in is:

    \Omega_{Conf_{1}} = \frac{V}{h^3}

    where the subscript Conf1 is indicating this pertains to the spatial configurational component of the microstate for particle 1. For N particles:

    \Omega_{Conf} = \Omega_{Conf_{1}} \Omega_{Conf_{2}} \cdots\ \Omega_{Conf_{N}} =  \left[\frac {V} {h^3} \right]^N

    Gibbs had to use a similar trick to put momentum in bins so that the number of microstates would be finite.

    Because energy can be defined independently of position, the total number of microstates is the number of possible spatial configurations multiplied by the number of energy configurations; symbolically it can be expressed as:

    \Omega = \Omega_{Th}\Omega_{Conf}

    where the subscript Th relates to the energy aspect of the microstate.

    Using Boltzmann’s formula for the total entropy S:

    S = k_{B} \ln \Omega = k_{B}\ln \left[\Omega_{Th} \Omega_{Conf} \right] = k_{B}\ln \Omega_{Th} + k_{B} \ln \Omega_{Conf} =  S_{Th}+S_{Conf}

    However noting:
    \frac{\partial S_{Conf}}{\partial E} = \frac{\partial}{\partial E} \left[ k_{B} \ln \Omega_{Conf} \right] = \frac{\partial}{\partial E} \left [ k_{B} \ln \left[\frac {V} {h^3} \right]^N \right]= 0

    We are able to recover the notions of classical thermodynamics being about the distribution of energy among microstates without consideration to spatial configuration. For example:

    \frac{1}{T} = ( \frac{\partial S}{\partial E} )_{N,V} = (\frac{\partial S_{Th}}{\partial E}+\frac{\partial S_{Conf}}{\partial E})_{N,V}=(\frac{\partial S_{Th}}{\partial E}+0)_{N,V}=(\frac{\partial S_{Th}}{\partial E})_{N,V}

    But even if my derivation is wrong, fundamentally you are right and Mike is wrong regarding his erroneous claim:

    Entropy is about ENERGY microstates. If it’s not about energy, it’s not about entropy.

    I accept everything you say about me, that my understanding is tenuous, that what I may state is gibberish, that I don’t know anything about physics, etc. I respect and accept your judgement on these matters. However, I fully agree with you that what Mike said is wrong. You are right, and he is wrong.

    Finally, I provided the classical approximation as distilled from Gibbs’s ideas, but since the energy aspect in the counting of the microstates is separable from the spatial aspects of the counting of microstates, it stands to reason we can simply resort to counting the ways that energy is distributed in a system and drop the spatial aspects if what we are after is related only to energy. Such would be the case when making derivations related to temperature.

    Regarding quantum mechanics, since the Hamiltonian operator will extract the energy values from the quantum states of the system, we simply count the microstates with no reference to spatial considerations if energy is the conserved quantity of interest, and thus this will amend the errors introduced by the classical approximation. Not only do quantum statistics give the correct results (compared to the classical picture), it is sometimes easier to do the counting since the finite number of microstates comes directly from the nature of quantum mechanics, and hence Gibbs’s tricks to force-fit classical mechanics to thermodynamics are no longer needed.

    Thank you very much for your detailed reply and corrections of my misunderstandings. It was very generous of you to offer so much time to this topic.

    For the record, the professor who tried to teach me this over the summer did not use the distinction of Thermal and Configurational entropy, but since the term has popped up in relation to Walter Bradley’s work (and that of other materials scientists like Gaskill) I thought I would explore it. Also for the record, I was not taught entropy in terms of order and disorder, but simply in terms of the logarithm of microstates.

    The professor reviewed the Liouville theorem and it was in the class text (Pathria and Beale chapter 2). My misunderstandings in this discussion are surely not his fault, or that of the class text, but the class is for engineering students with limited physics background.

    I was the only one with some background in Classical Mechanics, General Relativity, and QM. The class was 12 weeks, and most (Electrical and Computer Engineers) started with no knowledge of classical thermodynamics. The final lecture was on the Bekenstein bound, Hawking radiation, and the Tolman-Oppenheimer model, but he decided not to put that on our take-home final since it was obviously above our heads. He just skimmed over the topics for fun.

    I’m a financier, no longer a practicing engineer, and certainly not a physicist. So thank you very much for taking time to fill the holes in my knowledge; it will help me post on this topic in the future. As you can see, with regard to Granville Sewell, I tried to be faithful to what I learned in class. And I posted my dissent to Sewell at the risk of hurting my standing among creationists. But that’s OK, I’m for learning and passing on accurate understanding of mainstream literature. Even if I may hold out some disagreement on some matters, it’s no excuse for not understanding what is in the mainstream.

    There is no separate Ω configurational and Ω “thermal” (momentum-space, I presume) in this case. Neither is an invariant measure on its own. So entropy of an ideal gas S turns out to be

     S = N ln{V} + (3N/2) ln{E}

    Agreed and thank you for the correction. The way I compartmentalized in my post above was inappropriate, and I had only a guess at how Gaskill arrived at his derivation.

    It would appear that Gaskill (and other material scientists) separate the portions of entropy equations that have a dependence on momentum (or energy) and call this portion “thermal” entropy, and the rest “configurational” entropy.

    Adapting from Pathria and Beale for a single particle with 3 degrees of freedom

    \Omega(N=1) = \Sigma (E)  \approx \frac{V}{h^3} \frac{4 \pi }{3}  (2mE)^{3/2}

    I would presume (perhaps incorrectly?) that for N particles of the same species:

    \Omega(N)   \approx \left[\frac{V}{h^3} \frac{4 \pi }{3}  (2mE)^{\frac{3}{2}}\right]^N = \left[\frac{V}{h^3}\right]^N \left[ \frac{4 \pi }{3}\right]^N \left[ (2mE)^{3/2}\right]^N = V^N E^{\frac{3N}{2}} \left[\frac{4 \pi (2m)^{\frac{3}{2}}}{h^3}\right]^N

    Anyway, this would reduce to

    S=k_{b} \ln \Omega(N) = k_{b} (3N/2) \ln E +  k_{b}  N \ln V + k_{b}  N \ln \frac{(2m)^{3/2} 4 \pi}{h^3}
     = k_{b} (3N/2) \ln E +  k_{b} N \ln V  + C

    which agrees with your result for an ideal gas if Boltzmann’s constant is stated in natural units of kb = 1.
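    As a sanity check (my own aside, not something Gaskill or Pathria and Beale state here): differentiating that formula with respect to E recovers the familiar equipartition result for a monatomic ideal gas:

```latex
\frac{1}{T} = \left(\frac{\partial S}{\partial E}\right)_{N,V} = \frac{3Nk_{b}}{2E}
\quad\Longrightarrow\quad
E = \frac{3}{2}N k_{b} T
```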

    It would appear Gaskill, for this case, would simply say:

    S(N)_{th} =  k_{b} (3N/2) \ln E


    S(N)_{Conf} =  k_{b} N \ln V + C

    but this is only a guess on my part, based on what I read. If so, I’d have to amend the derivation above as:
    \frac{\partial S_{Conf}}{\partial E} =  \frac{\partial}{\partial E} \left [ k_{B} \ln \left[V\right]^N + C \right]= 0

    but that is a guess.

    That engineering convention of

      S_{total}=S_{thermal} + S_{configurational}

    seems to cause indigestion among physicists, and for the record, that is not how I was taught it, but the convention does appear in some literature even in Universities. I do not know, off hand, if such separability is always possible in principle.

    The notion is stated in Wiki as:

    In statistical mechanics, configuration entropy is the portion of a system’s entropy that is related to the position of its constituent particles rather than to their velocity or momentum. It is physically related to the number of ways of arranging all the particles of the system while maintaining some overall set of specified system properties, such as energy.

    But the term does not appear in Pathria and Beale, and the professor never used these terms (most of his lecture material was from Fermi and Zemansky).

    Anyway, using the formula:
    S = k_{b} (3N/2) \ln E +  k_{b} N \ln V  + C

    in the context of the previous discussion of two species of gases (1 and 2) in thermal equilibrium, initially separated by a barrier:

    S_{initial} = k_{b} (3N_{1}/2) \ln E_{1} +  k_{b} N_{1} \ln V_{1}  + C_{1}  + k_{b} (3N_{2}/2) \ln E_{2} + k_{b} N_{2} \ln V_{2}  + C_{2}
    =    k_{b} N_{1} \ln V_{1}+ k_{b} N_{2} \ln V_{2}  + k_{b} (3N_{1}/2) \ln E_{1}+k_{b} (3N_{2}/2) \ln E_{2}   + C

    After mixing
    S_{final} = k_{b} N_{1} \ln( V_{1}+ V_{2})+ k_{b} N_{2} \ln (V_{1}+V_{2})  + k_{b} (3N_{1}/2) \ln E_{1}+k_{b} (3N_{2}/2) \ln E_{2}   + C

    Thus for an adiabatic isothermal process for the mixing of two ideal gases of different species:

    \Delta S= S_{final}- S_{initial} = k_{b}\left[N_{1}\ln \frac{V_{1}+V_{2}}{V_{1}} +N_{2}\ln\frac{V_{1}+V_{2}}{V_{2}}\right]

    the formula I provided above, and hence entropy changes with changes in spatial configuration.
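    To make that concrete, here is a small numerical sketch of the mixing formula (my own illustration; the function name is invented):

```python
from math import log

k_B = 1.380649e-23  # Boltzmann's constant in J/K

def mixing_entropy(N1, V1, N2, V2):
    """Delta S for mixing two different ideal-gas species at the same
    temperature.  The energy ("thermal") terms cancel between the
    initial and final states, leaving only the volume-dependent
    (configurational) terms."""
    V = V1 + V2
    return k_B * (N1 * log(V / V1) + N2 * log(V / V2))

# Equal particle numbers and equal volumes: Delta S = 2 N k_B ln 2 > 0
dS = mixing_entropy(1e23, 1.0, 1e23, 1.0)
```

    The positive result depends only on the volumes, which is exactly the point about spatial configuration.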

    But as you said, I’m tilting at windmills with this example, since the number of microstates available to a single particle depends on the phase space of the particle which depends on the physical space available to the particle.

    I’m not at all saying that entropy is about order or disorder, but I have consternation about saying entropy is only about energy with no consideration for spatial configuration.

    [below is my understanding so far, corrections are welcome]
    My understanding of entropy is that it is the logarithm of microstates, or as Olegt put it:

    Entropy is merely the logarithm of the phase-space “volume” available to a physical system.

    Symbolically, Boltzmann and Planck described entropy in the following manner through statistical mechanics:

    S = k_{b} \ln \Omega

    where S is the entropy, \Omega is the number of microstates, and k_{b} is Boltzmann’s constant.

    By way of extension, the change in entropy can be related to the change in the number of microstates:

    \Delta S = S_{final}-S_{initial}=k_{b} \ln \Omega_{final} - k_{b} \ln \Omega_{initial}

    Perhaps one thing to note is that the number of microstates increases with the number of particles involved (all other factors being held equal).

    Contrasting with the description of entropy in statistical mechanics, we have the description of entropy in classical thermodynamics (pioneered by individuals like Joule and Clausius):

    \Delta S=\int \! \frac{\delta Q}{T}

    where T is temperature and Q is heat.

    My understanding is that the breakthrough was tying classical thermodynamics (as pioneered by individuals like Clausius) to statistical mechanics (pioneered by individuals like Boltzmann and Gibbs). Symbolically:

    \Delta S=\int_{initial}^{final} \! \frac{\delta Q}{T} = k_{b} \ln \Omega_{final}- k_{b} \ln \Omega_{initial}

    in the appropriate domains. There are some examples where this equality will not hold, such as the entropy of mixing, but that the two notions could ever be related was remarkable. Some have likened this linking of thermodynamics to mechanics to a breakthrough comparable to Einstein relating energy to mass and the speed of light:

    E = mc^2
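    A small numerical illustration of the equality (my own sketch, in natural units with k_{b} = 1): for a reversible isothermal doubling of the volume of an ideal gas, the Clausius integral and the microstate count give the same ΔS = N ln 2.

```python
from math import log

def delta_S_statistical(N, V_i, V_f):
    """Counting side: Omega scales as V^N for an ideal gas, so
    Delta S = ln(Omega_f) - ln(Omega_i) = N * ln(V_f / V_i)."""
    return N * log(V_f / V_i)

def delta_S_clausius(N, V_i, V_f, steps=100_000):
    """Classical side: along a reversible isothermal path,
    dQ = P dV = N k_B T dV / V, so dS = dQ/T = N dV / V (k_B = 1).
    Integrate numerically with the midpoint rule."""
    dV = (V_f - V_i) / steps
    S, V = 0.0, V_i + 0.5 * dV
    for _ in range(steps):
        S += N * dV / V
        V += dV
    return S

# Doubling the volume of N particles: both routes give N ln 2.
dS_count = delta_S_statistical(1000, 1.0, 2.0)
dS_heat = delta_S_clausius(1000, 1.0, 2.0)
```

    The agreement of the two computations is exactly the bridge between Clausius and Boltzmann described above.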

    To help dispel the notion that increasing entropy implies increasing disorder, let us take familiar examples:

    1. An embryo developing into an adult. As the individual acquires more weight (mass) it acquires more entropy, since the number of microstates available to the individual increases with the number of particles in the individual, and as the number of his microstates increases so does his entropy. So as the individual transforms dead nutrients into more biologically “ordered” molecules, his entropy increases! In fact, trying to reduce his entropy may kill him (via starvation).

    2. factories which make computers and automobiles and airplanes. As energy is put into the manufacture of these “ordered” systems, entropy of the universe of necessity goes up.

    Examples of order increasing as entropy increases abound. Increase in entropy, in many cases, is a necessary but not sufficient condition to create “order” (however one wishes to define the subjective notion of “order”).

    In sum, it could be argued that the existence and increase of biological order are correlated with increases in universal entropy, and even local entropy; hence the 2nd law of thermodynamics cannot be used as an argument against evolution, because clearly increases in order are often accompanied by increases in entropy.

    Thanks to Mike Elzinga for the increasing-weight example.

  36. It’s a little easier on this site to just use common math symbols in HTML. Enter your equations in the HTML edit box.

    It means a little extra typing, but at least everybody can read it. You shouldn’t need to get into much fancy math anyway. Entropy is not that difficult.

    Elizabeth tried to get the math editing to work, but it ended up making it hard to get the other tools to work for everybody’s browser. We fell back to this.

    By the way, you may find the HTML edit box has only a very narrow line at the lower right hand corner of the box.  It is easy to miss.  Clicking on that line puts the HTML stuff in the reply box.  You may then have to reformat the paragraphs depending on your browser.

  37. OK Mike,

    I slapped something together.  

    I earlier said something wrong: I said your example was a microcanonical example. That is incorrect; it is simple counting.

    When I posted this thread on July 4  I had only been studying statistical mechanics for two weeks.  I’ve had a few more weeks to get a little more study of the topic.

    Here is a link to my MS Word 97-2003 file with the equations in MathType. I was extremely verbose so as to help you pick out any flaws in my reasoning, and this is also material I’d like to share with others in the creationist community after I clean up the document.

    I provided a graph as you suggested. Let me know if you are unable to access the MS Word file. I look forward to your corrections, and thank you for taking time to teach me.


  38. Compare your results with this. See also the dialog that follows that comment.

    Plot the temperature versus energy for all values in this concept test.

    Also, take a look at this and this as well.

    Yes, entropy is about counting ENERGY microstates, nothing more; but entropy is extremely useful in thermodynamics and statistical mechanics. But counting is not as easy as it looks. One also has to take into account the nature of the particles that occupy those ENERGY states. That eventually gets us into Fermi-Dirac and Bose-Einstein statistics in addition to the classical Boltzmann distribution. But none of that will make any sense unless you have it down pat what it is you are counting without conflating it with anything else.

    We can discuss those conflations after you have absorbed – and I mean REALLY absorbed – the points of these simple examples.

    After you have had a chance to mull over those other examples in the links above, I will want to find out what you are discovering.

    Thank you for taking time to respond. I got everything correct (thanks in part to Oleg) but had to make a minor amendment. I said the temperature is infinity with 8 excited particles, whereas it is more subtle than that: it is plus or minus infinity, depending on the side from which the point is approached on the T(E) vs. E graph.

    I wrote some amendments to my original and included the temperature graph you suggested, with an updated table that includes the temperatures derived using one-sided partial derivatives. I also cited an experiment that I think is relevant:


    After you have had a chance to mull over those other examples in the links above, I will want to find out what you are discovering.

    I’ll save a response to that for a little later.  Thanks for reviewing my documents.  Please let me know if I misstated something, and thanks for the information and corrections.   


  40. Energy is the independent variable and entropy is the dependent variable in 1/T = ∂S/∂E.

    When N is large, unit steps in energy packets are infinitesimal relative to N. Then the discrete derivative is simply

    ΔS/ΔE = S(N, E+1) – S(N, E),

    where E is the number of unit packets of energy (ΔE = 1).

    For the two-state case, E = 0, 1, … N-1.

    The convention of making temperature increase with energy is a historical accident. Measuring “degrees of heat” goes back to at least Galileo who used a column of expanding wine as a “thermoscope.”

    The definitions of a temperature scale and the unit of heat (energy) were defined independently of each other. Hence the size of the Boltzmann constant is an accident of historical convention (it also depends on the Avogadro/Loschmidt number). We could just as easily set it equal to 1 and choose other units for temperature and energy. Entropy could also be expressed in units of the number of internal degrees of freedom, or just a dimensionless number.

    There is no reason – other than the “practical” convenience of having temperature increasing with “hotness” – that the “temperature” couldn’t be in units of β = 1/kT. In fact, such a definition of temperature would be continuous in the case of the two-state system as well as in the case of the Einstein oscillator system.
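    The discrete derivative above is easy to evaluate numerically. Here is a minimal sketch of my own, in the k = 1 units suggested in the comment, assuming the two-state counting Ω = C(N, E):

    ```python
    import math

    def S(N, E):
        # two-state system: E excited particles out of N, Omega = C(N, E)
        # entropy in units where Boltzmann's constant k = 1
        return math.log(math.comb(N, E))

    def beta(N, E):
        # beta = 1/kT approximated by the unit-step difference (Delta E = 1)
        return S(N, E + 1) - S(N, E)

    N = 100
    # beta is positive below half filling and negative above it,
    # i.e. the "negative temperature" regime of a population inversion
    print(beta(N, 10))  # positive
    print(beta(N, 90))  # negative
    ```

    Note that β itself passes smoothly through zero near E = N/2, which is exactly the point made above about β being a more continuous measure than T.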

  41. May I add my appreciation to the attempt here and at UD to upgrade the honesty level of the discussion.

    I’ve been at this for many years — at least 14 on the internet — and have no interest in silencing anyone. Nor do I have much hope of convincing anyone on the other side.

    But over the years I think the debate has become more technical and — to me — more interesting and educational. I enjoy working through the puzzles presented by the more informed part of the ID contingent. Some of them are not trivial.

    What I dislike is having discussions terminated by censors or crowded out by know-nothing creationists. But I would rather have prolix creationists than censorship.

  42. If you wonder why I like posting here take a look at the treatment I’m getting at UD for criticizing a creationist and convicted felon, Kent Hovind:


    And it’s not like I totally trashed my side either in my discussion, but the creationists took serious exception to me posting Thunderf00t’s reasoned criticism of Hovind.

    Though I have heated disagreements with many here, at least we can talk about science.

  43. Energy is the independent variable and entropy is the dependent variable in 1/T = ∂S/∂E.

    I actually didn’t know that.  When doing derivations, it seemed it didn’t really matter.  I’ve seen in some texts:

    T = ∂E/∂S.

    But I suppose this has to be done with some qualification.

    The convention of making temperature increase with energy is a historical accident. Measuring “degrees of heat” goes back to at least Galileo who used a column of expanding wine as a “thermoscope.”

    The definitions of a temperature scale and the unit of heat (energy) were defined independently of each other. Hence the size of the Boltzmann constant is an accident of historical convention (it also depends on the Avogadro/Lochschmidt number). We could just as easily set it equal to 1 and choose other units for temperature and energy. Entropy could also be expressed in units of the number of internal degrees of freedom, or just a dimensionless number.

    Again, many facts I was completely unaware of. Thank you. I will have more responses forthcoming….

  44. After you have had a chance to mull over those other examples in the links above, I will want to find out what you are discovering.

    Per your request, I will first respond to the parts that I agree with, and will later post where I disagree separately.

    So you can’t conclude that bathing things in energy “makes things worse.”

    (2) Entropy can increase from zero with energy input, go through a maximum, and then decrease again to zero as total energy continues to increase. And as energy is drained from the system, entropy can increase from zero, go through a maximum, and then decrease back to zero.
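    Point (2) is easy to verify numerically for the two-state system discussed earlier in this thread (a quick sketch of my own, in units where k = 1):

    ```python
    import math

    def S(N, E):
        # two-state system: entropy (in units of k) = ln C(N, E)
        return math.log(math.comb(N, E))

    N = 10
    curve = [S(N, E) for E in range(N + 1)]
    # entropy is zero at both extremes of total energy
    # (only one microstate: all ground, or all excited)
    print(curve[0], curve[-1])
    # and it passes through a maximum at half filling
    print(max(curve) == curve[N // 2])
    ```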


    (3) Entropy has nothing to do with everything coming all apart and “falling into decay” or into “simpler forms.”

    If we are talking thermal entropy, yes; but for other forms not covered by thermodynamics, maybe not. More on that later….

    (4) The entropy can change within any system only if the individual constituents of the system can exchange energy with each other. If they could not, then the system would stay in whatever microstate it is in, and there would be only one microstate (entropy zero).


    But such a system cannot “communicate” with the outside world either. And we wouldn’t know what particular microstate it is in (chew on that one, “information wags”).

    No opinion.

    Such a system would be isolated, but the entropy could still be stuck at zero. It is difficult to construct such a system, but they can be closely approximated in the lab.

    We would not be able to do this exercise of n things taken p at a time if it were not possible to have various combinations of atoms containing the same total energy; i.e., if the atoms couldn’t exchange energy with each other.

    No opinion.

    (5) This system is representative of the “population inversions” necessary to produce lasing in a gas laser (such as a HeNe or a CO2 laser for example). It can also apply to “spin systems” of atoms with a nuclear magnetic dipole moment immersed in a magnetic field.

    I was unaware of the laser example. Thank you. I was more familiar with the magnetic dipole example (i.e., Purcell and Pound 1951).

    (6) ID/creationists know absolutely nothing about entropy.

    Not any more. Thanks to this exchange, one creationist knows a little something. :-)

    (7) None of the ID/creationists understand the concept of temperature, whether it be the empirical temperature or the proper statistical mechanics notions behind temperature.

    I don’t really understand temperature except in terms of experience and the kinds of derivations I did in my responses. I understand intuitively the rote kinds of temperature such as with a hot brick, but the more exotic concepts like negative infinity temperatures…that’s a little hard to conceptualize outside of the formalisms I posted. I can state the formalism, but the intuitive, everyday notions of temperature don’t seem to apply when dealing with things like Purcell and Pound 1951.

    (8) None of the ID/creationists understand the connections between temperature and entropy or why the entropy of a system has nothing to do with its spatial configuration or “order/disorder”.

    Hopefully NONE is now at least ONE thanks to this discussion.

    I’ll post my disagreement later. I agree if we are talking thermal entropy, however.

    (9) None of the ID/creationists understand that entropy has nothing to do with the place an organism occupies on an evolutionary scale.

    Agreed, if we are talking thermal entropy. I disagreed with Sewell over the applicability of the 2nd law to evolution. Sewell, under duress, had to agree that the Clausius Postulate doesn’t invalidate evolution.

    For example, no ID/creationist understands that a juvenile animal, with half the dimensions of an adult, contains one-eighth the volume and, therefore, one-eighth the entropy. By ID/creationist “logic,” adult animals “have less information” or are “less advanced” than are their juvenile offspring.

    Thanks for reminding me.  S increases with N (and decreases as T decreases). 

    (10) In particular, Sewell’s “paper” is meaningless; he doesn’t know how to calculate entropy or what it is. And we know exactly why he would never consider submitting his “paper” to Physical Review Letters; choosing instead to ferret out an overworked editor with an understaffed set of reviewers working for a small mathematical journal.

    I’m a financier, not a scientist, but from what little I know, I don’t see how it could have gotten past the editors of Physical Review Letters when it would not even pass muster at my small level of physics knowledge, which was gained from standard thermodynamics textbooks.

    There is not one formula used in the calculation of entropy that indicates anything about “order” or “disorder” being calculated. Entropy has NEVER been about disorder. The second law of thermodynamics has NEVER been about everything coming all apart and falling into “decay”. If that were true, what would be left to decay?

    If we are talking thermal entropy, I agree.

    None of the ID/creationist concepts about entropy are found in any textbook used in the training of physicists. They never were. Some of the best textbooks around were in print when Henry Morris started his war on evolution in the 1970s. Those textbooks are still in print and are still used today.

    Hopefully that will change, if only to the extent that modern ID literature will be critical of previous creationist literature. I hope to post what I’ve learned in this discussion as a critique of creationists’ misuse of the 2nd law. Maybe at least one science student will benefit. Even if not, as a matter of conscience, I have to try. Obvious falsehoods need to be dealt with…

    The ID/creationist misconceptions about decay come from the fact that we living organisms live in a very narrow energy window in which one of our most important compounds, water, is a liquid. We are soft-matter creatures consisting of significant quantities of material that is loosely bound. The temperature boundaries of liquid water are extremes to us.

    No opinion.   

  45. I don’t really understand temperature except in terms of experience and the kinds of derivations I did in my responses. I understand intuitively the rote kinds of temperature such as with a hot brick, but the more exotic concepts like negative infinity temperatures…that’s a little hard to conceptualize outside of the formalisms I posted. I can state the formalism, but the intuitive, every day notions of temperature don’t seem to apply when dealing with things like Pound Purcell 1951.

    Work through the comparison between a two-state system and the Einstein oscillator model of a solid. Find the temperatures and plot them as a function of energy.
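    For what it’s worth, here is how I would set up that comparison numerically (my own sketch, in units where k = 1; the Einstein-solid count Ω = C(E + N − 1, E) for N oscillators sharing E quanta is the standard textbook formula):

    ```python
    import math

    def S_two_state(N, E):
        # N two-level atoms, E of them excited: Omega = C(N, E)
        return math.log(math.comb(N, E))

    def S_einstein(N, E):
        # N oscillators sharing E quanta: Omega = C(E + N - 1, E)
        return math.log(math.comb(E + N - 1, E))

    def beta(S, N, E):
        # beta = 1/kT approximated by the unit-step difference dS/dE
        return S(N, E + 1) - S(N, E)

    N = 50
    # Einstein solid: beta stays positive for all energies,
    # so the temperature is always positive
    print(all(beta(S_einstein, N, E) > 0 for E in range(200)))
    # Two-state system: beta changes sign past half filling,
    # which is the negative-temperature (population inversion) regime
    print(beta(S_two_state, N, 10) > 0, beta(S_two_state, N, 40) < 0)
    ```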

    I’ll post my disagreement later.

    Indeed I can predict your objections because I know the misconceptions on which they are based.

  46. Possibly off topic, but I suspect you will soon find out what happens to people who upset the regulars at UD. I will be astonished if you are allowed to continue starting new threads.

    You are now between two worlds, only one of which can readily tolerate dissenting opinions.

Leave a Reply