Dice Entropy – A Programming Challenge

Given the importance of information theory to some intelligent design arguments, I thought it might be nice to have a toolkit of basic functions for the sorts of calculations that come up in information theory, regardless of which side of the debate one is on.

What would those functions consist of?
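One possible starting point, offered only as a sketch of the kind of function such a toolkit might contain (the function names and the loaded-die example are my own, not part of the challenge), is the Shannon entropy of a discrete distribution:

```python
import math
from collections import Counter

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log(p)) of a discrete distribution, in bits by default."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

def empirical_entropy(rolls, base=2):
    """Entropy of the empirical distribution of a sequence of die rolls."""
    counts = Counter(rolls)
    n = len(rolls)
    return shannon_entropy([c / n for c in counts.values()], base)

# A fair six-sided die carries log2(6) ~ 2.585 bits per roll; a loaded die carries less.
print(shannon_entropy([1/6] * 6))                        # ~2.585 bits
print(shannon_entropy([0.5, 0.1, 0.1, 0.1, 0.1, 0.1]))   # ~2.161 bits
```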

Continue reading

In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

Continue reading

Boltzmann Brains and evolution

In the “Elon Musk” discussion, in the midst of a whole lotta epistemology goin’ on, commenter BruceS referred to the concept of a “Boltzmann Brain” and suggested that Boltzmann didn’t know about evolution. (In fact Boltzmann did know about evolution and thought Darwin’s work was hugely important). The Boltzmann Brain is a thought experiment about a conscious brain arising in a thermodynamic system which is at equilibrium. Such a thing is interesting but vastly improbable.

BruceS explained that he was thinking of a reddit post where the commenter invoked evolution to explain why we don’t need extremely improbable events to explain the existence of our brains (the comment will be found here).

What needs to be added is that none of that happens in an isolated system at thermodynamic equilibrium, or at least it has a fantastically low probability of happening there. The earth-sun system is not at thermodynamic equilibrium. Energy is flowing outwards from the sun at high temperature; some is hitting the earth, and some is taken up by plants and then some by animals, at lower temperatures.
Continue reading
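To make the flow argument in this excerpt concrete, here is a back-of-the-envelope sketch (my own illustration, using round figures of about 5800 K for the solar surface and 288 K for the earth's surface): every joule radiated by the hot sun and absorbed at the cooler earth produces net entropy, which more than pays for local decreases.

```python
# Net entropy change per joule transferred from the hot sun to the cooler earth:
# dS = Q/T_earth - Q/T_sun  (the second law requires dS >= 0 overall)
T_sun = 5800.0    # K, approximate solar surface temperature
T_earth = 288.0   # K, approximate mean surface temperature of the earth
Q = 1.0           # J transferred

dS = Q / T_earth - Q / T_sun
print(f"net entropy produced: {dS:.4f} J/K per joule")  # ~ +0.0033 J/K
```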

Organisms and Machines

In the “The Disunity of Reason” thread, Mung suggested that “the typical non-theist will insist that organisms are machines, including humans.” And there is a long tradition of mechanistic metaphysics in Western anti-theism (La Mettrie is probably the most well-known example). However, I pointed out that I disagree with the claim that organisms are machines. I’m reposting my thoughts from there for our continued conversation.

A machine is a system with components or parts that can be partially isolated from the rest of the system and made to vary independently of the system in which they are embedded, but which has no causal loops that allow it to minimize the entropy produced by the system. It will generate as much or as little heat as it is designed to do, and will accumulate heat until the materials lose the properties necessary for implementing their specific functions. In other words, machines can break.

What makes organisms qualitatively different from machines is that organisms are self-regulating, far-from-equilibrium thermodynamic systems. Whereas machines are nearly always in thermodynamic equilibrium with the surrounding system, organisms are nearly always far from thermodynamic equilibrium — and they stay there. An organism at thermodynamic equilibrium with its environment is, pretty much by definition, dead.

The difference, then, is that machines require some agent to manipulate them in order to push them away from thermodynamic equilibrium. Organisms temporarily sustain themselves at far-from-equilibrium attractors in phase space — though entropy catches up with all of us in the end.

It is true that some parts of an organism can break — a bone, for example. But I worry that in producing a concept general enough that both breaking and dying are subsumed under it, one can lose sight of the specific difference that one is trying to explain.

Indeed, that’s the exact problem with Intelligent Design theory — the ID theorist says, “organisms and machines are exactly the same, except for all the differences”. Which is why the ID theorist then concludes that organisms are just really special machines — the kind of machines that only a supremely intelligent being could have made. As Fuller nicely puts it, according to ID “biology is divine technology”.

What is obvious to Granville Sewell

Granville Sewell, who needs no introduction here, is at it again. In a post at Uncommon Descent he imagines a case where a mathematician finds that looking at his problem from a different angle shows that his theorem must be wrong. Then he imagines talking to a biologist who thinks that an Intelligent Design argument is wrong. He then says to the biologist:

“So you believe that four fundamental, unintelligent forces of physics alone can rearrange the fundamental particles of physics into Apple iPhones and nuclear power plants?” I asked. “Well, I guess so, what’s your point?” he replied. “When you look at things from that point of view, it’s pretty obvious there must be an error somewhere in your theory, don’t you think?” I said.

As he usually does, Sewell seems to have forgotten to turn comments on for his post at UD. Is it “obvious” that life cannot originate? That it cannot evolve descendants, some of which are intelligent? That these descendants cannot then build Apple iPhones and nuclear power plants?

As long as we’re talking about whether some things are self-evident, we can also discuss whether this is “pretty obvious”. Discuss it here, if not at UD. Sewell is of course welcome to join in.

2LOT and ID entropy calculations (editorial corrections welcome)

Some may have wondered why I (a creationist) have taken the side of the ID-haters with regard to the 2nd law. It is because I am concerned about the ability of college science students in the disciplines of physics, chemistry and engineering to understand the 2nd law. The calculations I’ve provided are textbook calculations, as would be expected of these students.
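As one example of the kind of textbook calculation meant here (my own choice of example, not necessarily one the author uses), the entropy change for reversibly melting a mole of ice at its melting point follows directly from dS = q_rev/T:

```python
# Entropy change for reversibly melting one mole of ice at 0 degrees C:
# dS = dH_fus / T_melt
dH_fus = 6010.0   # J/mol, enthalpy of fusion of water (approximate)
T_melt = 273.15   # K, melting point of ice

dS = dH_fus / T_melt
print(f"dS ~ {dS:.1f} J/(mol*K)")  # ~22.0 J/(mol*K)
```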
Continue reading

Entropy and Disorder from a creationist book endorsed by a Nobel Laureate

Here is an online creationist/ID book.
From http://www.ccel.us/gange.app.html#App

“I was particularly pleased with Dr. Gange’s refusal of the idea of materialism, and the convincing arguments supporting that refusal. In fact, the book will be a welcome response to materialism. Good luck, for a good book!”

Eugene Wigner, Nobel Laureate in Physics

The book had an appendix on thermodynamics.

“We noted earlier that entropy can be correlated-but not identified-with disorder. And we said, moreover, that this correlation is valid in only three cases-ideal gases, isotope mixtures, and crystals near zero degrees Kelvin. The truth of the matter is illustrated by considering the two chemically inert gases, helium, and argon.(7) In our mind’s eye we imagine two balloons, one filled with helium and the other with argon. First, we lower the temperature of both balloons to zero degrees Kelvin. This makes all the gas molecules stop moving in either balloon. Next, we get the molecules moving by heating both balloons to 300 degrees Kelvin (room temperature). Were we to mathematically calculate the increase in entropy, we would find that it was 20 percent higher in the argon balloon than in the helium balloon (154 v. 127 joules per mole per degree Kelvin). But since helium molecules are ten times lighter than argon molecules, they are moving three times faster and thus are more disordered. Here, then, is an example where higher entropy is accompanied by lower disorder, thereby demonstrating that we cannot identify one with the other. In the particular example cited, the greater argon entropy comes from the closer quantum translational energy levels identified with its greater molecular mass as described by the Sackur-Tetrode equation.
Continue reading
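The helium/argon figures quoted above can be checked against the Sackur-Tetrode equation. The sketch below (my own, using 300 K and 1 atm) gives roughly 126 and 155 J/(mol·K), close to the book's 127 and 154, with argon higher purely because of its larger atomic mass.

```python
import math

# Sackur-Tetrode molar entropy of a monatomic ideal gas:
# S = R * ( ln[ (k*T/P) * (2*pi*m*k*T / h**2)**1.5 ] + 5/2 )
k = 1.380649e-23      # J/K, Boltzmann constant
h = 6.62607015e-34    # J*s, Planck constant
R = 8.314462618       # J/(mol*K), gas constant
u = 1.66053907e-27    # kg, atomic mass unit

def sackur_tetrode(mass_u, T=300.0, P=101325.0):
    """Molar entropy of a monatomic ideal gas of atomic mass mass_u (in u) at T and P."""
    m = mass_u * u
    term = (k * T / P) * (2 * math.pi * m * k * T / h**2) ** 1.5
    return R * (math.log(term) + 2.5)

print(f"He: {sackur_tetrode(4.0026):.0f} J/(mol*K)")   # ~126
print(f"Ar: {sackur_tetrode(39.948):.0f} J/(mol*K)")   # ~155
```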

Configuration and Configurational Entropy

From Wiki:

http://en.wikipedia.org/wiki/Configuration_entropy

In statistical mechanics, configuration entropy is the portion of a system’s entropy that is related to the position of its constituent particles rather than to their velocity or momentum. It is physically related to the number of ways of arranging all the particles of the system while maintaining some overall set of specified system properties, such as energy. The configurational entropy is also known as microscopic entropy or conformational entropy in the study of macromolecules. In general, configurational entropy is the foundation of statistical thermodynamics.[1]
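As a toy illustration of the counting behind this definition (my own example, not taken from the Wikipedia article), the configurational entropy of N indistinguishable particles placed on M lattice sites is S = k_B ln W, with W the binomial coefficient C(M, N):

```python
import math

k_B = 1.380649e-23  # J/K, Boltzmann constant

def configurational_entropy(sites, particles):
    """S = k_B * ln(W), with W the number of ways to place the particles on the sites."""
    W = math.comb(sites, particles)
    return k_B * math.log(W)

# 50 indistinguishable particles on a 100-site lattice:
print(configurational_entropy(100, 50))  # ~9.2e-22 J/K
```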

Continue reading

A Designed Object’s Entropy often Increases with Its Complexity

[This is an abridged version of a post at UD: A Designed Objects Entropy Must Increase for Its Design Complexity to Increase, Part 2. I post it under a different title at TSZ because, upon consideration, the new title should be above reproach. What I put forward should happily apply to man-made designs. A student recently wanted to challenge his professors regarding the 2nd law and evolution, and I pointed him to my essay. If that student is a creationist, at least I feel I did my job and made him understand science better than he would from most creationist literature. Hence, the rather torturous discussions at TSZ and UD had the benefit of furthering this student’s understanding of science. If he is going to reject Darwinism, he should reject it for good reasons, not because of the 2nd law.]

For a biological system to gain biological complexity, it often requires a substantial increase in thermodynamic entropy, not a reduction of it, contrary to many intuitions among creationists and IDists. This essay is Part II of a series that began with Part 1.
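As a rough, back-of-the-envelope illustration of the direction of this claim (my own numbers, treating an organism as mostly liquid water with a specific entropy of roughly 3.9 J/(g·K) at room temperature), a larger and more complex organism carries far more thermodynamic entropy than a simpler one:

```python
# Standard molar entropy of liquid water ~69.95 J/(mol*K); molar mass ~18.02 g/mol
s_water = 69.95 / 18.02      # ~3.88 J/(g*K), specific entropy of liquid water

human_mass_g = 70_000.0      # ~70 kg adult, treated as mostly water
bacterium_mass_g = 1e-12     # ~1 picogram bacterial cell

print(f"human:     ~{s_water * human_mass_g:.2e} J/K")      # ~2.7e+05 J/K
print(f"bacterium: ~{s_water * bacterium_mass_g:.2e} J/K")  # ~3.9e-12 J/K
```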
Continue reading