Granville Sewell has posted another Second Law screed over at ENV. The guy just won’t quit. He scratches and scratches, but the itch never goes away.
The article is mostly a rehash of Sewell’s confused fulminations against the compensation argument, with an added dash of tornados running backwards.
Then there’s this gem:
But while Behe and his critics are engaged in a lively debate as to whether or not the Darwinian scheme for violating the second law has ever been observed to result in any non-trivial increase in genetic information…
Ah, yes. Our scheme for violating the second law. It might have succeeded if it weren’t for that meddling Sewell.
“The article is mostly a rehash of Sewell’s confused fulminations against the compensation argument”
What’s the compensation argument exactly?
J-Mac,
It’s the argument that any decrease in the entropy of a system is necessarily accompanied by a compensating entropy increase in the surroundings, so that the second law of thermodynamics is not violated.
Sewell wants to argue that evolution violates the second law, so he hates the compensation argument and tries (endlessly) to show that it’s wrong.
Is this it?
http://www.math.utep.edu/Faculty/sewell/articles/pe_sewell.pdf
It’s one of many. The guy can’t stop repeating himself, no matter how often he gets refuted.
Here’s Sewell’s strawman version of the compensation argument:
As if any of his critics were actually making that claim.
Sewell attacks the compensation argument by implying that it involves flows of energy distant from the decrease of entropy. In the case of biological evolution that is not true: proportions of genotypes change in a population, as a result of births and deaths in that population, and the compensating flow of energy is through those individuals. The Second Law is not violated by increase of more fit individuals in a population.
He also invokes his impressive-looking equations for the movement of substances, implying that they endlessly disperse. However, his equations are simple diffusion equations with no terms allowing different substances to interact. He leaves out chemistry and all forms of physical interaction, as a close study of the terms in his equations will disclose. In a glass of salty water, the water will ultimately evaporate and the salt will crystallize at the bottom of the glass. His equations do not predict that.
The problem starts the very moment in which entropy is equated with disorder. That’s not entropy.
Then, the compensation “argument” is not an argument, it’s a fact of physics. “Local” reductions in entropy are accompanied by much higher increases in entropy in the “surroundings.” Not only that, it’s that increase in entropy that drives the “local” reduction in the first place. We decrease our “local” entropy by transforming highly available energy sources into much less available energy forms (like heat).
Even the creationists’ preferred “solution” depends on the very things they try to deny, on the very things they imagine that intelligence can oppose (it cannot). No brain power, no thinking, no planning, no movement, no building of computers or cars or jumbo jets can happen without those energy transformations, without that “compensation.” If energy didn’t have that tendency to flow from high to low availability, then work would not be possible. Entropy is not the problem; entropy is the solution. Entropy drives the whole endeavour.
For a fleeting moment, I thought you were talking about Mr. Fishing Reels and word salads himself, KairosUnFocused.
stcordova,
Sal,
You disagree with Sewell, right?
How exactly is the environment compensating for the loss of entropy?
I tried to explain this to him a long time ago. To be fair, many on our side don’t realize that the compensation argument is irrelevant. The fact is that complexity within a system can increase even as the entropy increases, and even if it is a closed system. If you don’t see that, just consider that while the earth is an open system, the solar system is essentially a closed system, and life arose within it.
J-Mac,
The system “exports” entropy to its environment. If the amount exported is large enough, the entropy of the system will decrease.
The exact way in which this happens depends on the nature of the system and its surroundings.
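One concrete way entropy gets “exported” is radiation at different temperatures. The numbers below are my own illustrative assumptions, not from the thread: Earth absorbs sunlight carrying little entropy (high temperature) and re-radiates the same energy at its own much lower effective temperature, carrying far more entropy away. Entropy carried per joule of radiated heat is roughly 1/T.

```python
# Sketch of the compensation bookkeeping (illustrative numbers, my own
# assumptions): sunlight arrives effectively at the Sun's photosphere
# temperature; Earth re-radiates the same energy at its much lower
# effective temperature, so far more entropy leaves than arrives.

T_sun = 5778.0    # K, effective temperature of sunlight (assumed)
T_earth = 255.0   # K, Earth's effective radiating temperature (assumed)
Q = 1.0           # J of energy passing through the system

s_in = Q / T_sun      # entropy delivered with the incoming joule
s_out = Q / T_earth   # entropy carried away by the outgoing joule
net_export = s_out - s_in

print(f"entropy in : {s_in:.6f} J/K")
print(f"entropy out: {s_out:.6f} J/K")
print(f"net export : {net_export:.6f} J/K per joule")
```

That net export of roughly 0.004 J/K per joule is the budget that allows local entropy decreases on Earth without any violation of the second law.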
Technically he’s just flat out wrong. AFAIK statistical mechanics says computers (or any other imaginable material objects) can, in point of fact, appear on barren planets. In fact a similar question is considered a fundamental problem in physics: The spontaneous appearance of Boltzmann brains even from a state of equilibrium.
A comment I made at UD in a discussion of a paper of Sewell’s entitled Entropy, Open Systems and Evolution:
This. If only people could understand this. Designers don’t violate the 2nd law; they are manifestations of it. To arrange those objects and parts into some complex structure, to even think the plan and the design of it up, takes the conversion of available energy into less usable forms, and the total entropy is increased. The brain literally runs hot (and so do your muscles), and you have to eat to keep it all running. It’s a big flesh-computer, and if you pull the plug its temperature equilibrates with the surroundings and your thinking (and any other kind of behavior) stops.
Yes.
I ask a simple question, what has more entropy, a frozen dead rat or a living human? Answer: living human. A 2nd year chemistry, physics, or mechanical engineering student should be able to deduce that by looking at standard molar entropy tables.
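A back-of-envelope version of that deduction, under my own rough assumptions (not Sal’s calculation): treat both bodies as mostly water, use the tabulated standard molar entropy of liquid water for the warm human and an approximate molar entropy of ice for the frozen rat. Absolute entropy is extensive, so the much larger, warmer body wins easily.

```python
# Rough model (my assumptions): bodies approximated as pure water.
S_water_liquid = 69.95   # J/(mol K), standard molar entropy of H2O(l) at 298 K
S_ice = 41.3             # J/(mol K), approximate molar entropy of ice near 273 K
M_water = 18.015         # g/mol

mass_human_g = 70_000.0  # assumed 70 kg living human
mass_rat_g = 300.0       # assumed 300 g frozen rat

S_human = mass_human_g / M_water * S_water_liquid   # absolute entropy, J/K
S_rat = mass_rat_g / M_water * S_ice

print(f"living human: {S_human:,.0f} J/K")
print(f"frozen rat  : {S_rat:,.0f} J/K")
```

The living human comes out hundreds of times higher, both because it is warmer per gram and because it is simply much more matter.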
I did provide a SLIGHT defense of Dr. Sewell here by pointing out an error in professor Larry Moran’s book. To his credit, Larry acknowledges the error; he uses it to teach his biochemistry students because he deems the accurate explanation too hard. See the first comment by Dr. Moran himself, in his own words:
So how exactly would the system compensate for the loss (up to a small error) of information in this classical translation of the information leading to protein folds?
DNA > mRNA > amino acid chain
stcordova,
As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”
My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.
What happened to the information?
You seem to be confusing sequence information with thermodynamic entropy. That’s about the most sense I can make of what you’re asking.
Are they separate?
J-Mac:
Pay no attention to Sal, J-Mac. While he’s right about entropy not being a measure of disorder, he’s wrong about it not being a measure of information.
It is a measure of information; specifically, the gap between the information specifying the macrostate and the information needed to specify the exact microstate.
I explain this in that very long comment thread.
I don’t know what you mean by separate. When DNA is transcribed to RNA, and then translated into an amino acid sequence, those are chemical reactions, which means chemical bonds are broken and others are formed, which means work is being done. Work takes energy. Chemical bonds being broken and formed results in energy radiating out into the surroundings, usually in large part as heat.
Not really sure what “information loss” you are talking about here, but the answer in any case would be that the system ‘compensates’ by making the surrounding environment slightly warmer.
How much warmer?
Well, if your system lost the total amount of information in the human genome every second, it would need to export (at STP) enough energy to melt 1.1 micrograms of ice per year.
IOW, a minuscule heat flow is sufficient to account for whatever decreases in informational entropy you might care to imagine.
Goading kairosfocus with this awkward fact used to be a pastime of mine.
[Another way to think about it: in terms of entropy, DNA replication produces a lot more thermal entropy than ‘information’. If it didn’t, it wouldn’t happen.]
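The melted-ice figure is easy to sanity-check. The sketch below uses my own assumptions (genome size of ~3.2 Gbp at 2 bits per base pair, STP temperature); with slightly different assumptions you land near the 1.1 microgram figure above, and in any case in the same ballpark. The conversion from bits to joules is the Landauer bound, N·k·T·ln 2.

```python
import math

# Hedged re-derivation (my own assumed inputs, not the exact ones used
# above): cost of exporting one human genome's worth of bits per second,
# for a year, expressed as micrograms of ice that heat could melt.

k_B = 1.380649e-23          # J/K, Boltzmann constant
T = 273.15                  # K, STP-ish temperature (assumed)
bits = 2 * 3.2e9            # assumed: ~3.2 Gbp at 2 bits per base pair
seconds_per_year = 3.156e7
L_fusion = 334.0            # J/g, latent heat of fusion of ice

energy_per_second = bits * k_B * T * math.log(2)   # Landauer bound, J/s
energy_per_year = energy_per_second * seconds_per_year
ice_melted_ug = energy_per_year / L_fusion * 1e6   # micrograms per year

print(f"heat flow: {energy_per_second:.2e} W")
print(f"ice melted per year: {ice_melted_ug:.2f} micrograms")
```

Whatever genome size you assume, the heat flow is on the order of tens of picowatts, i.e. utterly negligible next to ordinary metabolic heat.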
Seems like the relevant discipline for discussing ID would be chemistry.
All the pseudo-mathematical arguments need to be discussed as chemistry.
For example: how much more energy is required to produce a beneficial mutation, as opposed to a detrimental one?
Just out of curiosity (and because I didn’t come in at the beginning), is Sewell attempting to support a position that evolution cannot happen without supernatural input? That is, is Sewell making a physical argument, or a religious argument?
No. He’s back over at UD trying to argue that the designer cannot be a material being.
Flint,
He doesn’t explicitly mention the supernatural in this particular article, but he claims that what has happened on earth requires violations of “the generalized second law”, and that such violations can only happen via intelligence, which he seems to think is non-physical:
ETA: It reminds me of DaveScot’s claim that he violated the second law every time he typed.
All ID proponents are supporting a religious argument.
In earlier writings (including this paper) Sewell claims that the four fundamental forces are insufficient to explain the eventual appearance of complex artifacts on earth:
“Certainly we would not”, he says, without providing a lick of evidence.
OK. I guess I don’t understand why it wouldn’t just be easier to say that reality is how his god makes his will actual, rather than trying to deny reality to find his god. Seems like Sewell is going to a lot of unnecessary trouble to avoid saying evolution happens because his god WANTS it to happen, and Designed the universe accordingly.
Sewell just goes from particles to supercomputers. Here’s what he fails to mention:
1. Particles aggregating into atoms.
2. Atoms bonding together to form chemicals.
3. Chemicals aggregating into minerals.
4. Those gravitationally attracting into dust clouds.
5. A sun forming.
6. Planets forming.
7. Nuclear fusion igniting in the sun.
8. Lots of geology.
Now even Sewell’s admirers should admit that those can happen by known physics, chemistry, and astronomy, in spite of Sewell trying as hard as he can to make such things sound impossible.
After that, the issues are the origin of life, evolution, and humans originating and developing civilization. We’re all used to arguing about that. Once we have civilization, the supercomputers are a done deal.
Sewell makes it sound as if he has some new argument involving the impossibility of particles accidentally aggregating into supercomputers. But he doesn’t. It’s the same old stuff.
Yes for the most part.
There may be some adjustment to be made for the Landauer Principle, but the number of thermodynamic information bits in DNA is astronomically bigger than the number of information bits in the sequence.
500 fair coins can be viewed as having 500 bits of information in the heads/tails configuration, but it has an astronomically large number of thermodynamic bits if we just take standard molar entropy numbers — like 10^25 bits!!!!
See the calculation here (I made a small math mistake which I didn’t correct, but the numbers are in the ballpark):
Dr. Sewell is conflating textbook thermodynamics with configurations of matter that don’t really contribute to thermodynamic entropy (unless one ties the TRIVIAL contribution using Landauer’s Principle and even then, the result is dubious).
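The “10^25 bits” ballpark for the coins can be re-derived from standard molar entropy tables. The coin mass and composition below are my own assumptions, not Sal’s: take 500 copper coins, compute their absolute entropy from copper’s standard molar entropy, then convert J/K to bits by dividing by k·ln 2.

```python
import math

# Hedged recomputation (assumed coin mass/material): thermodynamic bits
# in 500 copper coins vs. the 500 heads/tails configuration bits.

k_B = 1.380649e-23   # J/K, Boltzmann constant
S_molar_Cu = 33.2    # J/(mol K), standard molar entropy of copper at 298 K
M_Cu = 63.55         # g/mol
coin_mass_g = 3.0    # assumed mass per coin
n_coins = 500

moles = n_coins * coin_mass_g / M_Cu
S_thermo = moles * S_molar_Cu                 # absolute entropy, J/K
thermo_bits = S_thermo / (k_B * math.log(2))  # convert J/K to bits

print(f"thermodynamic entropy: {S_thermo:.0f} J/K")
print(f"thermodynamic bits   : {thermo_bits:.2e}")
print(f"heads/tails bits     : {n_coins}")
```

With these assumptions the thermodynamic count comes out near 10^26 bits, roughly 23 orders of magnitude beyond the 500 configuration bits, which is the point being made.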
Flint,
I think it’s because a non-interventionist God who winds up the world, sits back, and watches things unfold doesn’t jibe well with Christian theology. The Christian God is a god who not only continually fiddles with the world, he enters it himself.
This is why I find it so amusing when colewd refuses to speculate on whether the “designer” acts in “real time” or just once at the beginning.
Ah, thanks. I think I see it now — such a non-interventionist god would be indistinguishable from no god at all, which would imply a little too much reality for perhaps any theistic faith.
I asked this question deliberately ☺
Can you expand on that, please?
What do you really mean by the configuration of matter?
Do you view entropy = information, or not?
You wrote: “Shannon entropy (or measure of information).”
J-Mac,
Knowing your attention span, I went looking for a short video explaining why entropy is a measure of missing information, not disorder. The following video explains it well:
CAUTION: There is one major mistake in the video, which is that he multiplies by Boltzmann’s constant when computing the dice entropy. Dice entropy is not thermodynamic entropy, so Boltzmann’s constant should not be used.
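Done correctly, dice entropy lives in pure information units. A minimal sketch (my own example, not the video’s exact numbers): the Shannon entropy of the sum of two fair dice, in bits. No Boltzmann constant appears anywhere; that constant belongs only to thermodynamic entropy, where the microstates are molecular.

```python
import math
from collections import Counter

# Entropy of the sum of two fair dice, in bits (no Boltzmann constant).
outcomes = [(a, b) for a in range(1, 7) for b in range(1, 7)]
sums = Counter(a + b for a, b in outcomes)   # counts for sums 2..12

H = -sum((n / 36) * math.log2(n / 36) for n in sums.values())

print(f"entropy of the two-dice sum: {H:.4f} bits")
print(f"max possible (uniform over 11 sums): {math.log2(11):.4f} bits")
```

The sum’s entropy comes out below log2(11) because the sums are not equally likely: knowing only the distribution, you are missing about 3.27 bits of information about which sum will come up.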
J-mac,
Entropy can be defined equivalently in a number of ways, and two of the ways have NO relation to information, and the 3rd way does, but is almost never used in a lab setting.
Clausius indirect definition: dS = dQ_rev / T
Boltzmann–Planck: S = k ln W
Information form (GAG!): S = −k Σ p_i ln p_i
Most physics and engineering students and chemistry students use the Clausius definition since it’s measurable in a lab setting.
Keiths likes “Entropy is a measure of ignorance,” but ignorance is a lot harder to measure than temperature. 🙂
But in case you’re curious, I have an Excel spreadsheet which you can modify to see how entropy changes based on the temperature and volume of a monoatomic gas trapped in a container. It’s not very exciting; it’s for physics nerds. It leverages the Sackur–Tetrode equation.
This link shows two ways to calculate such Sackur–Tetrode entropy: the old-fashioned way, and the Keiths way.
https://en.wikipedia.org/wiki/Sackur%E2%80%93Tetrode_equation
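The Sackur–Tetrode equation is easy to evaluate directly. A minimal sketch (helium is my choice of example, not from the comment above): it gives the absolute entropy of a monatomic ideal gas from the temperature, pressure, and atomic mass, and for helium at 298.15 K and 1 atm it lands very close to the tabulated standard molar entropy of about 126 J/(mol·K).

```python
import math

# Sackur-Tetrode entropy for helium at standard conditions.
k_B = 1.380649e-23   # J/K, Boltzmann constant
h = 6.62607015e-34   # J s, Planck constant
N_A = 6.02214076e23  # 1/mol, Avogadro constant
R = k_B * N_A        # gas constant

T = 298.15           # K
p = 101325.0         # Pa (1 atm)
m = 4.0026e-3 / N_A  # kg, mass of one helium atom

lam = h / math.sqrt(2 * math.pi * m * k_B * T)  # thermal de Broglie wavelength
v_per_atom = k_B * T / p                        # volume per atom (ideal gas)

# Sackur-Tetrode: S / (N k) = ln(v / lambda^3) + 5/2
S_molar = R * (math.log(v_per_atom / lam**3) + 2.5)

print(f"Sackur-Tetrode molar entropy of He: {S_molar:.1f} J/(mol K)")
```

That the quantum constants h and k drop out into an experimentally confirmed lab number is part of why the equation is a favorite of physics nerds.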
Dr. Mike hated both!
Depends on your definition of information. As far as Designed Information, or information that makes machines, absolutely “NO”!
Note the creation of a new idiosyncratic definition of “information”.
Note the circularity. “Designed” information is information that was designed.
I love you too😊
Thanks a lot!!!
You must be familiar with the law of conservation of quantum information?!
I’m talking life systems and DNA…
Classical thermodynamics doesn’t apply; that’s why a living human has more thermodynamic entropy than a dead rat, even though a living human being is a functioning, information-filled creature compared to a dead rat.
Classical thermodynamics of course applies to living human beings. The fact that it does not say anything about some other issue is irrelevant to that.
Sal,
The missing information interpretation works for all kinds of entropy, including thermodynamic entropy. That (among other things) makes it superior to the interpretations you prefer, including the energy dispersal interpretation.
And when you measure a thermodynamic variable like temperature, you are automatically missing some information about the microstate of the system. Temperature, after all, is a measure of the average kinetic energy of the molecules in the system. You lose any information about the kinetic energy of specific individual molecules.
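A toy illustration of that point (my own, not from the comment): two very different assignments of molecular kinetic energies can share the same average, hence the same temperature reading. Measuring the average tells you nothing about which individual molecule carries which energy; that is exactly the missing information.

```python
# Two "microstates" with identical average kinetic energy (arbitrary
# energy units): the macrostate reading (the mean) cannot distinguish them.

microstate_a = [1.0, 1.0, 1.0, 1.0]
microstate_b = [0.25, 0.75, 1.25, 1.75]   # same mean, different details

mean_a = sum(microstate_a) / len(microstate_a)
mean_b = sum(microstate_b) / len(microstate_b)

print(mean_a == mean_b)                 # True: same "temperature"
print(microstate_a == microstate_b)     # False: different microstates
```

Real systems have ~10^23 molecules rather than four, so the gap between "knowing the temperature" and "knowing the microstate" is astronomically larger, and that gap is the thermodynamic entropy.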