Granville Sewell has posted another Second Law screed over at ENV. The guy just won’t quit. He scratches and scratches, but the itch never goes away.
The article is mostly a rehash of Sewell's confused fulminations against the compensation argument, with an added dash of tornadoes running backwards.
Then there’s this gem:
But while Behe and his critics are engaged in a lively debate as to whether or not the Darwinian scheme for violating the second law has ever been observed to result in any non-trivial increase in genetic information…
Ah, yes. Our scheme for violating the second law. It might have succeeded if it weren’t for that meddling Sewell.
keiths:
J-Mac:
I call ’em as I see ’em.
Exactly. I don't know why ID/creationists focus on biology the way they do, when refrigerators shouldn't work according to their screeds. Once they can explain how a refrigerator manages to decrease the entropy inside the fridge, perhaps they can step up to biology.
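For concreteness, here's a minimal Python sketch of the refrigerator's entropy bookkeeping; the temperatures and heat flows are made-up but physically plausible values, not measurements of any real fridge:

```python
# Entropy bookkeeping for an idealized refrigerator (illustrative numbers).
# The inside loses entropy, but the kitchen gains more, so the total rises.

T_cold = 275.0   # K, inside the fridge
T_hot = 298.0    # K, the kitchen
Q_cold = 1000.0  # J of heat pumped out of the fridge interior
W = 200.0        # J of electrical work driving the compressor
Q_hot = Q_cold + W  # J dumped into the kitchen (energy conservation)

dS_inside = -Q_cold / T_cold  # entropy decrease inside the fridge
dS_kitchen = Q_hot / T_hot    # entropy increase of the surroundings
dS_total = dS_inside + dS_kitchen

print(f"inside: {dS_inside:+.3f} J/K")    # about -3.64 J/K
print(f"kitchen: {dS_kitchen:+.3f} J/K")  # about +4.03 J/K
print(f"total: {dS_total:+.3f} J/K")      # positive: no law is violated
```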
I remember once driving with a friend through Amarillo, Texas.
I said, "Hey, look at all those colorful Cadillacs that have formed spontaneously, sticking up out of the ground. Isn't that interesting?"
And he was like “Huh?”
And I was like, “What?”
T_aquaticus:
Judging from his article, Sewell would presumably say that refrigerators violate the second law, and that they are able to do so because they are designed by an intelligence. He writes:
The poor guy actually thinks that the second law is being violated continually on earth.
I wonder if Granville is aware that the entropy of rust is much smaller than the entropy of the iron and oxygen that combine to form it.
Another pseudo-violation of the second law, driven by intelligence. God must be running all over the place causing iron to rust. Or maybe it’s Satan.
Note that J-Mac couldn’t go beyond “Knowing your attention span”.
keiths,
What a delightful example! Of course, the statement that Styer ridicules, “Entropy… is why cars rust…” is absolutely correct, just not for the reason that Wedekind thinks. As others have noted on this thread, cars rust because the entropy of the surroundings increases far more than the entropy of the car declines. It’s compensation all the way…
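For anyone who wants to see the compensation in numbers, here's a back-of-the-envelope Python sketch using rounded textbook values for the standard entropies and the enthalpy of formation of Fe2O3; treat the figures as approximate:

```python
# Compensation in action: rusting at 298 K.
# Reaction: 4 Fe(s) + 3 O2(g) -> 2 Fe2O3(s)

T = 298.0                                # K
S_Fe, S_O2, S_Fe2O3 = 27.3, 205.0, 87.4  # J/(mol K), standard molar entropies
dH = 2 * (-824.2e3)                      # J, from dHf(Fe2O3) of about -824.2 kJ/mol

dS_system = 2 * S_Fe2O3 - (4 * S_Fe + 3 * S_O2)  # the "car" loses entropy
dS_surroundings = -dH / T                        # released heat raises the surroundings' entropy
dS_total = dS_system + dS_surroundings

print(f"system: {dS_system:+.1f} J/K")              # about -549 J/K
print(f"surroundings: {dS_surroundings:+.1f} J/K")  # about +5531 J/K
print(f"total: {dS_total:+.1f} J/K")                # comfortably positive
```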
Entropy,
Ha ha. I should have anticipated that, shouldn’t I?
Here’s how I could have made my point while retaining J-Mac’s interest:
🙂
DNA_Jock,
Right.
Yes. And Sewell fails to realize that by denying compensation, he is denying the second law itself. I explained this six years ago in one of his threads at UD:
I believe you.
Thanks for the correction.
Nice to see you still visiting us at TSZ. If you leave here, this site will die!
Keiths,
The Ben-Naim approach is the most generalized, but as far as measurement goes, that doesn't necessarily make it "superior", because temperature and volume can be measured in the lab while ignorance can't be measured so easily. That's why Dr. Mike liked Clausius's definition of entropy.
Dr. Mike used the term "spreading out of energy", and for most chemical and engineering applications it's a good approximation of the Clausius definition.
So what is “superior” is in the eye of the beholder.
Rather than "information" or "ignorance", the term "uncertainty" is a little more accurate, because it conveys what can be MEASURED in principle, not necessarily actual knowledge. IGNORANCE has connotations tied to our own mental understanding, whereas uncertainty conveys what is measurable in principle, independent of our mental faculties.
phoodoo:
Just wait till those Cadillacs get moved to a junkyard where tornadoes can get a hold of them!
Well, Rumraket will tell you, anything is possible.
I moved to a new town recently. Houses everywhere. I wonder if they were built. Could just be like evolution, who knows?
It happens all the time. I keep writing more than one sentence to the guy/gal, and of course (s)he doesn't even understand the single sentence (s)he has enough attention span to "read", if that can be called "reading."
J-Mac,
Did you watch (and understand) the video?
Fair Witness to phoodoo:
Especially if the tornadoes are running backwards, à la Sewell.
Fred said:
Which just goes to show that even bright people can say stupid things at times.
As I've argued with BruceS and fmm, I think the Boltzmann Brain thing is nonsense. I know physicists take it seriously, but I'm no physicist! I can't envisage the high-entropy state into which these things are supposed to plausibly pop, even by appeal to a Very Long Time. If matter can barely move (e.g. in a black hole), it's not going to make 'STP' structures, ever. Likewise if it is diffuse.
stcordova,
Have I given you the benefit of my opinion on Hoyle’s unutterably stupid ‘747’ analogy on any of the other dozens of times you’ve brought it up, perchance?
Sal,
You don’t have to measure ignorance in order to use the missing information interpretation of entropy. When you measure thermodynamic variables such as temperature and pressure, you automatically get missing information. After all, temperature and pressure are just averages.
The problem is that energy dispersal fails as a definition of entropy, because there are counterexamples. The missing information interpretation, by contrast, always works.
That’s why I use the phrase “missing information”, which is synonymous with “uncertainty”.
If only that were true. Just imagine what we could do. Heck, we could solve the energy crisis, since we could build machines that drive without needing any energy and actually produce energy as they were driven. We could condense water and produce electricity at the same time, which would be great for sub-Saharan Africa.
The absolute idiocy of thinking an intelligence can just violate the laws of thermodynamics willy-nilly would be hilarious if it weren't so sad.
keiths,
Do you have an example of a counterexample?
keiths:
Allan:
Or a counterexample of an example? 🙂
Here’s my favorite, from John Denker:
And a comment from an old thread listing six reasons why entropy cannot be a measure of energy dispersal:
And:
That's already wrong. First, entropy has increased as the kinetic energy of rotation is transformed into heat in the flywheels, and some usable energy is lost in the exchange. Next, only the surfaces of the flywheels that touch one another will initially heat up, but that heat will spread to the rest of each flywheel and also be radiated away. That is an increase in entropy.
T_aquaticus,
Denker’s point doesn’t depend on what happens during equilibration.
Define Zi as the initial state of the system before the hoops are in contact, and Zf as the final state after the rotation has stopped and the heat of friction has spread throughout the hoops. Thermodynamic entropy is a state variable, meaning it does not depend on the path the system takes through phase space. Thus it’s irrelevant what happens during equilibration. Zi and Zf are all that matter for determining entropy.
The entropy associated with Zf is obviously greater than the entropy associated with Zi, yet the energy hasn’t spread out any more than before. Entropy therefore can’t be a measure of energy dispersal, as my other points also demonstrate.
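To make the state-function argument concrete, here is a minimal Python sketch with made-up but plausible numbers; it assumes, as in the setup above, that all the frictional heat stays in the hoops:

```python
import math

# State-function bookkeeping for the two hoops (illustrative numbers).
# All the rotational kinetic energy ends up as heat in the hoops themselves,
# so the energy is no more "spread out" afterward, yet the entropy rises.

m = 2.0      # kg, total mass of both hoops
c = 450.0    # J/(kg K), specific heat capacity (roughly that of steel)
T_i = 300.0  # K, initial temperature
KE = 5000.0  # J, initial rotational kinetic energy

T_f = T_i + KE / (m * c)          # final temperature once friction stops them
dS = m * c * math.log(T_f / T_i)  # entropy change; depends only on Zi and Zf

print(f"T_f = {T_f:.1f} K, dS = {dS:+.1f} J/K")  # dS > 0, no extra dispersal
```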
Energy dispersal is in PHASE space, not volume space only! GRRR! Dr. Mike and Dr. Lambert acknowledge this.
In that other thread, "In Slight Defense of Granville Sewell", I mentioned the Liouville theorem and phase space.
https://en.wikipedia.org/wiki/Liouville%27s_theorem_(Hamiltonian)
Sal,
That doesn’t even make sense. At any given moment, the system occupies only one point in phase space.
In Lambert’s words:
http://entropysite.oxy.edu/cracked_crutch.html
The microstates are found in phase space, not in volume space alone. The greater the entropy, the greater the number of microstates occupied over time for a given internal energy.
The cold, fast-spinning disks have fewer microstates than the hot, slowly spinning disks.
More from Lambert:
If other people want to redefine what Lambert means by "dispersal", that's on them, but it doesn't represent what Lambert meant by dispersal. This applies to the supposed refutation involving spinning disks.
Sal,
The system is only in one microstate at a time, meaning that it occupies a single point in phase space at any given moment. The energy — all of it — is concentrated in that single state. That’s not “dispersal”.
Even setting that aside, you still have a problem with units. Thermodynamic entropy is measured neither in "joules per cubic meter" nor in "joules per microstate". It's measured in joules per kelvin.
Entropy is simply not a measure of energy dispersal. It’s often associated with energy dispersal, but that’s not the same thing. Not by a long shot.
Sal,
Right. More possible microstates are compatible with the high-entropy macrostate than with the low-entropy macrostate. In other words, it would take a greater amount of additional information to pin down the exact microstate in the high-entropy state than in the low-entropy state.
Entropy is a measure of that missing information.
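Here's a toy Python illustration of that idea, using coin sequences as stand-in microstates (the setup is mine, purely for illustration):

```python
import math

# Entropy as missing information: a "gas" of 100 coins. A macrostate fixes
# only the number of heads; the microstate is the exact head/tail sequence.
# More compatible microstates means more bits needed to pin one down.

k_B = 1.380649e-23  # J/K

def missing_info_bits(omega):
    """Bits needed to identify the exact microstate among omega possibilities."""
    return math.log2(omega)

def boltzmann_entropy(omega):
    """S = k_B * ln(omega), in J/K."""
    return k_B * math.log(omega)

omega_low = math.comb(100, 0)    # "0 heads out of 100": exactly 1 microstate
omega_high = math.comb(100, 50)  # "50 heads out of 100": about 1e29 microstates

print(missing_info_bits(omega_low), boltzmann_entropy(omega_low))    # 0 bits, 0 J/K
print(missing_info_bits(omega_high), boltzmann_entropy(omega_high))  # ~96 bits, ~9e-22 J/K
```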
Sal,
It’s not a redefinition. Lambert explicitly includes dispersal in physical space:
I don't think the spinning discs strictly refute dispersal, though. Before contact, you have two isolated systems. On contact, you have a single system of greater volume. The energy in A 'flows' into B, and vice versa. Net energy doesn't change, but it doesn't stay where it was either. One is anticipating contact by considering the whole system from the start, but that doesn't seem right.
It seems equivalent to a mixing scenario. Even with two identical gases, when a barrier exists they are two closed systems. Remove the barrier and the energy in A ‘spreads out’ geometrically, precisely compensated (when the gases are the same) by the parallel spread from B into A.
I’m not hardline on this, but the geometric component – moving energy from a localised setting into a greater volume – seems a fundamental feature in rendering it unavailable for work.
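Allan's geometric point can be put in numbers with the textbook free-expansion case; a minimal sketch with illustrative values:

```python
import math

# Free expansion of an ideal gas into double the volume: entropy rises by
# nR ln 2, and the work an isothermal reversible expansion could have
# extracted (T * dS) is forgone. Values are illustrative.

n = 1.0        # mol
R = 8.314      # J/(mol K)
T = 300.0      # K
V_ratio = 2.0  # final volume / initial volume

dS = n * R * math.log(V_ratio)  # about +5.76 J/K
W_lost = T * dS                 # about 1730 J of work rendered unavailable

print(f"dS = {dS:.2f} J/K, lost work = {W_lost:.0f} J")
```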
Allan,
How many systems you have is a function of where you draw the boundar(ies), and where you draw the boundaries is arbitrary. The second law works no matter where the boundaries are placed, and it’s certainly possible to draw a boundary around both spinning hoops so that they form a single system.
But the entropy doesn’t increase when the gases are identical. It only increases when they are distinguishable.
That’s further evidence that entropy is not a measure of energy dispersal, since energy disperses in both cases but entropy increases only in one.
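That comparison is easy to quantify with the standard ideal-gas mixing formula; a minimal sketch, assuming one mole on each side of the barrier at equal temperature and pressure:

```python
import math

# Remove a partition between two gases at equal T and P. Energy "disperses"
# either way, but the mixing entropy appears only when the gases are
# distinguishable (the classic Gibbs result).

n = 2.0    # total moles (one on each side)
R = 8.314  # J/(mol K)
x = 0.5    # mole fraction of each species after mixing

dS_distinguishable = -n * R * 2 * (x * math.log(x))  # about +11.5 J/K
dS_identical = 0.0  # same gas on both sides: same macrostate, so dS = 0

print(dS_distinguishable, dS_identical)
```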
Increasing physical space increases the number of microstates, but so does increasing temperature! You're inappropriately extrapolating one special case as if it were the totality of what Lambert is describing.
That the increase in the number of microstates can come from an increase in EITHER volume or temperature (or both) is brutally obvious in the Sackur-Tetrode approximation for monatomic ideal gases.
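Here's a minimal Python sketch of that claim, using the standard Sackur-Tetrode formula for helium; the chosen states are illustrative:

```python
import math

# Sackur-Tetrode entropy of a monatomic ideal gas (helium): the entropy
# climbs with EITHER volume or temperature, since both enlarge the number
# of accessible microstates in phase space.

k_B = 1.380649e-23   # J/K
h = 6.62607015e-34   # J s
N_A = 6.02214076e23  # atoms/mol
m = 4.0026e-3 / N_A  # kg, mass of one helium atom

def sackur_tetrode(N, V, T):
    """Entropy (J/K) of N atoms of a monatomic ideal gas in volume V (m^3) at T (K)."""
    lam = h / math.sqrt(2 * math.pi * m * k_B * T)  # thermal de Broglie wavelength
    return N * k_B * (math.log(V / (N * lam**3)) + 2.5)

N, V, T = N_A, 0.0224, 273.15  # one mole at roughly STP
print(sackur_tetrode(N, V, T))      # baseline, about 124 J/K
print(sackur_tetrode(N, 2 * V, T))  # double the volume: S rises by R ln 2
print(sackur_tetrode(N, V, 2 * T))  # double the temperature: S rises by (3/2) R ln 2
```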
Another potential objection: If thermodynamic entropy is a measure of missing information, then why is it expressed in joules per kelvin?
I addressed that question in the earlier thread:
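The gist is the standard conversion: Boltzmann's constant, which carries the joules-per-kelvin, is what turns a pure count of microstates (or bits of missing information) into thermodynamic units. A minimal sketch:

```python
import math

# Thermodynamic entropy (J/K) and missing information (bits) differ only by
# the conversion factor k_B * ln 2. The kelvin in the units is a historical
# artifact of giving temperature its own unit.

k_B = 1.380649e-23  # J/K

def entropy_to_bits(S):
    """Convert an entropy in J/K to bits of missing information."""
    return S / (k_B * math.log(2))

print(f"{entropy_to_bits(1.0):.3g} bits")  # 1 J/K is roughly 1e23 bits
```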
Sal,
No. I wrote:
Note that I said “includes”, not “includes only”.
Either way it doesn’t help you, for reasons I’ve already given. The energy is not spread among the different microstates; it’s concentrated in a single microstate at any given instant. That’s not dispersal.
And energy dispersal, whether in physical space or phase space, has the wrong units for entropy. “Joules per cubic meter” and “joules per microstate” are not the same as “joules per kelvin”.
You can’t just ignore the units, Sal. That sort of sloppiness might fly in creationism, but not in science. Units matter.
I disagree, but thanks anyway for the conversation.
And FWIW, thanks for this thread. I'm sorry to disagree with my friend Granville, but, well, using the 2nd law to defend ID is not wise.
Sal,
You don’t think that units matter when discussing entropy?
Where's Alan Fox when you need him? At least when he was here, I didn't have to tangle with you as much as I do now that he's faded away like General MacArthur.
> Old soldiers never die; they just fade away. — Douglas MacArthur
My question is a reasonable one. If Lambert’s definitions lead to the wrong units for entropy, then his definitions can’t be correct.
We had a conversation on dimensionless UNITs here:
Sal,
You argued, bizarrely, that “joules per kelvin” is dimensionless. Which wouldn’t help your case even if it were true, since Lambert’s definitions don’t lead to dimensionless quantities.
Lambert’s definitions don’t work, Sal. Units matter.
I believe entropy is a dimensionless UNIT. I stated the deduction right here:
I argued quite adeptly, if I do say so myself. See my derivation here: