Granville Sewell has posted another Second Law screed over at ENV. The guy just won’t quit. He scratches and scratches, but the itch never goes away.
The article is mostly a rehash of Sewell’s confused fulminations against the compensation argument, with an added dash of tornadoes running backwards.
Then there’s this gem:
But while Behe and his critics are engaged in a lively debate as to whether or not the Darwinian scheme for violating the second law has ever been observed to result in any non-trivial increase in genetic information…
Ah, yes. Our scheme for violating the second law. It might have succeeded if it weren’t for that meddling Sewell.
Sal,
I think you’re trying to say that entropy is a dimensionless QUANTITY. Which is true if it’s expressed using the right system of units. In the SI system, however, entropy is expressed in joules per kelvin, and “joules per kelvin” is not dimensionless.
And as explained above, it wouldn’t help you even if “joules per kelvin” were dimensionless, because Lambert’s definitions don’t lead to dimensionless quantities.
Either way, Lambert’s definitions don’t work. The ‘missing information’ definition is the correct one.
It’s not arbitrary as far as the systems are concerned. If there is no path for energy flow, they are separate systems. Energy flow (bidirectional and balanced in the special case presented) only occurs on contact.
Sure. But all we have presented here is the special case where the systems are in such a state that net change does not occur – when they are already in ‘potential equilibrium’. If the 2 discs had slightly different masses, more energy would flow in one direction than the other. If the gases were at a different temperature or pressure, or chemical composition, likewise. But we haven’t suddenly introduced a ‘dispersal’ situation, we have introduced an asymmetry into the exactly balanced dispersal that pertains when the masses are the same, or the gases identical in all properties.
The ‘exceptions’ seem to me to be situations where equilibrium is pre-specified by the conditions, not truly exceptions at all. It is still possible to see energy becoming delocalised – even if, in the balanced case, there is an exact compensation from the other component of the now-joined system. If one could label the molecules according to their chamber of origin, one would definitely see their energies ‘spreading out’.
keiths:
Allan:
Don’t confuse physical barriers with system boundaries drawn for purposes of analysis. They may or may not coincide. Either way, the second law holds.
Allan:
Actually, no. Suppose that one hoop is three times as tall (and three times as massive) as the other, but that the dimensions are otherwise equal. In both the initial and final states, the taller hoop will have three times the energy of the smaller. Any energy flows between the two hoops will cancel out.
But that’s true even in the case where the gases are identical. Entropy doesn’t increase in that scenario, but the energy from each chamber disperses into the other. Therefore entropy cannot be a measure of energy dispersal.
I believe it is.
Something of dimension one is a dimensionless quantity, a pure scalar.
1 nat = 1
https://en.wikipedia.org/wiki/Boltzmann_constant
1 = 1 nat = 1.380649×10^−23 J/K
1 = 1.380649×10^−23 J/K
1/ 1.380649×10^−23 = J/K
ergo, J/K is a dimensionless UNIT
This suggests
K = Joules * 1.380649×10^−23
First recall,
1 Joule = 6.242×10^18 electron volts
In fact, for many applications, temperature is specified in electron volts (each a tiny fraction of a Joule).
https://en.wikipedia.org/wiki/Electron_temperature
That’s sort of the way I remember it when I studied Plasma Physics.
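As a quick numerical sanity check on the eV-as-temperature convention (a minimal sketch using only the exact SI constants; the 300 K figure is just an illustrative room temperature):

```python
k_B = 1.380649e-23   # Boltzmann constant, J/K (exact in the 2019 SI)
e = 1.602176634e-19  # elementary charge, so 1 eV = 1.602176634e-19 J

# A temperature T corresponds to a characteristic energy k_B * T.
# Expressed in eV, room temperature is the familiar "~25 meV":
T = 300.0  # kelvin
T_in_eV = k_B * T / e
print(T_in_eV)  # ~0.0259 eV
```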
There was a discussion as to why we don’t measure temperature in Joules:
https://physics.stackexchange.com/questions/60830/why-isnt-temperature-measured-in-joules
Sal,
The reason we don’t use energy-based units to express temperature is a historical accident, as I (and Arieh Ben-Naim) explained here.
So, recalling what Dr. Mike said, which echoes a comment about
K = Joules/bit
Dr. Mike said
K = Energy per degree of freedom
A degree of freedom is bits.
But bits are dimensionless.
So
if K = Joules/bits / 1.380649×10^−23 = Joules/Degrees of Freedom
J/K = 1/1.380649×10^−23 natural bits, which is still dimensionless
keiths:
Sal:
It isn’t, but it wouldn’t help you even if it were. Lambert’s dispersal-based definitions don’t produce a quantity with units of joules per kelvin, and they don’t produce a dimensionless quantity either.
Entropy is not a measure of energy dispersal. The units are wrong.
1 Joule/Kelvin = 1 / (1.381 × 10^−23) / ln(2) Shannon Bits =
1.045 × 10^23 Shannon Bits
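Whatever one makes of the units argument, the arithmetic of the conversion itself is easy to check numerically (a minimal sketch using the exact SI value of the Boltzmann constant):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

# With S = k_B * ln(W), one J/K of entropy corresponds to
# 1 / (k_B * ln 2) Shannon bits:
bits_per_JK = 1 / (k_B * math.log(2))
print(bits_per_JK)  # ~1.045e23 bits, as stated above
```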
One of the most respected commenters here, Gordon Davisson agreed:
You redefined Lambert’s notions of dispersal, therefore your criticism is an illegitimate strawman misrepresentation.
In any case, here is the wiki entry on this:
https://en.wikipedia.org/wiki/Entropy_(energy_dispersal)
No, I didn’t. And if Lambert were right, you’d be able to show that his definitions lead to the right units (or lack thereof).
They don’t.
Units matter in science, Sal.
I’m more interested in the mechanistic aspect than the analytical. What really happens is that energy tends to become delocalised – if it can. Don’t forget that Lambert’s main concern is pedagogical, and his primary target the notion of ‘disorder’. As he says – and he’s right – that can lead to confusion in the mind of the student, a confusion that recognising the general tendency to delocalisation of energy helps dispel. And careful analysis shows that dispersal still takes place; it’s just that some systems can start at a state of macroscopic equilibrium.
You don’t think there will be any asymmetry? Hmmm. I disagree. The larger hoop will act as a sink; the smaller will get hotter. This heat will flow.
That was actually the case I was considering. You don’t need to label molecules when the gases are different.
Net dispersal, hence entropy change, can be zero. The flow from A into B is exactly balanced by the reverse. Energy dispersal still occurs, it just has no net effect due to that symmetry.
Consider a steel plate separating the discs. First, you just apply one disc. Energy disperses non-controversially – the plate heats up. Then apply both discs to opposite sides. Flow is balanced, but the plate heats up more, and some of that additional heat finds its way back into both discs. They heat each other, via the plate. Now you do the experiment so often that you wear grooves in the plates. Eventually only a sliver remains. Then you break through, such that the discs make direct contact, and you’ve got the Denker scenario. Does ‘bidirectional dispersal’ stop at that point?
Allan,
Actually, one of the best descriptions of entropy, as far as pragmatism goes, is as the QUALITY of kinetic energy. That may seem strange, since kinetic energy is merely
KE = 1/2 mv^2
BUT consider an ideal monatomic gas contained in a 1-liter volume at a temperature of 300 kelvin. It has a certain net kinetic energy from all the molecules (the proper term is internal energy).
Well, if we have a valve that allows it to expand at the same temperature (isothermally) and without adding or subtracting heat (adiabatically) into a 2-liter volume, it still has the same total kinetic energy (the proper term is internal energy, although colloquially some would call it heat energy).
Now it turns out that the same gas in the 1-liter volume can do more work, because it is at higher pressure, per the equation:
PV = nRT
So, paradoxically, the same gas with the same total internal kinetic energy doesn’t have the same practical potential to do work, simply because it is more spread out (has higher entropy).
Entropy then is an index to the quality of the kinetic energy. I would DEFINE entropy that way, but that is one thing entropy helps indicate.
When two chemists told me that independently, it made sense why Clausius concocted the notion, and he did so WITHOUT statistical mechanics, merely for practical purposes in analyzing steam engines. It was Gibbs, Boltzmann and Planck who dared to connect Newtonian mechanics and the idea of atoms/molecules with Clausius (who may have actually held a caloric view of heat). Amazingly, entropy didn’t need major revision with the advent of Quantum Mechanics, and neither did Newtonian statistical mechanics. If anything, as one of my professors said, Quantum Mechanics made it easier to understand, with quantized energies.
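Sal’s one-liter/two-liter example can be made quantitative (a sketch with assumed illustrative numbers: 1 atm initial pressure, and an arbitrary common final volume of 4 L for the work comparison):

```python
import math

R = 8.314462618      # gas constant, J/(mol*K)
T = 300.0            # K, held constant throughout
P1 = 101325.0        # Pa, assumed initial pressure (1 atm)
V1, V2 = 1e-3, 2e-3  # m^3: the 1 L and 2 L states

n = P1 * V1 / (R * T)  # moles of gas, from PV = nRT

# Free expansion 1 L -> 2 L: internal energy unchanged, but entropy rises.
dS = n * R * math.log(V2 / V1)  # J/K

# Maximum isothermal work extractable in expanding to a common final
# volume (4 L, chosen arbitrarily) from each starting state:
Vf = 4e-3
W_from_1L = n * R * T * math.log(Vf / V1)
W_from_2L = n * R * T * math.log(Vf / V2)

# The work lost by letting the gas spread out first is exactly T * dS:
print(W_from_1L - W_from_2L, T * dS)
```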
ERRATA!!!!
I meant to say:
Ugh!
Apologies to the readers…
Allan,
More later, but for now let me respond to this:
You’re overlooking something: If the initial contents of A and B are identical, then entropy will not change. But if they are different — say, A contains helium and B contains neon, both at the same temperature and pressure — then there will be an entropy change, despite the fact that there is no net dispersal of energy.
Entropy increases, but there is no net dispersal of energy. Therefore, entropy cannot be a measure of energy dispersal.
Allan,
Lambert’s mistake is to root out one myth — entropy as a measure of disorder — only to replace it with another myth — entropy as a measure of energy dispersal. Far better to replace myth with truth. Entropy is a measure of missing information.
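For what it’s worth, the “missing information” reading can be illustrated directly with the Gibbs/Shannon formula (a minimal sketch; entropy is in nats with k = 1, and the distributions are made up for illustration):

```python
import math

def gibbs_entropy(probs, k=1.0):
    """S = -k * sum(p * ln p) over microstate probabilities."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

# A uniform distribution over W microstates recovers Boltzmann's k ln W,
# the case of maximal missing information:
W = 8
uniform = [1.0 / W] * W
print(gibbs_entropy(uniform), math.log(W))  # essentially equal

# Any sharper distribution (we know more about the microstate) has
# lower entropy:
peaked = [0.9] + [0.1 / (W - 1)] * (W - 1)
print(gibbs_entropy(peaked) < gibbs_entropy(uniform))  # True
```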
keiths:
Allan:
There will be dispersal, but the distribution of energy will end up the same in the final state as it was in the beginning. Net dispersal will be zero, in other words, but entropy will have increased, reinforcing the fact that entropy is not a measure of energy dispersal.
Consider what happens over time:
1. The hoops are rotating. Both the kinetic energy of rotation and the heat energy are evenly distributed (this is why Denker specifies hoops and not discs).
2. The hoops are brought together. The kinetic energy is converted to heat via friction, and it’s concentrated (de-dispersed) at the surface of contact.
3. Heat flows away from the hot areas to the cooler areas.
4. Eventually a state of equilibrium is attained in which the temperature is the same throughout both hoops and energy is distributed evenly, just as it was in the initial state.
The distribution of energy is the same in the initial and final states. Net dispersal is therefore zero. Yet entropy has increased.
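The hoop scenario can be put in rough numbers (a sketch with assumed, illustrative values; none of these figures come from Denker). Two identical hoops spin in opposite senses, friction converts all the rotational KE to heat, and the entropy change is ΔS = 2mc·ln(Tf/T0), which for small heating is approximately KE/T0:

```python
import math

# Assumed illustrative values (not from Denker's example):
m = 1.0        # kg, mass of each hoop
r = 0.1        # m, radius
omega = 50.0   # rad/s, equal and opposite angular speeds
T0 = 300.0     # K, initial temperature
c = 450.0      # J/(kg*K), assumed specific heat (steel-like)

I = m * r**2                   # moment of inertia of a thin hoop
KE = 2 * (0.5 * I * omega**2)  # total rotational KE (momenta cancel)

# All of that KE ends up as heat shared by both hoops:
dT = KE / (2 * m * c)
Tf = T0 + dT

# Entropy change of both hoops warming from T0 to Tf:
dS = 2 * m * c * math.log(Tf / T0)
print(dS)  # positive, even though the final energy distribution
           # matches the initial one
```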
My latest comment ended up in the spam queue for some reason.
Mods, could you fish it out?
This was one of the best descriptions so far of entropy:
https://beta.spiraxsarco.com/learn-about-steam/steam-engineering-principles-and-heat-transfer/entropy—a-basic-understanding
If we have a hot brick in contact with a cold brick, kinetic energy from the hot brick spreads out into the cold brick. As this spreading takes place, the entropy of the cold brick goes up and the entropy of the hot brick goes down, but paradoxically the combined entropy of the two bricks goes up!
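The two-brick example is easy to put in numbers (a sketch with assumed temperatures, and a heat transfer small enough that the temperatures barely change):

```python
# Assumed illustrative values:
T_hot, T_cold = 400.0, 300.0  # K
Q = 10.0                      # J of heat passing from hot brick to cold brick

dS_hot = -Q / T_hot   # the hot brick's entropy goes down...
dS_cold = Q / T_cold  # ...the cold brick's goes up by more...
dS_total = dS_hot + dS_cold
print(dS_hot, dS_cold, dS_total)  # ...so the combined entropy rises
```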
As general as the Ben-Naim approach is, it is not as intuitive as the simple example of heat spilling (spreading) from a hot body to a cold one. Framing a simple phenomenon like ice melting on a hot pan in terms of our ignorance borders on KairosFocus word salad!
Suppose we have a single brick, with one side initially hot and the other cold, and suppose the brick is somehow isolated from outside heat sources and sinks. Over time the heat spreads from the hot spot to the cold spot. A quantitative measure of the amount of spreading over time is the entropy change of the system. SIMPLE!
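That “amount of spreading” can be watched in a toy simulation (a sketch with made-up parameters: the brick is modeled as ten segments of unit heat capacity exchanging heat with their neighbors, and per-segment entropy is taken as C·ln T):

```python
import math

# Assumed toy parameters:
N = 10       # segments
C = 1.0      # heat capacity per segment (arbitrary units)
alpha = 0.1  # conduction rate per step (explicit scheme, stable for alpha <= 0.5)
T = [400.0] * (N // 2) + [300.0] * (N // 2)  # hot half, cold half

def entropy(temps):
    # S = sum over segments of C * ln(T), up to an additive constant
    return sum(C * math.log(t) for t in temps)

S0 = entropy(T)
for _ in range(1000):
    # compute all neighbor-to-neighbor flows from the current snapshot...
    flows = [alpha * (T[i] - T[i + 1]) for i in range(N - 1)]
    # ...then apply them
    for i, q in enumerate(flows):
        T[i] -= q / C
        T[i + 1] += q / C

print(entropy(T) - S0)  # positive: entropy rose as the heat spread out
```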
Explaining this in terms of order/disorder or ignorance isn’t all that helpful: the former is wrong, and the latter, though arguably right, isn’t helpful pedagogically.
Sal,
Not sure why you find that paradoxical. d/dt(x + y) = dx/dt + dy/dt. If dx/dt is positive and dy/dt is negative, then the sign of dx/dt + dy/dt depends on their relative magnitudes.
Intuitively we tend to think
delta x = -delta y
For entropy that is not true because even though
delta x_energy = -delta y_energy
for entropy, the energy must be scaled by temperature, hence
this equality is FALSE
delta x_entropy = -delta y_entropy
Sal:
Fixed that for you.
Sal,
If the wrongness of “entropy as disorder” means we shouldn’t teach it, then why should we teach “entropy as energy dispersal”, which is also wrong?
That is essentially the Clausius formulation of entropy.
Sal,
No, because energy divided by temperature does not give you the right units for energy dispersal.
Sure it does, if you don’t do the calculation the strawman way that you’re doing it.
Sal,
Heh. Okay, show us my “strawman way”, and then show us the right way. Complete with units in both cases.
I’ve done that several times, starting with melting ice cubes, something you couldn’t do with your method of measuring ignorance.
Major goalpost shift noted.
You’re supposed to show that my way of applying Clausius is a “strawman way” that leads to the wrong units for energy dispersal, while your way gives the right units.
Good luck to you in attempting to show that energy divided by temperature is the same thing as energy divided by volume.
Just to hammer the point home, Clausius gives the correct units for entropy — joules per kelvin — while energy dispersal does not.
Oh, gee Keiths, what do you see at about 6:25 in this video:
The heat energy dispersed INTO the ice, divided by temperature.
You’re just redefining, strawman-style, the intended meaning of the advocates of the dispersal description. Hey, but if it makes you happy to distort other people’s intended meaning of their own words, then it makes you happy.
Sal,
Of course heat is dispersed into the ice. That’s why it melts.
That doesn’t mean that entropy is a measure of energy dispersal.
The units of energy dispersal are joules per cubic meter. The units of entropy are joules per kelvin. It doesn’t take a genius to see that J/m^3 is not the same as J/K.
But I did.
You never did explain where you got the heat capacity of copper from, nor how you would explain the heat flow in YOUR answer to Dr. Mike’s pop quiz.
You have a history of running away from questions that you cannot answer.
Cue bluster and harrumphing.
Sal,
Is this going to be a recapitulation of the earlier thread, in which you started out deriding the missing information interpretation as “deepity woo”? By the end of that thread, you were admitting that it was correct.
From that thread:
Mung wryly commented:
Have you figured out how to draw a straight line yet?
Speaking of Dr. Mike:
But we have Dr. Mike’s postings at panda’s thumb.
http://pandasthumb.org/archives/2013/09/lifes-ratchet-b.html
Spreading? As in dispersal!
That sounds a lot like Lambert:
Hahaha. Take that Keiths and DNA_jock!
Sure I can answer. I was wrong on my entropy calculations with Dr. Mike, but then I learned how to do that one right.
Heat capacity changes with temperature, it is not constant.
I learn, and I admit mistakes, which is more than I can say for you, BECAUSE you can’t seem to draw a straight line.
I was able to draw one (that blue one), why can’t you? I mean it sort of shatters your silly claims about the graph YOU gave me to look at.
Of course we have the DNA_Jock HOWLER of all time:
I mean, what does DNA_Jock think this device measures? Uh, like dQ, LOL!
https://www.malvernpanalytical.com/en/products/measurement-type/microcalorimetry
Gee, I wonder why we would want to measure dQ with such a device at a given temperature T if as DNA_jock says:
As I predicted, bluster and harrumphing.
I had forgotten how completely and utterly keiths and I wiped the floor with Sal in that Granville Sewell thread.
Here’s another question that Sal ran away from.
Thanks for the trip down memory lane, Sal.
We’ve been over this before, Sal.
“Dr. Mike” isn’t infallible, and neither is “Dr. Frank” (Lambert).
You’re welcome to ask either of them a) what the units of energy dispersal are, and b) what the units of entropy are. See if the light dawns, on them or on you.
So says the guy who claims:
Howler. Do you feel comfortable saying crap like that before science students?
But whether Keiths or you wipe the floor with me, I appreciate the interaction.
It’s a chance to get editorial corrections for my work. So you get to lambast me.
I get correction, review, and improvement, and you get something that apparently makes you happy. See, what a nice exchange.
Thanks, DNA_Jock.
Btw, thanks for interacting with me on Zinc Fingers, you helped me clarify my choice of words and content. My presentation was a resounding success…
So thanks…
Sal has a tell.
😉
You stand by this howler, DNA_Jock?
Alrighty, well thank you and DNA_jock for the conversation.
Sal, to DNA_Jock:
So you claim, but after 2 1/2 years, you still can’t bring yourself to admit that entropy isn’t a measure of energy dispersal.
Yes, I do. Let’s restore the context, which you have always omitted in an attempt to distort my meaning: I was discussing the usefulness (to scientists) of different definitions of entropy, not how one might measure ΔS…
Here’s the post:
My, you are a naughty boy, Sal.
Amusingly, Sal then tried a literature bluff, citing Karplus and Janin, 1999, and shot himself in the foot.
I just spotted another unanswered question:
Something of a show-stopper for a dispersalist.
Well thanks for responding.
Do you think dQ (change of energy) is rarely informative?
How about T (temperature)? Do you think T is rarely informative?
Or is it just dQ/T that is rarely informative?
Here is an example of physicists studying entropy by using calorimetry (as in measuring dQ).
https://phys.org/news/2017-10-entropy-metallic-glasses.html
But according to DNA_Jock,
🙂