He’s baaack

Granville Sewell has posted another Second Law screed over at ENV. The guy just won’t quit. He scratches and scratches, but the itch never goes away.

The article is mostly a rehash of Sewell’s confused fulminations against the compensation argument, with an added dash of tornados running backwards.

Then there’s this gem:

But while Behe and his critics are engaged in a lively debate as to whether or not the Darwinian scheme for violating the second law has ever been observed to result in any non-trivial increase in genetic information…

Ah, yes. Our scheme for violating the second law. It might have succeeded if it weren’t for that meddling Sewell.

153 thoughts on “He’s baaack”

  1. Sal,

    I believe entropy is a dimensionless UNIT.

    I think you’re trying to say that entropy is a dimensionless QUANTITY. Which is true if it’s expressed using the right system of units. In the SI system, however, entropy is expressed in joules per kelvin, and “joules per kelvin” is not dimensionless.

    And as explained above, it wouldn’t help you even if “joules per kelvin” were dimensionless, because Lambert’s definitions don’t lead to dimensionless quantities.

    Either way, Lambert’s definitions don’t work. The ‘missing information’ definition is the correct one.

  2. keiths:
    Allan,

    How many systems you have is a function of where you draw the boundar(ies), and where you draw the boundaries is arbitrary. The second law works no matter where the boundaries are placed, and it’s certainly possible to draw a boundary around both spinning hoops so that they form a single system.

    It’s not arbitrary as far as the systems are concerned. If there is no path for energy flow, they are separate systems. Energy flow (bidirectional and balanced in the special case presented) only occurs on contact.

    But the entropy doesn’t increase when the gases are identical. It only increases when they are distinguishable.

    Sure. But all we have presented here is the special case where the systems are in such a state that net change does not occur – when they are already in ‘potential equilibrium’. If the 2 discs had slightly different masses, more energy would flow in one direction than the other. If the gases were at a different temperature or pressure, or chemical composition, likewise. But we haven’t suddenly introduced a ‘dispersal’ situation, we have introduced an asymmetry into the exactly balanced dispersal that pertains when the masses are the same, or the gases identical in all properties.

    The ‘exceptions’ seem to me to be situations where equilibrium is pre-specified by the conditions, not truly exceptions at all. It is still possible to see energy becoming delocalised – even if, in the balanced case, there is an exact compensation from the other component of the now-joined system. If one could label the molecules according to their chamber of origin, one would definitely see their energies ‘spreading out’.

  3. keiths:

    How many systems you have is a function of where you draw the boundar(ies), and where you draw the boundaries is arbitrary.

    Allan:

    It’s not arbitrary as far as the systems are concerned. If there is no path for energy flow, they are separate systems.

    Don’t confuse physical barriers with system boundaries drawn for purposes of analysis. They may or may not coincide. Either way, the second law holds.

    Allan:

    If the 2 discs had slightly different masses, more energy would flow in one direction than the other.

    Actually, no. Suppose that one hoop is three times as tall (and three times as massive) as the other, but that the dimensions are otherwise equal. In both the initial and final states, the taller hoop will have three times the energy of the smaller. Any energy flows between the two hoops will cancel out.

    It is still possible to see energy becoming delocalised – even if, in the balanced case, there is an exact compensation from the other component of the now-joined system. If one could label the molecules according to their chamber of origin, one would definitely see their energies ‘spreading out’.

    But that’s true even in the case where the gases are identical. Entropy doesn’t increase in that scenario, but the energy from each chamber disperses into the other. Therefore entropy cannot be a measure of energy dispersal.

  4. Keiths:

    “joules per kelvin” is not dimensionless.

    I believe it is.

    Something of dimension one is a dimensionless quantity, a pure scalar.

    1 nat = 1

    https://en.wikipedia.org/wiki/Boltzmann_constant
    1 = 1 nat = 1.380649×10^−23 J/K

    1 = 1.380649×10^−23 J/K

    1/ 1.380649×10^−23 = J/K

    ergo, J/K is a dimensionless UNIT

    This suggests

    K = Joules * 1.380649×10^−23

    First recall,

    1 Joule = 6.242×10^18 electronvolts

    In fact, for many applications, temperature is specified in eV (a small fraction of a Joule).

    https://en.wikipedia.org/wiki/Electron_temperature

    The SI unit of temperature is the kelvin (K), but using the above relation the electron temperature is often expressed in terms of the energy unit electronvolt (eV). Each kelvin (1 K) corresponds to 8.6173324(78)×10^−5 eV; this factor is the ratio of the Boltzmann constant to the elementary charge. Each eV is equivalent to 11,605 kelvins.

    That’s sort of the way I remember it when I studied Plasma Physics.
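    The kelvin-to-electronvolt factor quoted from Wikipedia is easy to check numerically. A minimal Python sketch (using the exact 2019 SI values of the Boltzmann constant and the elementary charge; variable names are illustrative):

```python
# Kelvin <-> electronvolt conversion: the factor is the ratio of the
# Boltzmann constant to the elementary charge (both exact in the 2019 SI).
k_B = 1.380649e-23   # Boltzmann constant, J/K
e = 1.602176634e-19  # elementary charge, C (1 eV = e joules)

eV_per_K = k_B / e       # thermal energy (in eV) corresponding to 1 K
K_per_eV = 1 / eV_per_K  # temperature corresponding to 1 eV

print(f"1 K  = {eV_per_K:.6e} eV")  # ~8.617333e-05 eV
print(f"1 eV = {K_per_eV:.1f} K")   # ~11604.5 K
```

    Both numbers agree with the quoted values: about 8.6173×10^−5 eV per kelvin, and roughly 11,605 kelvins per eV.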

  5. Sal,

    There was a discussion as to why we don’t measure temperature in Joules:

    The reason we don’t use energy-based units to express temperature is a historical accident, as I (and Arieh Ben-Naim) explained here.

  6. So, recalling what Dr. Mike said, and echoing an earlier comment that

    K = Joules/bit

    Dr. Mike said

    K = Energy per degree of freedom

    A degree of freedom is bits.

    But bits are dimensionless.

    So

    if K = Joules/bits / 1.380649×10^−23 = Joules/Degrees of Freedom

    J/K = 1/1.380649×10^−23 natural bits, which is still dimensionless

  7. keiths:

    “joules per kelvin” is not dimensionless.

    Sal:

    I believe it is.

    It isn’t, but it wouldn’t help you even if it were. Lambert’s dispersal-based definitions don’t produce a quantity with units of joules per kelvin, and they don’t produce a dimensionless quantity either.

    Entropy is not a measure of energy dispersal. The units are wrong.
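    The dimensional claim can be made mechanical. A toy sketch (not anyone’s actual method; units are tracked as exponents of the SI base dimensions):

```python
# Track a unit as a dict of SI base-dimension exponents (kg, m, s, K).
def divide(a, b):
    """Dimensions of the quotient a/b, with zero exponents dropped."""
    dims = {d: a.get(d, 0) - b.get(d, 0) for d in set(a) | set(b)}
    return {d: exp for d, exp in dims.items() if exp != 0}

joule = {"kg": 1, "m": 2, "s": -2}  # 1 J = 1 kg.m^2/s^2
kelvin = {"K": 1}

J_per_K = divide(joule, kelvin)
print(J_per_K)        # four nonzero exponents: kg, m, s, K (order may vary)
print(J_per_K == {})  # False: J/K is not dimensionless
```

    A dimensionless quantity would come out as the empty dict; joules per kelvin does not.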

  8. keiths:
    keiths:

    Sal:

    It isn’t, but it wouldn’t help you even if it were. Lambert’s dispersal-based definitions don’t produce a quantity with units of joules per kelvin, and they don’t produce a dimensionless quantity either.

    Entropy is not a measure of energy dispersal. The units are wrong.

    1 Joule/Kelvin = 1 / (1.381×10^−23) / ln(2) Shannon bits =

    1.045×10^23 Shannon bits
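    That conversion factor checks out numerically. A quick sketch (in the information-theoretic reading, one bit corresponds to k_B ln 2 joules per kelvin):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

# One bit corresponds to k_B * ln(2) J/K,
# so 1 J/K corresponds to 1/(k_B * ln 2) bits.
bits_per_JK = 1 / (k_B * math.log(2))
print(f"1 J/K = {bits_per_JK:.4e} bits")  # ~1.0449e+23 bits
```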

    One of the most respected commenters here, Gordon Davisson, agreed:
    https://uncommondescent.com/physics/gordon-davissons-talk-origins-post-of-the-month-october-2000/

    You redefined Lambert’s notions of dispersal, therefore your criticism is an illegitimate strawman misrepresentation.

    In any case, here is the wiki entry on this:

    https://en.wikipedia.org/wiki/Entropy_(energy_dispersal)

    The concept of “energy dispersal” as a description of entropy appeared in Lord Kelvin’s 1852 article “On a Universal Tendency in Nature to the Dissipation of Mechanical Energy.”[14] He distinguished between two types or “stores” of mechanical energy: “statical” and “dynamical.” He discussed how these two types of energy can change from one form to the other during a thermodynamic transformation. When heat is created by any irreversible process (such as friction), or when heat is diffused by conduction, mechanical energy is dissipated, and it is impossible to restore the initial state.[15][16]

    The energy dispersal concept was advocated by Edward Armand Guggenheim in 1949, using the word ‘spread’.[1] In the mid-1950s, with the development of quantum theory, researchers began speaking about entropy changes in terms of the mixing or “spreading” of the total energy of each constituent of a system over its particular quantized energy levels, such as by the reactants and products of a chemical reaction.[17]

    In 1984, the Oxford physical chemist Peter Atkins, in a book The Second Law, written for laypersons, presented a nonmathematical interpretation of what he called the “infinitely incomprehensible entropy” in simple terms, describing the Second Law of thermodynamics as “energy tends to disperse.”

  9. You redefined Lambert’s notions of dispersal, therefore your criticism is an illegitimate strawman misrepresentation.

    No, I didn’t. And if Lambert were right, you’d be able to show that his definitions lead to the right units (or lack thereof).

    They don’t.

    Units matter in science, Sal.

  10. keiths:
    keiths:

    Allan:

    Don’t confuse physical barriers with system boundaries drawn for purposes of analysis. They may or may not coincide. Either way, the second law holds.

    I’m more interested in the mechanistic aspect than the analytical. What really happens is that energy tends to become delocalised – if it can. Don’t forget that Lambert’s main concern is pedagogical, and his primary target the notion of ‘disorder’. As he says – and he’s right – that can lead to confusion in the mind of the student, a confusion that recognising the general tendency to delocalisation of energy helps dispel. And careful analysis shows that dispersal still takes place, it’s just that some systems can start at a state of macroscopic equilibrium.

    Actually, no. Suppose that one hoop is three times as tall (and three times as massive) as the other, but that the dimensions are otherwise equal. In both the initial and final states, the taller hoop will have three times the energy of the smaller. Any energy flows between the two hoops will cancel out.

    You don’t think there will be any asymmetry? Hmmm. I disagree. The larger hoop will act as a sink; the smaller will get hotter. This heat will flow.

    But that’s true even in the case where the gases are identical.

    That was actually the case I was considering. You don’t need to label molecules when the gases are different.

    Entropy doesn’t increase in that scenario, but the energy from each chamber disperses into the other. Therefore entropy cannot be a measure of energy dispersal.

    Net dispersal, hence entropy change, can be zero. The flow from A into B is exactly balanced by the reverse. Energy dispersal still occurs, it just has no net effect due to that symmetry.

    Consider a steel plate separating the discs. First, you just apply one disc. Energy disperses non-controversially – the plate heats up. Then apply both discs to opposite sides. Flow is balanced, but the plate heats up more, and some of that additional heat finds its way back into both discs. They heat each other, via the plate. Now you do the experiment so often that you wear grooves in the plates. Eventually only a sliver remains. Then you break through, such that the discs make direct contact, and you’ve got the Denker scenario. Does ‘bidirectional dispersal’ stop at that point?

  11. Allan,

    Actually, one of the best pragmatic descriptions of entropy is as the QUALITY of kinetic energy. That may seem strange, since kinetic energy is merely

    KE = (1/2)mv^2

    BUT, consider an ideal monatomic gas contained in a 1-liter volume at a temperature of 300 kelvin. It has a certain net kinetic energy from all the molecules (the proper term is internal energy).

    Well, if we have a valve that allows it to expand into a 2-liter volume at the same temperature (isothermally) and without adding or subtracting heat (adiabatically), it still has the same total kinetic energy (the proper term is internal energy, although colloquially some would call it heat energy).

    Now it turns out that the same gas confined to the 1-liter volume can do more work, because it is at a higher pressure, per the equation:

    PV = nRT

    So, paradoxically, the same gas with the same total internal kinetic energy doesn’t have the same practical potential to do work, simply because it is more spread out (has higher entropy).

    Entropy then is an index to the quality of the kinetic energy. I would DEFINE entropy that way, but that is one thing entropy helps indicate.

    When two chemists told me that independently, it made sense why Clausius concocted the notion, and he did so WITHOUT statistical mechanics, merely for practical purposes in analyzing steam engines. It was Gibbs, Boltzmann, and Planck who dared to connect Newtonian mechanics and the idea of atoms/molecules with Clausius (who may have actually held a caloric view of heat). Amazingly, entropy didn’t need major revision with the advent of quantum mechanics, and neither did Newtonian statistical mechanics. If anything, as one of my professors said, quantum mechanics made it easier to understand, with quantized energies.
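    Sal’s 1-liter/2-liter example can be put in numbers. A sketch for one mole of ideal monatomic gas (the work forgone by the free expansion equals T·ΔS):

```python
import math

R = 8.314462618    # gas constant, J/(mol.K)
n, T = 1.0, 300.0  # moles, kelvin
V1, V2 = 1.0, 2.0  # liters; only the ratio matters below

# Internal energy of an ideal monatomic gas depends only on T,
# so it is the same before and after the expansion:
U = 1.5 * n * R * T

# Maximum (reversible, isothermal) work extractable in expanding 1 L -> 2 L:
W_max = n * R * T * math.log(V2 / V1)

# Entropy increase of the *free* (unresisted) expansion:
dS = n * R * math.log(V2 / V1)

print(f"U = {U:.0f} J in both states")
print(f"work forgone by free expansion = {W_max:.0f} J = T*dS = {T * dS:.0f} J")
```

    Same internal energy either way, but once the gas has spread out for free, roughly 1.7 kJ of potential work is gone.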

  12. ERRATA!!!!

    I meant to say:

    I would NOT define entropy that way, but that is one thing entropy helps indicate.

    Ugh!

    Apologies to the readers…

  13. Allan,

    More later, but for now let me respond to this:

    Net dispersal, hence entropy change, can be zero. The flow from A into B is exactly balanced by the reverse. Energy dispersal still occurs, it just has no net effect due to that symmetry.

    You’re overlooking something: If the initial contents of A and B are identical, then entropy will not change. But if they are different — say, A contains helium and B contains neon, both at the same temperature and pressure — then there will be an entropy change, despite the fact that there is no net dispersal of energy.

    Entropy increases, but there is no net dispersal of energy. Therefore, entropy cannot be a measure of energy dispersal.
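    The helium/neon case has a standard textbook number attached. A sketch for one mole of each gas at the same temperature and pressure:

```python
import math

R = 8.314462618  # gas constant, J/(mol.K)
n = 1.0          # moles of He in chamber A, and of Ne in chamber B

# Remove the partition: each distinguishable gas doubles its volume.
dS_He = n * R * math.log(2)
dS_Ne = n * R * math.log(2)
dS_mix = dS_He + dS_Ne
print(f"entropy of mixing = {dS_mix:.2f} J/K")  # ~11.53 J/K

# Same gas on both sides: nothing changes macroscopically, dS = 0.
```

    Entropy rises by about 11.5 J/K even though no net energy moves between the chambers.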

  14. Allan,

    Don’t forget that Lambert’s main concern is pedagogical, and his primary target the notion of ‘disorder’.

    Lambert’s mistake is to root out one myth — entropy as a measure of disorder — only to replace it with another myth — entropy as a measure of energy dispersal. Far better to replace myth with truth. Entropy is a measure of missing information.

    keiths:

    Suppose that one hoop is three times as tall (and three times as massive) as the other, but that the dimensions are otherwise equal.In both the initial and final states, the taller hoop will have three times the energy of the smaller. Any energy flows between the two hoops will cancel out.

    Allan:

    You don’t think there will be any asymmetry? Hmmm. I disagree.

    There will be dispersal, but the distribution of energy will end up the same in the final state as it was in the beginning. Net dispersal will be zero, in other words, but entropy will have increased, reinforcing the fact that entropy is not a measure of energy dispersal.

    Consider what happens over time:

    1. The hoops are rotating. Both the kinetic energy of rotation and the heat energy are evenly distributed (this is why Denker specifies hoops and not discs).

    2. The hoops are brought together. The kinetic energy is converted to heat via friction, and it’s concentrated (de-dispersed) at the surface of contact.

    3. Heat flows away from the hot areas to the cooler areas.

    4. Eventually a state of equilibrium is attained in which the temperature is the same throughout both hoops and energy is distributed evenly, just as it was in the initial state.

    The distribution of energy is the same in the initial and final states. Net dispersal is therefore zero. Yet entropy has increased.
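    The four steps above can be sketched with toy numbers (assumed, not from Denker), treating the hoops’ combined heat capacity as constant:

```python
import math

Q = 1000.0  # J: rotational kinetic energy converted to heat by friction (assumed)
C = 500.0   # J/K: combined heat capacity of both hoops (assumed)
T0 = 300.0  # K: initial common temperature (assumed)

Tf = T0 + Q / C             # final uniform temperature
dS = C * math.log(Tf / T0)  # entropy change of the hoops

# Energy ends up distributed just as evenly as it started,
# yet the entropy has gone up:
print(f"Tf = {Tf} K, dS = {dS:.2f} J/K")  # dS > 0
```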

  15. My latest comment ended up in the spam queue for some reason.

    Mods, could you fish it out?

  16. This was one of the best descriptions so far of entropy:

    https://beta.spiraxsarco.com/learn-about-steam/steam-engineering-principles-and-heat-transfer/entropy—a-basic-understanding

    What is entropy?
    In some ways, it is easier to say what it is not! It is not a physical property of steam like pressure or temperature or mass. A sensor cannot detect it, and it does not show on a gauge. Rather, it must be calculated from things that can be measured. Entropy values can then be listed and used in calculations; in particular, calculations to do with steam flow, and the production of power using turbines or reciprocating engines.

    It is, in some ways, a measure of the lack of quality or availability of energy, and of how energy tends always to spread out from a high temperature source to a wider area at a lower temperature level. This compulsion to spread out has led some observers to label entropy as ‘time’s arrow’. If the entropy of a system is calculated at two different conditions, then the condition at which the entropy is greater occurs at a later time. The increase of entropy in the overall system always takes place in the same direction as time flows.

    That may be of some philosophical interest, but does not help very much in the calculation of actual values.

    If we have a hot brick in contact with a cold brick, kinetic energy from the hot brick spreads out from it into the cold brick. As this spreading takes place, the entropy of the cold brick goes up and that of the hot brick goes down, but paradoxically the combined entropy of the two bricks goes up!

    As general as the Ben-Naim approach is, it is not as intuitive as the simple example of heat spilling (spreading) from a hot body to a cold one. Framing a simple phenomenon like ice melting on a hot pan in terms of our ignorance borders on KairosFocus word salad!

    Suppose we have a single brick, one side initially hot and the other cold, with the brick somehow isolated from outside heat sources and sinks. Over time the heat spreads out from the hot spot to the cold spot. A quantitative measure of the amount of spreading over time is the entropy change of the system. SIMPLE!

    Explaining this in terms of order/disorder or ignorance isn’t all that helpful, the former is wrong, and the latter, though arguably right, isn’t helpful pedagogically.

  18. Sal,

    As this spreading takes place, the entropy of the cold brick goes up and that of the hot brick goes down, but paradoxically the combined entropy of the two bricks goes up!

    Not sure why you find that paradoxical. d/dt(x + y) = dx/dt + dy/dt. If dx/dt is positive and dy/dt is negative, then the sign of dx/dt + dy/dt depends on their relative magnitudes.

  19. , then the sign of dx/dt + dy/dt depends on their relative magnitudes

    Intuitively we tend to think

    delta x = -delta y

    For entropy that is not true because even though

    delta x_energy = -delta y_energy

    for entropy, the energy must be scaled by temperature, hence

    this equality is FALSE

    delta x_entropy = -delta y_entropy
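    A quick numerical illustration of why the entropy changes don’t cancel (temperatures assumed; heat flow small enough to leave them roughly constant):

```python
Q = 10.0        # J flowing from hot brick to cold brick (assumed small)
T_hot = 400.0   # K (assumed)
T_cold = 300.0  # K (assumed)

dS_hot = -Q / T_hot   # entropy lost by the hot brick
dS_cold = Q / T_cold  # entropy gained by the cold brick
dS_total = dS_hot + dS_cold

# Energy changes cancel exactly; entropy changes do not,
# because each dQ is scaled by a different temperature.
print(f"dS_hot = {dS_hot:.4f} J/K, dS_cold = {dS_cold:.4f} J/K, "
      f"total = {dS_total:.4f} J/K > 0")
```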

  20. Sal:

    Intuitively we I tend to think

    delta x = -delta y

    Fixed that for you.

  21. Sal,

    Explaining this in terms of order/disorder or ignorance isn’t all that helpful, the former is wrong, and the latter, though arguably right, isn’t helpful pedagogically.

    If the wrongness of “entropy as disorder” means we shouldn’t teach it, then why should we teach “entropy as energy dispersal”, which is also wrong?

  22. “entropy as energy dispersal”, which is also wrong?

    That is essentially the Clausius formulation of entropy.

  23. Sal,

    That is essentially the Clausius formulation of entropy.

    No, because energy divided by temperature does not give you the right units for energy dispersal.

  24. keiths: No, because energy divided by temperature does not give you the right units for energy dispersal.

    Sure it does, if you don’t do the calculation the strawman way that you’re doing it.

  25. Sal,

    Sure it does, if you don’t do the calculation the strawman way that you’re doing it.

    Heh. Okay, show us my “strawman way”, and then show us the right way. Complete with units in both cases.

  26. I’ve done that several times, starting with melting ice cubes, something you couldn’t do with your method of measuring ignorance.

  27. Major goalpost shift noted.

    You’re supposed to show that my way of applying Clausius is a “strawman way” that leads to the wrong units for energy dispersal, while your way gives the right units.

    Good luck to you in attempting to show that energy divided by temperature is the same thing as energy divided by volume.

  28. Just to hammer the point home, Clausius gives the correct units for entropy — joules per kelvin — while energy dispersal does not.

  29. No, because energy divided by temperature does not give you the right units for energy dispersal.

    Oh, gee Keiths, what do you see at about 6:25 in this video:

    https://youtu.be/UqSTPHEojqk

    The heat energy dispersed INTO the ice, divided by temperature.

    You’re just redefining, strawman-style, what advocates of the dispersal description intend by their own words. Hey, but if it makes you happy to distort other people’s intended meaning, then it makes you happy.

  30. Sal,

    The heat energy dispersed INTO the ice, divided by temperature.

    Of course heat is dispersed into the ice. That’s why it melts.

    That doesn’t mean that entropy is a measure of energy dispersal.

    The units of energy dispersal are joules per cubic meter. The units of entropy are joules per kelvin. It doesn’t take a genius to see that J/m^3 is not the same as J/K.
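    For reference, the melting-ice calculation both sides keep invoking is just Clausius applied at constant temperature. A sketch (the ice mass is assumed for illustration):

```python
m = 0.010        # kg of ice (assumed for illustration)
L_f = 334000.0   # J/kg, latent heat of fusion of water
T_melt = 273.15  # K, melting point at 1 atm

Q = m * L_f      # heat absorbed while the ice melts
dS = Q / T_melt  # Clausius: temperature is constant throughout melting
print(f"Q = {Q:.0f} J, dS = {dS:.2f} J/K")
```

    The arithmetic itself isn’t in dispute; the argument is over whether the resulting J/K quantity measures “dispersal” or missing information.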

  31. stcordova: I’ve done that several times, staring with melting ice cubes, something you couldn’t do with your method of measuring ignorance.

    But I did.
    You never did explain where you got the heat capacity of copper from, nor how you would explain the heat flow in YOUR answer to Dr. Mike’s pop quiz.
    You have a history of running away from questions that you cannot answer.
    Cue bluster and harrumphing.

  32. Sal,

    Is this going to be a recapitulation of the earlier thread, in which you started out deriding the missing information interpretation as “deepity woo”? By the end of that thread, you were admitting that it was correct.

  33. From that thread:

    Sal,

    However, though using the term “information” is not incorrect, I’ve stated my preference to avoid that term for thermodynamics because of the confusion it introduces.

    Interesting. Earlier you were referring to it as “deepity woo”. Now you admit that it’s correct. That’s quite a climb-down.

    Also, keep in mind that whether it confuses you is irrelevant. It’s correct, and the energy dispersal view is not.

    Mung wryly commented:

    Consider it a climb up. Next Salvador will be telling us this has been his view for years.

  34. DNA_Jock: But I did.
    You never did explain where you got the heat capacity of copper from, nor how you would explain the heat flow in YOUR answer to Dr. Mike’s pop quiz.
    You have a history of running away from questions that you cannot answer.
    Cue bluster and harrumphing.

    Have you figured out how to draw a straight line yet?

    http://theskepticalzone.com/wp/in-slight-defense-of-granville-sewell-a-lehninger-larry-moran-l-boltzmann/comment-page-33/#comment-149377

  35. Speaking of Dr. Mike:

    http://theskepticalzone.com/wp/in-slight-defense-of-granville-sewell-a-lehninger-larry-moran-l-boltzmann/comment-page-4/#comment-145390

    But we have Dr. Mike’s postings at panda’s thumb.

    http://pandasthumb.org/archives/2013/09/lifes-ratchet-b.html

    The only general rules are conservation of energy (first law), the spreading around of energy (second law), energy flows from higher temperatures to lower temperatures (basically the “zeroeth” law), and entropy goes to zero when all energy is drained out of the system (third law).

    Spreading? As in dispersal!

    That sounds a lot like Lambert:

    Entropy change measures the dispersal of energy: how much energy is spread out in a particular process, or how widely spread out it becomes (at a specific temperature).

    http://entropysimple.oxy.edu/content.htm

    Hahaha. Take that Keiths and DNA_jock!

  36. DNA_Jock: You have a history of running away from questions that you cannot answer.

    Sure I can answer. I was wrong on my entropy calculations with Dr. Mike, but then I learned how to do that one right.

    Heat capacity changes with temperature, it is not constant.

    I learn, and I admit mistakes, which is more than I can say for you, BECAUSE, you can’t seem to draw a straight line.

    I was able to draw one (that blue one), why can’t you? I mean it sort of shatters your silly claims about the graph YOU gave me to look at.

  37. stcordova:
    Of course we have the DNA_Jock HOWLER of all time:

    http://theskepticalzone.com/wp/in-slight-defense-of-granville-sewell-a-lehninger-larry-moran-l-boltzmann/comment-page-26/#comment-148207

    I mean, what does DNA_Jock think this device measures? Uh, like dQ, LOL!

    https://www.malvernpanalytical.com/en/products/measurement-type/microcalorimetry

    Microcalorimetry is used to study reactions involving biomolecules, including interactions between molecules and conformational changes such as protein folding. Applications range from confirming intended binding targets in small molecule drug discovery to the development of stable biotherapeutics.

    These processes are often studied using two calorimetric techniques: Isothermal Titration Calorimetry (ITC) and Differential Scanning Calorimetry (DSC).

    Isothermal Titration Calorimetry (ITC)
    Isothermal Titration Calorimetry (ITC) is used to study the binding behavior of biomolecules. It is an essential tool for drug design and the study and regulation of protein interactions.

    ITC directly measures the heat released or absorbed during a biomolecular binding event. This allows accurate determination of binding constants (KD), reaction stoichiometry (n), enthalpy (∆H), and entropy (ΔS).

    Gee, I wonder why we would want to measure dQ with such a device at a given temperature T if, as DNA_jock says:

    dQ/T is rarely informative

  38. As I predicted, bluster and harrumphing.

    I had forgotten how completely and utterly keiths and I wiped the floor with Sal in that Granville Sewell thread.
    Here’s another question that Sal ran away from.
    Thanks for the trip down memory lane, Sal.

  39. But we have Dr. Mike’s postings…

    Spreading? As in dispersal!

    That sounds a lot like Lambert…

    Take that Keiths and DNA_jock!

    We’ve been over this before, Sal.

    “Dr. Mike” isn’t infallible, and neither is “Dr. Frank” (Lambert).

    You’re welcome to ask either of them a) what the units of energy dispersal are, and b) what the units of entropy are. See if the light dawns, on them or on you.

  40. DNA_Jock: dQ/T is rarely informative

    So says the guy who claims:

    dQ/T is rarely informative

    Howler. Do you feel comfortable saying crap like that before science students?

    But whether Keiths or you wipe the floor with me, I appreciate the interaction.

    It’s a chance to get editorial corrections for my work. So you get to lambast me.

    I get correction, review, and improvement, and you get something that apparently makes you happy. See, what a nice exchange.

    Thanks, DNA_Jock.

    Btw, thanks for interacting with me on Zinc Fingers, you helped me clarify my choice of words and content. My presentation was a resounding success…

    So thanks…

  41. keiths: We’ve been over this before, Sal.

    “Dr. Mike” isn’t infallible, and neither is “Dr. Frank” (Lambert).

    Alrighty, well thank you and DNA_jock for the conversation.

  42. Sal, to DNA_Jock:

    I learn, and I admit mistakes, which is more than I can say for you…

    So you claim, but after 2 1/2 years, you still can’t bring yourself to admit that entropy isn’t a measure of energy dispersal.

  43. stcordova: You stand by this howler, DNA_Jock?

    dQ/T is rarely informative

    Yes, I do. Let’s restore the context, which you have always omitted in an attempt to distort my meaning: I was discussing the usefulness (to scientists) of different definitions of entropy, not how one might measure ΔS…
    Here’s the post:

    colewd: [ Sal says:] Professionals who work with entropy ( engineers and chemists) use the Claussius [sic] version. The mistake I see in this thread is the insistence there is only one way to define entropy correctly. There is not.

    I am interested to see if others agree with this. The multiple definition issue seems to be the cause of this lively discussion.

    This seems to be yet another one of Sal’s mindless assertions, based on his apparent ignorance of anything beyond High School physics and chemistry concepts. Clausius / Carnot is used to introduce high school students to entropy using dQ/T to discuss the theoretical efficiency of steam engines (well, ideal gases in pistons), and to explain to teenage chemistry students why endothermic reactions may be favored.

    Professional chemical engineers and mechanical engineers are more likely to be concerned with the economic efficiency of a process rather than the thermodynamic efficiency (e.g. ammonia is synthesized at high temperature: the equilibrium position is worse, but you get there much quicker) so activation energy is what drives the choice here. Engineers who design heat exchangers care about rates of heat transfer, not the entropy.
    And once we start talking about scientists, then dQ/T is rarely informative: chemists talk (sometimes somewhat loosely) about ‘freezing out’ degrees of freedom (chelate effect) to make binding less entropically disfavored, or the benefits of a conformationally constrained ligand. That’s counting available microstates all the way; there’s no talk of dQ/T…
    I could ask my physicist and materials scientist buddies what they use, but I somehow doubt that would change anyone’s opinions…

    There is only one way to define entropy correctly: minus rho log(rho).

    My, you are a naughty boy, Sal.
    Amusingly, Sal then tried a literature bluff, citing Karplus and Janin, 1999, and shot himself in the foot.

    I just spotted another unanswered question:

    Why does a mixture have entropy at absolute zero?

    Something of a show-stopper for a dispersalist.
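    The “minus rho log(rho)” definition quoted above is the Gibbs/Shannon form; for a uniform distribution it reduces to Boltzmann’s S = k ln W. A minimal sketch (in units of k):

```python
import math

def gibbs_entropy(p):
    """S = -sum(p_i * ln p_i), in units of Boltzmann's constant k."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Uniform distribution over W microstates recovers S = ln W (times k):
W = 8
uniform = [1.0 / W] * W
print(gibbs_entropy(uniform))  # ln 8, about 2.0794
print(math.log(W))
```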

  44. DNA_Jock stands by this statement:

    dQ/T is rarely informative

    Well thanks for responding.

  45. DNA_Jock

    Yes, I do.

    Do you think dQ (change of energy) is rarely informative?

    How about T (temperature)? Do you think T is rarely informative?

    or just

    dQ/T is rarely informative.

  46. Here is an example of physicists studying entropy by using calorimetry (as in measuring dQ).

    https://phys.org/news/2017-10-entropy-metallic-glasses.html

    “By accurately measuring the amount of this heat that comes from atoms’ configurations and the amount that comes from atoms’ vibrations, we were able to put this controversy to rest for metallic glasses,” Smith says.

    The team first evaluated the vibrational entropy of metals in both glass and crystal forms. To do so, they used intense neutron beams at Oak Ridge National Laboratory in Tennessee to bombard each material, ringing each sample like a bell and measuring how it responded. They also measured the total entropy of the glass and crystal using a technique called calorimetry.

    But according to DNA_Jock,

    dQ/T is rarely informative

    🙂
