2LOT and ID entropy calculations (editorial corrections welcome)

Some may have wondered why I (a creationist) have taken the side of the ID-haters with regards to the 2nd law. It is because I am concerned that college science students in the disciplines of physics, chemistry, and engineering be able to understand the 2nd law. The calculations I’ve provided are textbook calculations, as would be expected of these students.

The fundamental problem is that 2LOT is concerned with energy (or position/momentum) microstates, whereas IDists are concerned with “design space” microstates. Both numbers of microstates can be expressed in bits, but that does not mean we are dealing with the same microstates. I’m providing sample calculations to prove the point that it is disastrous for IDists to invoke textbook 2LOT, for the simple reason that 2LOT is concerned with energy (or position/momentum) microstates, which have little or nothing to do with the “design space” microstates of interest to ID.

I’m going through textbook thermodynamics here. If we have 500 fair copper pennies, how many “design space” microstates are there? Standard ID answer:

2^500

since there are 500 coins and each coin has 2 states, a system of 500 coins has 2^500 possible symbolic configurational states, or microstates. This can also be expressed in bits:

I_design_space = -log2(1/(2^500)) = 500 bits

What is the design space entropy?

I_design_space = S_design_space = 500 bits
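For anyone who wants to check the arithmetic, here is a quick Python sketch of the design-space count (the 500 coins and 2 states per coin are as stated above):

    import math

    n_coins = 500
    microstates = 2 ** n_coins            # 2 states per coin, 500 coins
    bits = -math.log2(1 / microstates)    # surprisal of one specific configuration
    print(bits)                           # 500.0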

IN CONTRAST, how many thermodynamic energy microstates are there in this system of 500 pure copper pennies at standard “room” temperature (298 Kelvin)? The textbook-style calculation is as follows:

Mass of a copper penny: 3.11 grams.
Molar mass of copper: 63.546 grams/mol.
Standard molar entropy of copper: 33.2 J/K/mol.

Thermodynamic entropy of 500 copper pennies is therefore:

S_thermodynamic = 500 * 33.2 Joules/Kelvin/mol * 3.11 grams / 63.546 grams/mol = 812.4 J/K

The thermodynamic entropy in J/K can be converted to bits by simply dividing by Boltzmann’s constant and then converting the natural-log measure to a log-base-2 measure.

Boltzmann’s constant is 1.381 x 10^-23 J/K.
The natural-log to log-base-2 conversion factor is ln(2) = 0.693147.

Thermodynamic entropy in bits is computed as follows:

S_thermodynamic = I_thermodynamic = 812.4 J/K / (1.381 x 10^-23 J/K) / 0.693147 = 8.49 x 10^25 bits

The number of thermodynamic microstates is found by raising 2 to the power of I_thermodynamic:

2^(8.49 x 10^25)

which is a GIGANTIC number.
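For anyone checking along, here is a short Python sketch of the whole calculation, using the constants listed above:

    import math

    MOLAR_ENTROPY_CU = 33.2   # J/K/mol, standard molar entropy of copper
    MOLAR_MASS_CU = 63.546    # grams/mol
    PENNY_MASS = 3.11         # grams
    K_B = 1.381e-23           # J/K, Boltzmann's constant

    S_jk = 500 * MOLAR_ENTROPY_CU * PENNY_MASS / MOLAR_MASS_CU
    print(S_jk)               # ~812.4 J/K

    S_nats = S_jk / K_B             # ~5.883 x 10^25 nats
    S_bits = S_nats / math.log(2)   # ~8.49 x 10^25 bits
    print(S_bits)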

Clearly the design space entropy is not the same as the thermodynamic entropy because the design space microstate is not the same as the thermodynamic microstate.

Now let us heat the coins from room temperature to near the boiling point of water (373 Kelvin). What is the change in entropy, and in the number of microstates?

At 373 Kelvin the “design space” entropy is still 500 bits, since the number of possible heads/tails microstates does not change with this increase in temperature.

However, the thermodynamic entropy and the number of thermodynamic microstates do change. What is the change in entropy? Again, using standard textbook thermodynamics:

Specific heat of copper: 0.39 J/gram/K.
Heat capacity C of 500 copper pennies:

C = 0.39 J/gram/K * 500 pennies * 3.11 grams/penny = 606.45 J/K

T_initial = 298 K
T_final = 373 K

To calculate the change in entropy I used the formulas from:
http://web.mit.edu/16.unified/www/FALL/thermodynamics/notes/node41.html

delta-S_thermodynamic = C ln(T_final/T_initial) = 606.45 J/K * ln(373/298) = 136.1 J/K

Total thermodynamic entropy is calculated as follows:

S_thermodynamic_initial = 812.4 J/K

S_thermodynamic_final = S_thermodynamic_initial + delta-S_thermodynamic = 812.4 J/K + 136.1 J/K = 948.5 J/K

Again we can convert this to bits using procedures similar to the above conversions:

S_thermodynamic_final = 948.5 J/K / (1.381 x 10^-23 J/K) / 0.693147 = 9.91 x 10^25 bits

The entropy ADDED by the increase in temperature, expressed in bits, is calculated as follows:

delta-S_thermodynamic = 136.1 J/K / (1.381 x 10^-23 J/K) / 0.693147 = 1.42 x 10^25 bits

Thus the factor by which heating multiplies the number of thermodynamic microstates is found by raising 2 to the power of delta-S_thermodynamic:

2^delta-S_thermodynamic = 2^(1.42 x 10^25)
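Here is the matching Python sketch for the heating step (constants as before; the formula is the delta-S = C ln(T_final/T_initial) from the MIT notes):

    import math

    C = 0.39 * 500 * 3.11               # J/K, heat capacity of 500 pennies (606.45 J/K)
    T_INITIAL, T_FINAL = 298.0, 373.0   # Kelvin
    K_B = 1.381e-23                     # J/K, Boltzmann's constant

    dS_jk = C * math.log(T_FINAL / T_INITIAL)
    print(dS_jk)                        # ~136.1 J/K

    dS_bits = dS_jk / K_B / math.log(2)
    print(dS_bits)                      # ~1.42 x 10^25 bits
    # The microstate count is multiplied by 2**dS_bits -- far too large
    # a number to evaluate directly, so it stays symbolic.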

Adding heat can be said to make the copper molecules bounce around more chaotically (disorderly, if you will), and hence it increases the thermodynamic entropy and the number of thermodynamic microstates, but it says nothing about a change in design space entropy or microstates.

BOTTOM LINE:

Increasing heat increases the thermodynamic entropy and the individual copper molecules look more chaotic (disorderly if you will) because they are vibrating faster from the added heat, but it does nothing to change the design space entropy.

At 298 Kelvin:

Design Space Entropy: 500 bits
number of Design Space microstates: 2^500

Thermodynamic Entropy: 8.49 x 10^25 bits
number of Thermodynamic microstates: 2^(8.49 x 10^25)

At 373 Kelvin, after adding heat:
Design Space Entropy: 500 bits
number of Design Space microstates: 2^500
change in Design Space entropy due to heat change: 0 bits
change in number of Design Space microstates due to heat change: 0 microstates

Thermodynamic Entropy: 9.91 x 10^25 bits
number of Thermodynamic microstates: 2^(9.91 x 10^25)
change in thermodynamic entropy due to heat change: 1.42 x 10^25 bits
change in number of thermodynamic microstates due to heat change: a factor of 2^(1.42 x 10^25)

Moral of the story: don’t use 2LOT to argue for design space entropy change. Besides, as pointed out earlier, increasing design complexity usually entails an increase of both design and thermodynamic entropy.

Why all this obsession with reducing entropy to increase design complexity? I hope one can see that it can be desirable to INCREASE entropy (both design and thermodynamic) in order to increase design complexity. A warm, living, complex human has more thermodynamic and design space entropy than a dead, lifeless ice cube.

118 thoughts on “2LOT and ID entropy calculations (editorial corrections welcome)”

  1. so if you pick an appropriate level, it can be informative in a Shannon-like manner

    The better notion is not information but the lack thereof: Shannon uncertainty.

    It is easier to see Shannon in the sender-receiver model. A sender gives a 1-gigabit disc to a receiver.

    The receiver of the information is uncertain of the contents; he has 1 gigabit of Shannon uncertainty as to the contents of the disc. When he examines the disc and extracts the contents, he has reduced the uncertainty by 1 gigabit, as he gains 1 gigabit of information. Reducing uncertainty is gaining information, but the measure of uncertainty and information on the disc is still 1 gigabit.

    A monoatomic ideal gas system at a certain Temperature, Pressure, and Volume is in a defined thermodynamic macrostate. That macrostate can be realized by a buzillion different possible microstates. It is uncertain exactly which microstate the system is in at any given time, but it is in 1 out of a buzillion of those microstates at any given moment.

    Let W be the number of microstates. Then W = buzillion. The Shannon uncertainty of which microstate the system is in at any moment is:

    S_shannon = log2 W

    Unfortunately, we won’t ever be able to actually tell which microstate the system is in, because we would need precise information on the position and momentum of each particle, and that is precluded by Heisenberg uncertainty. 🙂

    Hypothetically, if we did know which microstate the system were in, we would gain log2(buzillion) bits of information. But we’ll just have to be content saying we have log2(buzillion) bits of Shannon uncertainty (Shannon entropy).

  2. Consider a system with 500 molecules of hydrogen in a closed box. All have exactly the same energy and direction – all heads, if you will. This has the maximal capacity to do work. They are all heading to the same side of the box, and would drive a fan if one was available. Gradually, there will be fewer and fewer molecules at this extreme of the energy distribution, as the capacity to do work converts into actual work. Absent the fan, the entropy of the overall system does not change during equilibration

    Such a situation raises the specter of directionally dependent temperatures! That is to say, you’ll measure one temperature based on the particles moving one way, and another based on the particles moving another way. This is occasionally found in plasma physics. One discussion I found is here:

    http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.384.757&rep=rep1&type=pdf

    on page 8.

    While they are all traveling in one direction, the temperature along that direction is high because there is only 1 degree of freedom per molecule versus 3 when they are bouncing around; the temperature in that one direction is 3 times the overall temperature once the motion randomizes in the box, independent of the fan.

    This is a clear case where the definition of temperature is less clear-cut because of the directional dependence. I suppose that when the system randomizes the direction of the molecules, the usable energy goes down, since there is no longer a very hot and a very cold direction, and Carnot efficiency increases with the temperature differential between cold and hot. This is actually a situation where one has a cold reservoir in one direction and a hot one in another!

  3. Sal:

    The SI unit (J/K) and the natural unit for entropy are logarithmic measures of the same number of microstates, yet Keiths swears one measurement measures actual physical dimensions while the other doesn’t.

    You just made that up. I haven’t said — much less “sworn” — anything of the sort.

    Knock it off, Sal.

    Here’s what I actually said, in the very first comment of our exchange:

    The Kelvin is an SI base unit, so J/K is not dimensionless.

    It could have been dimensionless, as Arieh Ben-Naim points out, if the temperature scale had been introduced after the atomic hypothesis was widely accepted, because then temperature could have been defined in terms of the average kinetic energy per molecule. In that case the Kelvin would not have been a base unit, J/K would have been dimensionless, and it would have made sense to define S as ln W, leaving Boltzmann’s constant out of the picture. History unfolded otherwise.

  4. From

    2LOT and ID entropy calculations (editorial corrections welcome)

    Keiths opined:

    S = ln W is true for Shannon entropy, which is dimensionless, but not for thermodynamic entropy, which has units of J/K.

    But I just showed if we multiply an entropy expressed in J/K by the conversion factor 1.045 x 10^23 Shannon Bits / (J/K), you get an entropy expressed in Shannon bits, and wiki calls this “dimensionless entropy”.

    Example for 500 pure copper pennies, each weighing 3.11 grams, with a molar mass of 63.546 grams/mol and a standard molar entropy of 33.2 J/K/mol.

    S_thermodynamic =

    500 * 33.2 Joules/Kelvin/mol * 3.11 grams / 63.546 grams/mol
    = 812.4 J/K
    = 812.4 J/K / (1.381x 10^-23 J/K) / 0.693147
    = 8.49 x 10^25 bits
    = 812.4 J/K / (1.381x 10^-23 J/K) = 5.883 x 10^25 nats
    = 5.883 x 10^25 nits

    I would not be able to do this conversion if J/K were dimensioned while Bits, Nits, and Nats were dimensionless. This conversion can be inferred from here:
    http://en.wikipedia.org/wiki/Boltzmann_constant

    Instead of binary bits for degrees of freedom (S = log2 W), we have nats or nits (S = ln W).

    I was merely pointing out that if one claims this is “dimensionless entropy”, then the J/K entropy is actually dimensionless also. Even though J and K are separately dimensioned, J/K can be dimensionless, because K is expressible (through the conversion factor) in Joules per nat, where the nat measure is:

    S = ln W versus S = log2 W

  5. Sal:

    But I just showed if we multiply an entropy expressed in J/K by the conversion factor 1.045 x 10^23 Shannon Bits / (J/K), you get an entropy expressed in Shannon bits, and wiki calls this “dimensionless entropy”.

    But your conversion factor has J/K right there in the denominator:

    1.045 x 10^23 Shannon Bits / (J/K)

    You’re making my point for me. If you have a thermodynamic entropy expressed in J/K, and the only way to convert it to a dimensionless quantity is to multiply it by a conversion factor with J/K in the denominator, then obviously J/K is not dimensionless.

    Physicists aren’t idiots, Sal. If J/K were actually dimensionless, they would have dropped the J/K long ago.

  6. You’re making my point for me. If you have a thermodynamic entropy expressed in J/K, and the only way to convert it to a dimensionless quantity is to multiply it by a conversion factor with J/K in the denominator, then obviously J/K is not dimensionless.

    That would only be true if Shannon Bits are dimensional in the same dimension as J/K. If you insist Shannon Bits are dimensionless, dividing a dimensionless quantity by a dimensioned quantity results in a dimensioned quantity. But since you say the final result in bits or nits is dimensionless, this implies J/K is also dimensionless.

    An acceptable alternative is to declare that entropy has a dimension (as in degrees of freedom being a physical dimension). Then J/K is dimensioned, and so are bits, nats, and nits. That is not, however, how wiki represents nats.

    I have no problem saying entropy is dimensioned or dimensionless as long as one is consistent before and after conversion of entropy expressed J/K to nits, nats, or bits.

  7. Sal,

    That would only be true if Shannon Bits are dimensional in the same dimension as J/K. If you insist Shannon Bits are dimensionless, dividing a dimensionless quantity by a dimensioned quantity results in a dimensioned quantity. But since you say the final result in bits or nits is dimensionless, this implies J/K is also dimensionless.

    Not at all. You’re getting confused about the direction of conversion. You multiply the thermodynamic entropy by your conversion factor to get the Shannon entropy, but you divide the Shannon entropy by your conversion factor to get the thermodynamic entropy.

    Let’s do the dimensional analysis in both directions:

    First, start with a thermodynamic entropy expressed in J/K. Multiply by your conversion factor of 1.045 x 10^23 / (J/K). The J/K terms cancel and you are left with a dimensionless number: the Shannon entropy.

    Now, start with a dimensionless Shannon entropy. Divide it by your conversion factor of 1.045 x 10^23 / (J/K). You are left with a thermodynamic entropy expressed in J/K.

    I don’t understand why this is so confusing for you.

    Here’s an analogous example. Suppose I have a bunch of copper ingots, each weighing 10 kg. I want to be able to convert back and forth between the total mass of the ingots, which is a dimensioned quantity, and the number of ingots, which is dimensionless.

    How do I do it? Easy. I set up a conversion factor of 1/(10 kg). Multiplying the total mass by the conversion factor will give me the number of ingots. Dividing the number of ingots by the conversion factor will give me the total mass.

    Let’s say I have 2000 kg worth of ingots. 2000 kg is a dimensioned quantity. Multiplying by the conversion factor of 1/(10 kg), I get an answer of 200 ingots. The kg units cancel, leaving a dimensionless number.

    Now let’s go in the opposite direction. Starting with 200 ingots, a dimensionless quantity, I divide by the conversion factor of 1/(10 kg). That gives me an answer of 2000 kg, a dimensioned quantity.
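    The same bookkeeping can be checked mechanically. Here is a minimal Python sketch of the two conversions, assuming the third-party pint units library is available (the 10 kg/ingot figure is from the example above):

        import pint

        ureg = pint.UnitRegistry()

        total_mass = 2000 * ureg.kilogram   # dimensioned quantity
        per_ingot = 10 * ureg.kilogram      # mass of one ingot

        ingots = total_mass / per_ingot     # kg cancels against kg
        print(ingots.to_base_units())       # 200.0 dimensionless
        print(ingots.dimensionless)         # True

        mass_back = ingots * per_ingot      # dimensionless number times kg
        print(mass_back)                    # 2000 kilogram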

  8. stcordova,

    I suppose when the system randomizes the direction of the molecules, the usable energy goes down since there is no longer a very hot and very cold direction, and the Carnot efficiency increases with a temperature differential between cold and hot. This is actually a situation where one has a cold reservoir in one direction and a hot one in another!

    Depends. The box is perfectly insulated, and molecules initially randomly distributed within it. There’s either a molecule with a fixed amount of energy, or a space, at any point. But it’s clearly not an equilibrium position, which is why I chose it.

    The point, however, was to analogise. There is indeed the capacity to do work in this system, because there is the potential for equilibration. When you look at a single microstate, there is nothing unusual about it. But such a collection of microstates is highly improbable. However, it is not that improbability that does work, it is the interaction between molecules. Molecules shed their energy and change their directions, and gradually there is equilibration. The fan would die down. But there is no such capacity in an ‘informational entropy’ view. 500 coins all heads will drive a fan no better than 500 coins in any old order. Mike Elzinga used to bang on about this a lot. Microstate components interact, and there is force to contend with, not just simple kinetic energy vectors. It’s not Scrabble tiles, coins, cards, ASCII, etc; it’s dynamic systems.

  9. Allan,

    Thank you for the response. I have no problem using more traditional notions of entropy for physics. The most traditional would be the Clausius view of entropy, and after that Boltzmann’s:

    S = kb ln W

    using kb = 1 for natural units

    S = kb ln W = ln W

    They call that Shannon entropy. Whatever the differences, it seems like minutiae at that point to me. Frankly, if this were the only real connection between thermodynamics and information theory, it would seem pretty inconsequential and certainly nothing IDists should crow about.

    The problem is characterizing W and that is challenging for both the traditional and Shannon form. I don’t think the information viewpoint can add or detract from insights about W.

    For the molecules moving in 1 direction with only 1 degree of freedom, vs. the usual 3 degrees for a monoatomic gas, I think the system would have 3 times the temperature relative to its randomized equilibrium state, since it has 3 times fewer microstates. But that is my guess. Characterizing W would be a question even for traditional statistical thermodynamics in that case.

    Molecules shed their energy and change their directions, and gradually there is equilibration. The fan would die down. But there is no such capacity in an ‘informational entropy’ view.

    In that case S = kb ln W will become 3 times (or some number) larger than the unidirectional case, since now there would be 3 degrees of freedom for each particle. This would imply W_final = W_initial^3

    I saw a quasi-derivation of this somewhere, but I don’t have it handy. Feel free to correct me if I misstated it.

    Thus for the traditional view

    S_final = 3 x S_initial

    for the informational view

    S_final = 3 x S_initial

    where
    S_final = final entropy after reaching equilibrium
    S_initial = initial entropy of the system

    I presume then this would mean W_final = W_initial^3 or something like that. Whatever it is, I don’t see that the information viewpoint would be materially different except kb = 1 in calculating entropy, but they should all yield the same answer except for scaling differences as far as I can tell.

  10. Keiths,

    I’m saying J/K is dimensionless even though J is dimensioned and K is dimensioned.

    Unfortunately the proof of this can’t be direct, because direct proof results in a circuitous conclusion.

    J = Joules

    K = Joules/ S

    where S is unfortunately expressed in J/K, so that doesn’t clarify immediately, but it does not stop us from making one useful inference. But let’s back up a bit:

    T = Energy / Entropy, well not exactly since I’m not using partial derivatives, but I think that suffices for our purposes…

    If entropy S is dimensionless, then temperature can be expressed in Joules, in the same dimension as the numerator, just scaled by the dimensionless number for entropy.

    If Entropy S is dimensioned then T is in another dimension than Joules.

    So we have two possible situations for J/K entropy: dimensionless or dimensioned. Whichever we assume will imply that the Shannon form must be dimensionless or dimensioned accordingly. If we are measuring temperature, I don’t think we’d want to be saying we are measuring fundamentally different things when in one case entropy S is in Shannon nits and in another it is in SI units of J/K.

    An impossible situation is one where one form (like J/K) is dimensioned while the Shannon form is dimensionless, or vice versa. That’s why I objected to this as a matter of principle:

    S = ln W is true for Shannon entropy, which is dimensionless, but not for thermodynamic entropy, which has units of J/K.

    It doesn’t matter that much to me whether you believe J/K is dimensionless or not, but whatever one decides, it will carry over to whether we say Shannon Bits are dimensionless or not. I opted to say J/K is dimensionless by inference from the wiki page on Boltzmann’s constant.

    Thanks anyway for the discussion.

  11. sez sal cordova: “I’m saying J/K is dimensionless even though J is dimensioned and K is dimensioned. …

    J = Joules

    K = Joules/ S”
    Okay; you’ve got J, a quantity measured in units of Joules. You’ve got K, a quantity measured in units of Joules/S. How, exactly, do you divide a quantity measured in units of Joules (i.e., the J) by a quantity measured in units of [Joules/S] (i.e., the K) and not end up with a quantity measured in units of S? Exactly where does the S disappear, and how?

  12. cubist,

    S doesn’t disappear; the question is whether S is dimensionless or not. Wiki says that if S is measured in Shannon bits, the number S is dimensionless, which means it just acts like a scaling factor.

    No one here seems to object vigorously to the idea that thermodynamic entropy expressed in Shannon nits (nats, or bits) is dimensionless. The issue is whether this necessarily means S stated in J/K must be dimensionless or dimensioned.

    The Wikipedia page on Boltzmann’s constant says Shannon entropy is dimensionless.

    One could choose instead a rescaled dimensionless entropy:
    [equations]

    This is a rather more natural form; and this rescaled entropy exactly
    corresponds to Shannon’s subsequent information entropy.

    http://en.wikipedia.org/wiki/Boltzmann_constant

    FWIW, I don’t think the debate over whether it’s dimensioned or dimensionless will make any difference in calculations or experimental outcomes. It’s rather moot if it doesn’t affect the calculations or applications in the lab.

  13. Sal,

    I’m saying J/K is dimensionless even though J is dimensioned and K is dimensioned.

    Which makes no sense, given that the base units don’t cancel.

    Unfortunately the proof of this can’t be direct, because direct proof results in a circuitous conclusion.

    A ‘circuitous conclusion’?

    J = Joules

    K = Joules/ S

    where S is unfortunately expressed in J/K, so that doesn’t clarify immediately,

    It does ‘clarify’ immediately, as cubist points out. Do the cancellation and you’re back to units of J/K.

    I’m having a hard time believing that you never learned dimensional analysis, Sal. Is that really true? I learned it in high school chemistry, and my school wasn’t known at all for the quality of its academic programs.

    T = Energy / Entropy, well not exactly since I’m not using partial derivatives, but I think that suffices for our purposes…

    You’re trying to redefine the kelvin. I keep telling you:

    In base units, J/K is (kg m^2)/(s^2 K). The only way to get cancellation is to redefine K in terms of energy.

    The world is not going to redefine the kelvin just to get you off the hook, Sal.

    If entropy S is dimensionless, then temperature can be expressed in Joules,

    The latter doesn’t follow from the former. Temperature can be expressed in joules either way. But if it’s expressed in joules, it isn’t expressed in kelvins.

    If Entropy S is dimensioned then T is in another dimension than Joules.

    Of course. In the SI system, the kelvin is the unit of temperature and entropy is expressed in joules/kelvin — J/K.

    So we have two possible situations for J/K entropy: dimensionless or dimensioned.

    No, we only have one possibility: dimensioned. The base units don’t cancel. Again, physicists aren’t idiots. They would have dropped the J/K long ago if J/K were dimensionless.

    An impossible situation is one where one form (like J/K) is dimensioned while the Shannon form is dimensionless, or vice versa.

    It’s totally possible. It depends on the base units in your system.

    If the kelvin had been defined in terms of energy, entropy would be dimensionless. The kelvin wasn’t defined in terms of energy, so entropy has dimensions in the SI system.

    It doesn’t matter that much to me whether you believe J/K is dimensionless or not,

    No, but it bugs the crap out of you that a) you made a silly mistake and b) that an ID critic had to point it out to you.

    but whatever one decides, it will carry over to whether we say Shannon Bits are dimensionless or not.

    No. If W is the number of microstates, then ln W is always dimensionless. Thermodynamic entropy gets its dimensions from Boltzmann’s constant, not from ln W.

    I opted to say J/K is dimensionless by inference from the wiki page on Boltzmann’s constant.

    Right, but you failed to realize that it depends on the system of units being used. In the SI system, Boltzmann’s constant is not dimensionless.

    Thanks anyway for the discussion.

    You’re welcome. I’m glad you learned something. Don’t let it stick in your craw for too long. 🙂

  14. Keiths wrote:

    In the SI system, Boltzmann’s constant is not dimensionless.

    So you concede its dimensionality is only an artifact of human convenience, which means it is somewhat illusory; it’s not really a real physical dimension, since in natural God-made units kb = 1.

    As far as me making mistake, you better rethink this. I said:

    The SI unit (J/K) and the natural unit for entropy are logarithmic measures of the same number of microstates, yet Keiths swears one measurement measures actual physical dimensions while the other doesn’t.

    and you responded

    You just made that up. I haven’t said — much less “sworn” — anything of the sort.

    Knock it off, Sal.

    And then you got called on it because you really said:

    S = ln W is true for Shannon entropy, which is dimensionless, but not for thermodynamic entropy, which has units of J/K.

    That’s not true if J/K is dimensionless, is it? Haha!

    You can’t equate entropy measures if one entropy is dimensionless and the other is dimensioned. That’s like saying 1 = 1 meter. Which is plain silly.

    You can however say 1 meter = 1000 millimeters because meters/millimeter equals the dimensionless number 1000.

    You’re the one showing some lack of insight about dimensions, not me.

    So you really think it’s legitimate to let A be dimensionless and B dimensioned? If so, you believe statements like the following ought to be permissible at least in principle:

    1000 = meter
    4000 = coulomb
    20 = kg
    .
    .
    200 kg = 1 coulomb

    Silliness, wouldn’t you say?

    How about:
    1 / (1.381x 10^-23 ) = J/K

    or

    1 = (1.381x 10^-23) (J/K)

    or

    1 dimensionless nat = (1.381x 10^-23) (J/K)

    but a nat is dimensionless whereas J/K according to you is dimensioned. But if that is as you claim, then that is a silly equality, unless of course J/K is dimensionless. Hence J/K is dimensionless, QED.

    Recall for 500 pure copper pennies:

    S_thermodynamic =

    5.883 x 10^25 nats = 812.4 J/K

    If we divide both sides by 812.4 and then multiply both sides by (1.381x 10^-23) we get, tada!

    1 nat = (1.381x 10^-23) (J/K)

    But such an equality is silly if J/K is an actual dimension. But it’s not silly if J/K is dimensionless, hence J/K is dimensionless. QED

  15. How deep will Sal dig his hole? Tune in tomorrow to find out.

    Meanwhile, ponder why Sal, who describes himself as “a mediocre student of science and engineering at best”, nevertheless refuses to accept corrections from folks who understand this stuff far better than he does.

  16. keiths,

    Since you are much less mediocre on issues of science than Sal, can you help us to understand better the theory of income inequality and google searching?

    I think it may well have something to do with entropy, but I cannot be sure, because I can’t read the entire paper. Can you explain whether the paper goes into Shannon bits in its findings?

  17. stcordova,

    “natural God-made units”? LOL

    Hey sal, in what way is your incessant babbling about “natural God-made units”, “500 pure copper pennies”, “thermodynamic entropy expressed in Shannon Nits (nats, or bits)”, “J/K is dimensionless”, or anything else going to provide ANY evidence that your chosen, so-called ‘designer-creator-god’ (yhwh-yehoshua-holy-ghost) exists, that the universe is only 6,000 years old, that a “living dog” could be fossilized in 65 million year old rocks (Hey, wait a second, 65 million years is a lot more than 6,000 so how can there be 65 million year old rocks? LMAO), and that the bible is ‘the word of god’?

  18. Retraction on something I said that was really dopey:

    In that case S = kb ln W will become 3 times (or some number) larger than the unidirectional case, since now there would be 3 degrees of freedom for each particle. This would imply W_final = W_initial^3

    Particle degrees of freedom (translational, rotational, vibrational, etc.) should not be equated or so easily tied to energy microstate degrees of freedom for the entire system. The relationship is, I think, more complicated than I made it out to be. Apologies to the reader.

    See, it’s easy for me to say, “I made a mistake”, which is more than Keiths is able to do. He thinks it’s perfectly appropriate to equate an entire dimension to a single dimensionless number.

    For example, would I equate 4000 to meters, as in

    4000 = meters

    NO! Not if meters are a dimension.

    Or how about 4000 to coulombs, as in

    4000 = coulombs

    NO! Not if coulombs are a dimension.

    And if I did such heretical, illogical things, it would also allow me to do stuff like this:

    meters = coulombs

    But Keiths thinks it’s OK to do something similar if J/K is a dimension:

    1 = (1.381x 10^-23) (J/K)

    Where 1 is the dimensionless “nat” of entropy.

    But oh well, the only way this is permissible is if J/K is dimensionless. He can’t get it through his head.

    I’m going to hold this statement by Keiths in my trophy case:

    S = ln W is true for Shannon entropy, which is dimensionless, but not for thermodynamic entropy, which has units of J/K.

    just like the other dopey statement he stands on:

    if you have 500 flips of a fair coin that all come up heads, given your qualification (“fair coin”), that is outcome is perfectly consistent with fair coins,

  19. stcordova: if you have 500 flips of a fair coin that all come up heads, given your qualification (“fair coin”), that is outcome is perfectly consistent with fair coins,

    Are all unique configurations equiprobable, Sal?

  20. Here’s a simple thought experiment.

    Toss a coin 500 times and record the outcomes.

    Now offer a billion dollars to the person who can toss a coin 500 times and get the same outcome.

    Suddenly the series that happened without violating any laws of nature becomes nearly impossible. A seemingly trivial outcome suddenly becomes significant.

    This is a variation on retrospective astonishment. Assigning significance to the current situation, because the collective probabilities of all the events leading up to the present are astronomically low.

    Sal, the only possible argument against evolution is to demonstrate that alleles are impossible, that all point mutations are fatal.

    If it is possible to have neutral and nearly neutral mutations, then change will happen, and there is no obstacle to the accumulation of change.

  21. Creodont2:
    phoodoo,

    If you hock your $10,000 watch you’ll be able to afford the paper.

    Except that it was keiths who presented the paper, so don’t you find it a bit unseemly that he now is unwilling to defend it?

    I mean not only is he distancing himself from it now, he won’t even say whether or not he has read what he is posting here.

  22. You have quite an active imagination, phoodoo.

    I’m not “distancing” myself from the paper. I’m waiting for you to identify a flaw in it. Phoodooesque declarations of incredulity don’t qualify.

    Get to work on your OP.

  23. phoodoo: Except that it was keiths who presented the paper, so don’t you find it a bit unseemly that he now is unwilling to defend it?

    I mean not only is he distancing himself from it now, he won’t even say whether or not he has read what he is posting here.

    Heh. I’m not Keiths’ biggest fan, but even I give him credit for actually having read whatever he posts here.

    What happened to your decent christian behavior, phoodoo? Where did you lose it?

  24. Sal:

    just like the other dopey statement: he stands on:

    if you have 500 flips of a fair coin that all come up heads, given your qualification (“fair coin”), that is outcome is perfectly consistent with fair coins,

    Whoa, tiger. You might want to concentrate on digging one hole at a time.

    Eigenstate’s comment was correct. Every specific sequence is equally likely, and all are consistent with the physics of fair coins.

  25. A comment I made at the time:

    No one familiar with Sal’s character will be surprised to learn that he shamelessly quotemines eigenstate in his new OP:

    SSDD: a 22 sigma event is consistent with the physics of fair coins?

    Here’s the full eigenstate quote. The parts that Sal omitted are in bold:

    Maybe that’s just sloppily written, but if you have 500 flips of a fair coin that all come up heads, given your qualification (“fair coin”), that is outcome is perfectly consistent with fair coins, and as an instance of the ensemble of outcomes that make up any statistical distribution you want to review.

    That is, physics is just as plausibly the driver for “all heads” as ANY OTHER SPECIFIC OUTCOME.

    Eigenstate carefully put the key phrase in ALL CAPS so that Sal couldn’t miss it. Sal chose to omit it anyway. What eigenstate said is correct, of course, and Sal is wrong to challenge it, especially in such a dishonest way.

    All heads is no more improbable than any other specific sequence of results. Every specific sequence has a probability of 1 in 2^500.

    The point that Sal is making in the OP is a different one: that it’s more probable to get a sequence that contains some combination of 250 heads and 250 tails than it is to get all 500 heads. Eigenstate would not disagree, of course.
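    A quick Python sketch of that distinction, for the record (math.comb requires Python 3.8+):

        from math import comb

        total = 2 ** 500

        p_specific = 1 / total                # any one exact sequence, all-heads included
        p_250_heads = comb(500, 250) / total  # the aggregate outcome "exactly 250 heads"

        print(p_specific)                     # ~3.05e-151
        print(p_250_heads)                    # ~0.036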

  26. Sal:

    I opted to say J/K is dimensionless by inference from the wiki page on Boltzmann’s constant.

    keiths:

    Right, but you failed to realize that it depends on the system of units being used. In the SI system, Boltzmann’s constant is not dimensionless.

    Sal:

    So you concede its dimensionality is only an artifact of human convenience,

    It’s an artifact of history, not of ‘human convenience’. And no, I don’t ‘concede’ it — I’ve been trying to get you to see it since my very first comment:

    That’s not true. The Kelvin is an SI base unit, so J/K is not dimensionless.

    It could have been dimensionless, as Arieh Ben-Naim points out, if the temperature scale had been introduced after the atomic hypothesis was widely accepted, because then temperature could have been defined in terms of the average kinetic energy per molecule. In that case the Kelvin would not have been a base unit, J/K would have been dimensionless, and it would have made sense to define S as ln W, leaving Boltzmann’s constant out of the picture. History unfolded otherwise.

    Have you finally seen the light?

    You can’t equate entropy measures if one entropy is dimensionless and the other is dimensioned.

    I don’t equate them. You’re making stuff up again.

    1 dimensionless nat = (1.381x 10^-23) (J/K)

    but a nat is dimensionless whereas J/K according to you is dimensioned. But if that is as you claim, then that is a silly equality, unless of course J/K is dimensionless. Hence J/K is dimensionless, QED.

    The ‘silly equality’ is yours, not mine.

    What I say is that in the SI system, you can convert from a Shannon entropy to a thermodynamic entropy, or vice-versa, as long as both are referring to the same undistinguished microstates.

    But the conversion factor has units, as you yourself admitted.

    If the conversion factor has units, then of course a dimensionless quantity can be converted to a dimensioned one, and vice-versa.

    My copper ingot example shows exactly how it is done. Were you able to follow it?

  27. Every specific sequence has a probability of 1 in 2^500.

    I never implied otherwise, but groups of sequences with the same characteristic number of heads (0, 1, 2, …, 500) have differing multiplicities, as evidenced by the binomial distribution, and hence outcomes corresponding to these groups will NOT have the same probability.

    I said “binomial distribution” to emphasize that the outcomes in question were not specific sequences but classes of sequences with a characteristic number of heads, since the binomial distribution applied to the coins doesn’t classify specific sequences but sequences sharing a characteristic.

    Eigenstate insinuated I said something I didn’t actually say, and as the end result he and Keiths handed me a trophy.

    if you have 500 flips of a fair coin that all come up heads…that is outcome is perfectly consistent with fair coins

    That statement doesn’t accord with common sense notions of “perfectly consistent”. Perfectly consistent means “consistent with expectation” in most people’s view, not some idiosyncratic use of the phrase “perfectly consistent”. 100% heads for 500 fair coin flips is not consistent with expectation.

    When you erect a strawman that claims “Sal said the sequences aren’t equiprobable”, at least knock it down in a way that doesn’t make you guys look like fools. Hahahah!

  28. You’re making stuff up again.

    Oh yeah? Look up Boltzmann’s constant on wiki and you’ll see the relation of Clausius entropy to natural entropy. If you’re up to it, why don’t you show the relation of nats to J/K? It’s not very difficult calculus, unless that’s too hard for you. 🙂

  29. keiths:

    Every specific sequence has a probability of 1 in 2^500.

    Sal:

    I never implied otherwise…

    Sure you did, by disagreeing with eigenstate’s comment:

    Maybe that’s just sloppily written, but if you have 500 flips of a fair coin that all come up heads, given your qualification (“fair coin”), that is outcome is perfectly consistent with fair coins, and as an instance of the ensemble of outcomes that make up any statistical distribution you want to review.

    That is, physics is just as plausibly the driver for “all heads” as ANY OTHER SPECIFIC OUTCOME.

    He got it right, Sal.

    The only trophy you got was for “best misunderstanding of probability by a young-earth creationist”.

  30. Sal:

    if you’re up to it, why don’t you show the relation of nats to J/K? It’s not very difficult calculus, unless that’s too hard for you.

    It’s algebra, Sal. No wonder you’re so confused.

  31. keiths:

    I don’t equate them. You’re making stuff up again.

    Sal:

    Oh yeah? Look up Boltzmann’s constant on wiki and you’ll see the relation of Clausius entropy to natural entropy…

    What does the Wikipedia article have to do with it? The question is whether I equate them. I don’t. You’re making stuff up.

  32. This seems like a rather pointless discussion.

    Thermodynamic temperature is defined as the ratio of increments of energy and entropy, 1/T = dS/dE. If one were to start from scratch and use the natural dimensionless measure for entropy S = logW then temperature would have the units of energy.

    Alas, the unit of temperature had been chosen before the current definition of temperature was adopted. We are therefore stuck with temperature measured in kelvins and energy in joules. Thus entropy has silly units of joules per kelvin.

    There is of course no difficulty of converting entropy from thermal units of joules per kelvin to bits. Just divide the entropy by the Boltzmann constant (and by ln2).

    What’s to argue about?
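    As a numeric footnote to the above, sketched in Python: with the dimensionless measure S = log W, temperature carries energy units, so room temperature is just an energy:

        K_B = 1.381e-23    # J/K, Boltzmann's constant
        print(K_B * 298)   # ~4.1e-21 J: "298 K" expressed in joules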

  33. olegt,

    What’s to argue about?

    Where there’s a Sal, there’s a way.

    If one were to start from scratch and use the natural dimensionless measure for entropy S = logW then temperature would have the units of energy.

    Alas, the unit of temperature had been chosen before the current definition of temperature was adopted. We are therefore stuck with temperature measured in kelvins and energy in joules. Thus entropy has silly units of joules per kelvin.

    That’s what I told Sal in my very first comment.

    He has this odd idea that if thermodynamic entropy is dimensionless in some measurement systems, it must be dimensionless in all of them. Therefore, by his faulty logic, J/K is dimensionless.

    He just doesn’t grok that the kelvin is a base unit in the SI system, and that it isn’t defined in terms of energy.

  34. He has this odd idea that if thermodynamic entropy is dimensionless in some measurement systems, it must be dimensionless in all of them. Therefore, by his faulty logic, J/K is dimensionless.

    Just because something is a unit of measure doesn’t make it a dimension.

    http://en.wikipedia.org/wiki/Dimensionless_quantity

    Even though a dimensionless quantity has no physical dimension associated with it, it can still have dimensionless units. To show the quantity being measured (for example mass fraction or mole fraction), it is sometimes helpful to use the same units in both the numerator and denominator (kg/kg or mol/mol). The quantity may also be given as a ratio of two different units that have the same dimension (for instance, light years over meters). This may be the case when calculating slopes in graphs, or when making unit conversions. Such notation does not indicate the presence of physical dimensions, and is purely a notational convention. Other common dimensionless units are % (= 0.01), ‰ (= 0.001), ppm (= 10^-6), ppb (= 10^-9), ppt (= 10^-12), angle units (degrees, radians, grad), dalton and mole. Units of number such as the dozen and the gross are also dimensionless.

    Thermodynamic entropy is a scaled logarithmic count of thermodynamic microstates. If thermodynamic entropy is dimensionless in bits, nits, or nats (as when I did the conversions above by dividing by Boltzmann’s constant), then it seems reasonable to say it is dimensionless even when the units are J/K. At issue is whether J/K is a dimensionless unit, and I said yes. Apparently you find the possibility totally unacceptable. I don’t care that much, but you’re insisting J/K is a dimensioned unit.

    I’ve said a couple of dopey things in this thread and have made retractions, but you’re unwilling even to see the possibility that just because there is a unit associated with something, it doesn’t necessarily mean it has a dimension.

    I don’t care that much, except for the fact that it was fun arguing. The numbers come out the same whatever you believe.

  35. This is pitiful, Sal. When are you going to stop digging?

    Dimensionless physical constant:

    In physics, a dimensionless physical constant, sometimes called fundamental physical constant, is a physical constant that is dimensionless – having no units attached, having a numerical value that is the same under all possible systems of units.

    Boltzmann’s constant is not dimensionless in the SI system. It has units attached, and its value depends on the system of units chosen.

    Since Boltzmann’s constant is not dimensionless, neither is entropy when expressed in joules per kelvin.

    J/K is not dimensionless. You got it wrong, and I know you don’t like that, but facts is facts.

  36. Keiths insists I’m making stuff up.

    So let me make this up. If I hypothetically have a system with a change of thermodynamic entropy of, say, (1.381x 10^-23) (J/K), I claim the change of entropy can be expressed in nats and shannons as follows:

    1 nat = (1.381x 10^-23) (J/K)

    1 nat ~= 1.44 shannons = (1.381x 10^-23) (J/K)

    where nat and shannons are dimensionless units

    http://en.wikipedia.org/wiki/Nat_(unit)

    Do you have a problem with that Keiths?

  37. Sal:

    Keiths insists I’m making stuff up.

    Yes, because you’ve been shamelessly making up stuff about me. See this and this, for example.

    As for the rest of your comment, we’ve been through this already. I wrote:

    What I say is that in the SI system, you can convert from a Shannon entropy to a thermodynamic entropy, or vice-versa, as long as both are referring to the same undistinguished microstates.

    But the conversion factor has units, as you yourself admitted.

    If the conversion factor has units, then of course a dimensionless quantity can be converted to a dimensioned one, and vice-versa.

    My copper ingot example shows exactly how it is done. Were you able to follow it?

    Slow down, read my example, and think about it, Sal.

  38. Keiths,

    If I hypothetically have a system with a change of thermodynamic entropy of, say, (1.381x 10^-23) (J/K), I claim the change of entropy can be expressed in shannons and nats as follows:

    1.44 shannons ~= 1 nat = (1.381x 10^-23) (J/K)

    Do you have a problem with that? 🙂

    Apparently this guy from talk origins doesn’t have much problem with my conversions:

    Shannon Information, Entropy, Uncertainty in Thermodynamics and ID

    But why the refusal on your part to offer a simple, “yes” or “no”? Do I sense the question makes you uncomfortable? 🙂

  39. Sal,

    Of course you can convert an entropy expressed in J/K to the same entropy expressed in nats, and vice-versa.

    Expressed in nats it is dimensionless, and expressed in J/K it is not.

    You admitted that yourself — if unintentionally — by offering this dimensioned conversion factor:

    1.045 x 10^23 Shannon Bits / (J/K)

    As I wrote then:

    You’re making my point for me. If you have a thermodynamic entropy expressed in J/K, and the only way to convert it to a dimensionless quantity is to multiply it by a conversion factor with J/K in the denominator, then obviously J/K is not dimensionless.

    Physicists aren’t idiots, Sal. If J/K were actually dimensionless, they would have dropped the J/K long ago.

    It’s kind of fascinating watching you deny the obvious. Keep digging.

  40. I asked if Keiths was Ok with this claim:

    1 nat = (1.381x 10^-23) (J/K)

    Keiths so charitably agreed to answer:

    Of course you can convert an entropy expressed in J/K to the same entropy expressed in nats, and vice-versa.

    Now look at this:

    1 nat = (1.381x 10^-23) (J/K)

    The left hand side (LHS) of the equation is dimensionless, but Keiths claims the right hand side (RHS) is not.

    Let me rework the equality this way

    1 nat / (1.381x 10^-23) = J/K

    The left hand side (LHS) of the equation is dimensionless, but Keiths claims the right hand side (RHS) is not.

    Is that a fair representation of your position, Keiths? Just trying to make sure we understand exactly what you think.

  41. Damn, Sal. You’re like a compulsive gambler who can’t stop betting until the credit cards are maxed out and the children’s tuition is gone.

    How many different ways do I have to demonstrate your error before you’ll accept it like a grownup?

    Your logic is ridiculous. Let’s apply it to the copper ingot example.

    By Cordova logic:

    200 ingots = 2000 kg

    200 ingots/2000 = kg

    The left hand side is dimensionless. Therefore, by Cordova logic, the kg is dimensionless.

    Can you see how ridiculous your argument is, Sal? Or do you actually believe the kilogram is dimensionless?

  42. Damn, Sal. You’re like a compulsive gambler who can’t stop betting until the credit cards are maxed out and the children’s tuition is gone.

    Actually, I was thrown out of casinos for being skilled; otherwise I’d have taken them for serious money. I was listed in the credits of the Holy Rollers movie, you know, the Christian blackjack players who took the casinos for 3.5 million. 🙂

    But just to set the record straight:

    1 nat / (1.381x 10^-23) = J/K

    you think the LHS is dimensionless and the RHS is not? Is that correct?

    I just want to make sure you’ve deluded yourself into thinking it’s quite all right to equate dimensionless quantities with those that aren’t dimensionless. 🙂

  43. Kilograms are dimensionless, eh, Sal?

    Another ignominious defeat at the hands of an ID critic. Put that in your trophy case, big guy.

  44. Kilograms are dimensionless, eh, Sal?

    No, and your ingot derivation has a silly premise; that’s why it arrived at a silly conclusion. Two wrongs on your part don’t make a right. You obviously don’t see it.

    Since a nat is both dimensionless and normalized to natural units, nat = 1. Thus even more pointedly:

    1 / (1.381x 10^-23) = J/K

    Do you have a problem with that? If not, for the record would you say the LHS is dimensionless and the RHS is not? But if that’s too much for you, consider this:

    1 nat / (1.381x 10^-23) = J/K

    For the record would you say the LHS is dimensionless and the RHS is not?

    Thanks in advance for a simple, “yes”, “no”, or “undecided” response.

  45. Sal:

    No, and your ingot derivation has a silly premise; that’s why it arrived at a silly conclusion.

    I arrived at a ridiculous conclusion because I used Cordova Logic. That’s why I don’t use Cordova Logic except in “kids, don’t do this” demonstrations.

    The arguments are exactly parallel and use the same bad Cordova Logic:

    1a. 1 nat = (1.381x 10^-23) (J/K)
    1b. 200 ingots = 2000 kg

    2a. 1 nat/(1.381x 10^-23) = J/K
    2b. 200 ingots/2000 = kg

    3a. The LHS is dimensionless. Therefore J/K is dimensionless.
    3b. The LHS is dimensionless. Therefore kg is dimensionless.

    Removing ‘nat’ from the LHS doesn’t help you, because ‘ingot’ can be removed by the same Cordova Logic.

  46. Keiths,

    1 nat / (1.381x 10^-23) = J/K

    For the record would you say the LHS is dimensionless and the RHS is not?

    Thanks in advance for a simple, “yes”, “no”, or “undecided” response. I’m just trying to make sure I’m representing your viewpoint correctly.

    Sal

  47. Sal,

    The source of your error is the sloppy use of equality in your argument.

    You casually write

    1 nat = (1.381x 10^-23) (J/K)

    …but that isn’t strictly correct. We both know that a dimensioned conversion factor is required, and you even supplied the conversion factor earlier.

    Here’s the corrected equation, along with the parallel equation for the ingot example:

    1 nat = (1.381x 10^-23 J/K)(7.241x 10^22 nat/(J/K))
    200 ingots = (2000 kg)(1 ingot/10 kg)

    The J/K terms cancel, leaving a result in nats. So both the LHS and RHS are dimensionless. The kg terms cancel, leaving a result in ingots. So both the LHS and RHS are dimensionless.

    Going in the opposite direction,

    1 nat / (7.241x 10^22 nat/(J/K)) = 1.381x 10^-23 J/K
    200 ingots/(1 ingot/10 kg) = 2000 kg

    The nats cancel, so both the LHS and RHS are dimensioned. The ingots cancel, so both the LHS and RHS are dimensioned.
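    A small Python sketch checking these conversions numerically (the 7.241x 10^22 nat/(J/K) factor is simply 1 divided by Boltzmann’s constant):

        K_B = 1.381e-23            # J/K, Boltzmann's constant
        NAT_PER_JK = 1 / K_B       # ~7.241e22 nats per (J/K)

        s_jk = 1.381e-23           # an entropy of 1.381e-23 J/K ...
        print(s_jk * NAT_PER_JK)   # ... converts to 1 nat (prints 1.0, up to float rounding)

        s_nat = 1.0                # and going the other way
        print(s_nat / NAT_PER_JK)  # 1.381e-23, i.e. the same entropy back in J/K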

    And again, removing ‘nat’ and ‘ingot’ from the equations doesn’t make any difference. They’re still implicit.

    So the answer is no, I do not agree with your sloppy equation.

    It’s time to admit your mistake and move on, Sal. J/K is not dimensionless.

    You lost another one.
