A Designed Object’s Entropy often Increases with Its Complexity

[This is an abridged version of a post at UD: A Designed Objects Entropy Must Increase for Its Design Complexity to Increase, Part 2. I post it under a different title at TSZ, because upon consideration, the new title should be above reproach. What I put forward should happily apply to man-made designs. A student recently wanted to challenge his professors regarding the 2nd law and evolution, and I pointed him to my essay. If that student is a creationist, at least I feel I did my job and made him understand science better than he would from most creationist literature. Hence, the rather torturous discussions at TSZ and UD had benefit in furthering this student’s understanding of science. If he is going to reject Darwinism, he should reject it for good reasons, not because of the 2nd law.]

In order for a biological system to have more biological complexity, it often requires a substantial increase in thermodynamic entropy, not a reduction of it, contrary to many intuitions among creationists and IDists. This essay is Part II of a series that began with Part 1.

The physicist Fred Hoyle famously said:

The chance that higher life forms might have emerged in this way is comparable to the chance that a tornado sweeping through a junkyard might assemble a Boeing 747 from the materials therein.

I agree with that assertion, but that conclusion can’t be formally derived from the 2nd law of thermodynamics (at least those forms of the 2nd law that are stated in many physics and engineering textbooks and used in the majority of scientific and engineering journals). The 2nd law is generally expressed in two forms:

2nd Law of Thermodynamics (THE CLAUSIUS POSTULATE)
No cyclic process is possible whose sole outcome is transfer of heat from a cooler body to a hotter body

or equivalently

2nd Law of Thermodynamics (THE KELVIN PLANCK POSTULATE)

No cyclic process is possible whose sole outcome is extraction of heat from a single source maintained at constant temperature and its complete conversion into mechanical work

In Part 1, I explored the Shannon entropy of 500 coins. If the coins are made of copper or some other metal, the thermodynamic entropy can be calculated. But let’s have a little fun: how about the thermodynamic entropy of a 747? [Credit Mike Elzinga for the original idea, but I’m adding my own twist.]

The first step is to determine about how much matter we are dealing with. From the manufacturer’s website:

A 747-400 consists of 147,000 pounds (66,150 kg) of high-strength aluminum.

747 Fun Facts

Next we find the standard molar entropy of aluminum (symbol Al). From Enthalpy Entropy and Gibbs we find that the standard molar entropy of aluminum at 25 Celsius and 1 atmosphere is 28.3 Joules/Kelvin/Mole.

Thus a 747’s thermodynamic entropy based on the aluminum alone is:

$S_{747}=\text{Entropy of 747}\approx$

$(\text{Mass of Al in 747})\left(\frac{1}{\text{atomic weight of Al}}\right)\left(\frac{1}{\text{molar mass constant}}\right)(\text{standard molar entropy of Al})$

$=66{,}150\,\text{kg}\left(\frac{1000\,\text{g}}{\text{kg}}\right)\left(\frac{1}{26.981}\,\frac{\text{mol}}{\text{g}}\right)\left(28.3\,\frac{\text{J}}{\text{K}\cdot\text{mol}}\right)$

$=6.94\times 10^{7}\,\frac{\text{J}}{\text{K}}$

Suppose now that a tornado runs into a 747 and tears off pieces of the wings, tail, and engines such that the weight of aluminum in what’s left of the 747 is now only 50,000 kg. Using the same sort of calculation, the entropy of the broken and disordered 747 is:

$S_{BROKEN747}=5.24\times 10^{7}\,\frac{\text{J}}{\text{K}}$

Hence the tornado lowers the entropy of the 747 by disordering and removing vital parts!

And even supposing we recovered all the missing parts such that we have the original weight of the 747, the entropy calculation has nothing to say about the functionality of the 747. Hence, the 2nd law, which inspired the notion of thermodynamic entropy, has little to say about the design and evolution of the aircraft, and by way of extension it has little to say about the emergence of life on planet earth.

Perhaps an even more pointed criticism in light of the above calculations is that increasing mass in general will increase entropy (all other things being equal). Thus as a system becomes more complex, on average it will have more thermodynamic entropy. For example, a simple empty soda can weighing 14 grams (using a similar calculation) has a thermodynamic entropy of 14.68 J/K, which implies a complex 747 has about 4.7 million times the thermodynamic entropy of a simple soda can. A complex biological organism like an albatross has more thermodynamic entropy than a handful of dirt. Worse, when the albatross dies, it loses body heat and mass, and hence its thermodynamic entropy goes down after it dies!
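For readers who want to check the arithmetic, here is a minimal sketch of these back-of-envelope calculations (the function name is my own; as above, only the aluminum content is counted):

```python
# Thermodynamic entropy of a mass of aluminum, scaled from the standard
# molar entropy of Al (28.3 J/(K*mol) at 25 C and 1 atm).
MOLAR_MASS_AL = 26.981  # g/mol
S_MOLAR_AL = 28.3       # J/(K*mol)

def aluminum_entropy(mass_g):
    """Entropy in J/K of mass_g grams of aluminum at standard conditions."""
    moles = mass_g / MOLAR_MASS_AL
    return moles * S_MOLAR_AL

s_747 = aluminum_entropy(66_150 * 1000)     # intact 747: 66,150 kg of Al
s_broken = aluminum_entropy(50_000 * 1000)  # after the tornado: 50,000 kg
s_can = aluminum_entropy(14)                # empty soda can: 14 g

print(f"747:        {s_747:.3e} J/K")     # ~6.94e7, as above
print(f"broken 747: {s_broken:.3e} J/K")  # ~5.24e7
print(f"soda can:   {s_can:.2f} J/K")     # ~14.68
print(f"747/can:    {s_747 / s_can:.2e}") # ~4.7 million
```

Less aluminum means fewer moles and hence less entropy, which is the whole point of the tornado example.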

So the major point of Part II is that a designed object’s thermodynamic entropy often increases with the increasing complexity of the design, for the simple reason that it has more parts and hence more mass. And as was shown in Part 1, the Shannon entropy also tends to increase with the complexity of the design. Hence, at least two notions of entropy (Shannon and thermodynamic) can increase with increased complexity of a design (be it a man-made design, an evolution-made design, or ….)

From: Boltzmann

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.” (Final paragraph of #87, p. 443.)

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

42 thoughts on “A Designed Object’s Entropy often Increases with Its Complexity”

1. I’m not able to make sense of this.

I sometimes point out to people that a sand dune is complex. It would be very difficult to assemble grains of sand of exactly the right shapes, and put them together in exactly the right relationship to make that sand dune. But people typically respond “No, it is not at all complex. It is just a pile of sand.”

My point — we have no good definition of “complex” that we can use here. So what is being said?

2. Here’s Sal’s point:

In order for a biological system to have more biological complexity, it often requires a substantial increase in thermodynamic entropy, not a reduction of it, contrary to many intuitions among creationists and IDists.

3. In other words, Sal is urging his fellow travelers: “Don’t be a Sewell.”

4. “A Designed Object’s Entropy” suggests you can measure the entropy of any object. Is that so? There is no way to measure the amount of design or complexity of an entity, is there? Complexity seems a slippery customer.

5. One could start by measuring the entropy of Lenski’s bugs before and after.

6. petrushka:
One could start by measuring the entropy of Lenski’s bugs before and after

Do I detect a hint of irony? What would be the methodology?

7. While I was googling complexity, Adrian Bejan’s Constructal Law cropped up. I know Gregory posted a thread on it a while ago but I can’t seem to find it. (Authorship is not reliably attributed and I can’t spot it by title). If Gregory is looking in, maybe he would like to comment on whether he thinks Bejan’s law has any bearing on this thread.

8. 21st century Alchemy

Does anyone know the credentials of this blogger? Surely he cannot be committed to a scientific understanding of physical reality.

On reading the above I immediately remembered the comedian on The Smothers Brothers, Prof. Irwin Corey. Prof. Corey used a lab coat and science jargon to be funny. In contrast, the above uses the same theatrical props, seemingly in an attempt to obfuscate and/or hypnotize the gullible.

Please inform me if I am wrong.

9. “Does anyone know the credentials of this blogger”

I have no credentials, save being a trouble maker.

10. 21st century Alchemy

Does anyone know the credentials of this blogger? Surely he cannot be committed to a scientific understanding of physical reality.

On reading the above I immediately remembered the comedian on The Smothers Brothers, Prof. Irwin Corey. Prof. Corey used a lab coat and science jargon to be funny. In contrast, the above uses the same theatrical props, seemingly in an attempt to obfuscate and/or hypnotize the gullible.

Please inform me if I am wrong.

You got it exactly right.

We have all read Sal’s stuff before. There is nothing remotely of value here; except perhaps an ongoing reminder of ID/creationist tactics to gain a free ride and the appearance of “legitimacy” on the back of experts.

Its sole purpose is to keep the public illusion going that there is an “intense debate” taking place in the scientific community about ID/creationism.

This thread is simply a rehash of sophomoric crap that Sal posted over on the UD website. He claims he took a thermodynamics and statistical mechanics course but then he failed a simple concept test about entropy here.

Sal is practicing an old tactic – invented by Young Earth Creationists Duane Gish and Henry Morris – of pretending to be able to argue with experts. It has become extremely stale.

I thought Prof. Irwin Corey was really funny. 🙂

11. ” He claims he took a thermodynamics and statistical mechanics course but then he failed a simple concept test about entropy here.”

Actually I was taking the course while I was simultaneously trying to understand what you were trying to teach me. Though your example wasn’t on any homework assignment or quiz or test, it is always good to practice what I’m learning in class. Your concept test was a good supplement to my class work…

I’ll accept for the sake of argument that I’m as stupid and uninformed and uneducated as you say. Given that, special thanks to you for helping me do better in my studies.

12. Stupid, uninformed and uneducated is one thing. That’s how everyone starts out.

The difference is what do you do about it?

Sticking it out when it gets uncomfortable would be a good start. There are many unanswered questions on your previous threads you seem to have missed. And that’s a charitable reading as I know how fond you are of those.

13. “Sticking it out when it gets uncomfortable would be a good start. There are many unanswered questions on your previous threads you seem to have missed. And that’s a charitable reading as I know how fond you are of those.”

I view it as sticking it out when it’s pointless. I stuck it out on the thermodynamics threads when it was very uncomfortable, when I had to admit I didn’t understand, when I had to admit I was wrong about a number of things, etc. But there was a point to it all, as my essay above reflects. The point is giving good science to creationists, such as what is written above. If a Darwinist wrote that, instead of me, I suspect you all would be singing a different tune…

You might be saying, “well done Sal for showing what idiots creationists are.” But since I’m not a Darwinist, not one of you can bring yourselves to say, “Sal’s claim is generally correct.”

Thanks anyway to TSZ authors and commenters for helping me put together my essay. Thanks for the criticisms and corrections as that essay reflects those criticisms and corrections.

14. Sal,
“It’s not even wrong.” Wolfgang Pauli

If you are studying science, you have a chance at understanding. If you are studying at a bible college the chance is slim. But if you are at a bible college your lab coat and jargon will be well received. You can even make a living on the ignorant. Is this your desire?

15. Sal

The point is giving good science to creationists such as what is written above. If a Darwinist wrote that, instead of me, I’d suspect you all would be singing a different tune…

You may have something of a point. I agree that ‘order/disorder’ is a pretty rubbish way of getting across the concept of entropy. Systems follow the path to a least-energy configuration, and shed energy in doing so. If (for example) the entire universe were to collapse into a single mass, it is not clear how the black hole + released potential energy is more ‘disordered’ than the pre-collapse separated masses. The energy is spread out, but the masses aren’t.

But I shouldn’t get too touchy about being picked on. When I made a comment that appeared to indicate that we were at a central point in the universe, I was picked up on it. When Lizzie suggested that the constancy of c for all observers meant that the causes of redshift in light could not include the Doppler shift, she was picked up on it. If anyone attempts to make an argument based on Hoyle’s reasoning, I’ll be onto it; say anything at Sandwalk that Larry Moran disagrees with and he’ll be onto it (and he happens to favour ‘metabolism first’, which I think has some issues, as I’ve attempted to discuss with you).

People pounce on science they disagree with, nobody gives anyone a free pass because they might happen to be fellow ‘materialists’.

16. ” He claims he took a thermodynamics and statistical mechanics course but then he failed a simple concept test about entropy here.”

Actually I was taking the course while I was simultaneously trying to understand what you were trying to teach me. Though your example wasn’t on any homework assignment or quiz or test, it is always good to practice what I’m learning in class. Your concept test was a good supplement to my class work…

I’ll accept for the sake of argument that I’m as stupid and uninformed and uneducated as you say. Given that, special thanks to you for helping me do better in my studies.

Sal’s previous thread on this stuff started on July 4, 2012.

Here is what Sal was “arguing” on that thread back in August 20, 2012.

For the record, the professor who tried to teach me this over the summer did not use the distinction of Thermal and Configurational entropy, but since the term has popped up in relation to Walter Bradley’s work (and that of other materials scientists like Gaskill) I thought I would explore it. Also for the record, I was not taught entropy in terms of order and disorder, but simply in terms of the logarithm of microstates.

The professor reviewed the Liouville theorem and it was in the class text (Pathria and Beale, chapter 2). My misunderstandings in this discussion are surely not his fault, or that of the class text, but the class is for engineering students with limited physics background.

I was the only one with some background in Classical Mechanics, General Relativity, and QM. The class was 12 weeks, and most (Electrical and Computer Engineers) started with no knowledge of classical thermodynamics. The final lecture was on the Bekenstein bound, Hawking radiation, and the Tolman-Oppenheimer model, but he decided not to put that on our take-home final since it was obviously above our heads. He just skimmed over the topics for fun.

At this point Sal had bollixed up, and had not completed, the concept test for entropy; but instead proceeded to pretend he could “disagree” with experts. The “discussion” continued with Sal’s misconceptions about the Sackur-Tetrode equation. The entropy stuff was dropped (surprise!).

Waving credentials and flunking basic concept tests – even after the final exam – is not impressive.

We still haven’t seen Sal address that high school level calculation that scales up the charge to mass ratios of protons and electrons to kilogram-sized masses separated by distances on the order of meters and then calculates the energies of interaction in units of joules and megatons of TNT. We still haven’t seen Sal fold in the rules of quantum mechanics and describe how such masses would behave.

And we certainly haven’t seen Sal justify, in the light of these simple high school level calculations, the ID/creationist tactic of using ideal gases of inert objects as stand-ins for the behaviors of atoms and molecules.

This is what YEC culture does to the brain.

This ID/creationist bullshit has been going on for nearly 50 years despite feedback from the science community. If the honest feedback from this blog is too tough for ID/creationists, that is not our fault. They tend to take gratuitous offense at anything their enemies say. It is a political sympathy ploy; not a spur to get off their persecution complexes and to start learning.

17. stcordova:
“Does anyone know the credentials of this blogger”

I have no credentials, save being a trouble maker.

Nobody needs credentials to post here. The great thing about the internet is that because nobody knows for sure who anyone is (or need not know), arguments must be judged on their merits, not on the basis of the authority of the poster.

That’s why the rules here ask everyone to address the post not the poster.

18. “We still haven’t seen Sal address that high school level calculation that scales up the charge to mass ratios of protons and electrons to kilogram-sized masses separated by distances on the order of meters and then calculates the energies of interaction in units of joules and megatons of TNT. We still haven’t seen Sal fold in the rules of quantum mechanics and describe how such masses would behave”

That’s true. I honestly didn’t understand your question. I don’t understand a lot of things, that’s why I hang around you to show me the way.

I eventually got through your concept test. You were nice to me for a few minutes thereafter.

I couldn’t tell whether you accepted or rejected the Sackur-Tetrode equation. I don’t comprehend your resistance to the notion that increasing volume increases entropy, since that seems standard in many textbooks like Pathria and Beale. I figured it wasn’t worth another shouting match since Sackur-Tetrode is irrelevant to ID.

I’m not a scientist, and I don’t plan on being one. I have the greatest admiration for those who are scientists, even the ones who are nasty to me, like Mike. I took the classes so I could have a better understanding of things, and if the best result was the above essay, it’s still better than nothing, certainly better than most treatments of the 2nd law in 99% of creationist literature.

So what if I’m all the things Mike says; I don’t see any other creationists willing to come here and get abused like I have. I’m willing to do it if I feel I can learn something of value. The essay above is an example.

The takeaway I’m getting from all of your comments is that there is nothing substantially erroneous about the essay. It’s material that can be passed on to creationists without fear of damaging their understanding of science.

The other stuff, well, I expected as much. Thanks anyway.

19. “but instead proceeded to pretend he could “disagree” with experts”

For the record, I wasn’t disagreeing with experts, especially OlegT; I was expressing what I didn’t understand. I will not even presume to disagree with OlegT on matters of physics; he’s a professor at a top school in the world.

20. And would you presume to disagree with Oleg’s peers in the biology related faculties?

21. “We still haven’t seen Sal address that high school level calculation that scales up the charge to mass ratios of protons and electrons to kilogram-sized masses separated by distances on the order of meters and then calculates the energies of interaction in units of joules and megatons of TNT. We still haven’t seen Sal fold in the rules of quantum mechanics and describe how such masses would behave”

That’s true. I honestly didn’t understand your question. I don’t understand a lot of things, that’s why I hang around you to show me the way.

Well; it looks as though you have some work to do. You will have to get hold of a beginning physics textbook.

You didn’t appear to notice the rest of the exercise that goes with it.

I’m not a scientist, I don’t plan on being one. I have the greatest admiration for those that are scientists even the ones who are nasty to me like Mike.

If you think that people who don’t put up with bullshit are “nasty,” you shouldn’t try to get away with dumping so much bullshit on people and then walking away from it. You don’t appear to have learned that lesson yet.

People notice; whether they are blunt about it or not. Many people probably take the default position of being polite if they haven’t had the experience of something like 50 years of watching ID/creationist behavior.

As far as “showing you the way” is concerned; I think I will stick with my general policy of observing. It’s not an ego trip for me.

But you have to deal with some unresolved questions from others about assertions you have posted here; and you have avoided addressing them. Don’t abuse Elizabeth’s gracious hospitality.

22. Sal – didn’t you once express that you need your faith to be true because you helped send some missionaries off (to an unfortunate end) at one point, and that’s how you made peace with it? Please forgive any mangling or misremembering..

23. Richard,

Here’s what you’re thinking of:

I started getting interested in ID in 2001 when my father was terminally ill and I was searching for meaning in life. There were also future missionaries from my churches and Bible studies who were risking their lives for their faith. It bothered my conscience that if the Bible were false, I was merely encouraging them toward their doom. One of the missionaries was Heather Mercer who became world famous in 2001 when US Army rangers rescued her from the Taliban. Thus, I had to be assured that ID was probably true so I could sleep at night, for their sake. If ID were false, the moral thing to do would be to discourage them from being missionaries.

24. stcordova:

I don’t see any other creationists willing to come here and get abused like I have

Abuse? Perhaps you could give, and link to, an example of this “abuse”?

25. I would be grateful if people could address Sal’s post on its intrinsic merits or otherwise, rather than bring in other things Sal may or may not have written elsewhere and at other times.

I know it is hard to put down the baggage, but please do.

Thanks.

26. Sal, while Shannon entropy and thermodynamic entropy are closely related mathematically, they are very different concepts. In some ways they are diametrically opposed.

At least I think so: Mike will no doubt correct me if I am wrong.

If a physical system has low thermodynamic entropy, that means that there is a large number of possible states available to the system. That means that any one state is low probability, and that the system as a whole therefore has high Shannon Entropy (lots of bits).

However, if a physical system has high thermodynamic entropy, that means there are a much smaller number of possible states available to the system. That means that the probability of any one is much higher, and the system therefore has low Shannon Entropy (small number of bits).

So as thermodynamic entropy increases, the amount of Shannon Entropy reduces. We could also say therefore that information tends to decay over time (which is true).

However, by putting more energy into a system from outside, its thermodynamic entropy can be reduced, and its Shannon Entropy increased. In other words, the flow of energy into a system increases the amount of information, where information is defined as Shannon Entropy.

Therefore biological organisms do not violate the second law at all. We know that there is an outside energy source (the sun) and that this results in local thermodynamic entropy decrease. As a result, we can also have local information increase.

So Granville Sewell has it absolutely backwards when he says that Intelligence is necessary to put Information into the system so that thermodynamic entropy can reduce. What is happening is that because Energy enters the system, thermodynamic entropy is reduced and Shannon Information increased. That Information might include intelligent agents themselves, but nothing in the 2nd law tells us that the Intelligent agent is the cause, rather than the result, of that thermodynamic entropy reduction.

27. I’d still like to see someone calculate the change in entropy in the Lenski experiment.

28. ” Don’t abuse Elizabeth’s gracious hospitality.”

Dr. Liddle,

If you don’t want me posting here anymore I won’t. I think I’ve been posting in the spirit of what TSZ is about; feel free to tell me if anything I write or say is in violation of the TSZ mission.

In the above abridged version of my 2 UD essays at TSZ, I haven’t seen any technical objection to the abridged version. I avoided the Sackur-Tetrode equation in this essay because Mike insists, “If it’s not about energy, it’s not about entropy,” contrary to my textbooks (like Pathria and Beale), which admit position in counting microstates. I didn’t think it was worth squabbling about at TSZ. He’s suggesting it’s an ID/creationist argument, but really that’s Mike vs. certain textbooks like Pathria and Beale. And even Wikipedia:

In classical statistical mechanics, the number of microstates is actually uncountably infinite, since the properties of classical systems are continuous. For example, a microstate of a classical ideal gas is specified by the positions and momenta of all the atoms, which range continuously over the real numbers. If we want to define Ω, we have to come up with a method of grouping the microstates together to obtain a countable set. This procedure is known as coarse graining. In the case of the ideal gas, we count two states of an atom as the “same” state if their positions and momenta are within δx and δp of each other. Since the values of δx and δp can be chosen arbitrarily, the entropy is not uniquely defined. It is defined only up to an additive constant. (As we will see, the thermodynamic definition of entropy is also defined only up to a constant.)
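The arbitrariness of that additive constant is easy to see in a toy calculation (a sketch of my own, assuming a single particle with uniform position and momentum ranges; the function name is hypothetical):

```python
import math

def entropy_per_k(delta_x, delta_p, range_x=1.0, range_p=1.0, n=1):
    """S/k = n * ln(Omega), where Omega is the number of coarse-graining
    cells of size delta_x * delta_p covering the phase-space ranges."""
    omega = (range_x / delta_x) * (range_p / delta_p)
    return n * math.log(omega)

# Two arbitrary cell sizes give different absolute entropies,
# but the difference is a constant independent of the state:
s_fine = entropy_per_k(1e-6, 1e-6)
s_coarse = entropy_per_k(1e-3, 1e-3)
print(s_fine - s_coarse)        # equals 2 * ln(1000) up to rounding
print(2 * math.log(1000.0))
```

Shrinking the cells shifts S/k by the same constant for every state, which is exactly the “defined only up to an additive constant” point in the quote.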

Mike can go fume at Wikipedia in that case; they said things probably more disagreeable to him than my mere referencing of Sackur-Tetrode.

Regarding Shannon and Boltzmann, the formulas are obviously similar:

for Shannon:

Information = log2( omega )

for Boltzmann:

Entropy = k ln( omega )

Shannon even referenced Boltzmann by name in his famous paper on communication theory relating maximum possible bandwidth to signal-to-noise ratios (the whole “bit” thing that gets so much attention in ID discussions was actually a minor point of Shannon’s paper).

So what is the difference? Because Shannon dealt with communication, it meant different objects, and counting of states was done by whatever criterion the observer (or engineer) chose to make significant in order to convey information. A coin, for example, is a conceptual entity; nothing in physics demands that we regard a lump of metal as a coin with heads or tails. But if we choose to, we can say “coins are the objects, and the heads and tails of the coins define the microstates.”

Now, if one wants to take Shannon’s ideas and say, “sets of atoms are the objects and the atoms’ positions and momenta define the microstates,” then we see there isn’t any difference at the atomic level between Shannon and Boltzmann save the use of Boltzmann’s constant.
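A quick numerical sketch of that comparison, using the 500 coins from Part 1 (the CODATA value of Boltzmann’s constant is assumed; identifying coin patterns with microstates is a modeling choice, as discussed above):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

n_coins = 500
omega = 2 ** n_coins  # equiprobable microstates: every heads/tails pattern

shannon_bits = math.log2(omega)       # Shannon entropy, in bits
thermo_style = K_B * math.log(omega)  # k ln(omega), in J/K

print(shannon_bits)  # 500.0 bits
print(thermo_style)  # same logarithm, different constant out front
```

Both quantities are logarithms of the same state count; the only difference is whether the prefactor is 1/ln 2 (bits) or Boltzmann’s constant (J/K).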

That issue relates to my essay in Part I at UD. But as far as the technical aspects of what I laid out in the above abridged version, I’ve still not heard technical objections to what has been said. I don’t see why there should be many technical objections, because the essay was basically composed of things that Darwinists at TSZ have argued for.

I provided the abridged essay to reduce arguments about Sackur-Tetrode, position considerations in counting microstates, and the Shannon discussion. But since I referenced links to my original essays, by implication, I draw those into the discussion as well, but they were not the focus of the abridged version.

Mike will keep insisting, “if it’s not about energy, it’s not about entropy.” I’ll keep saying his claim doesn’t agree with Wikipedia or textbook physics, and hence the mainstream.

As far as Mike saying I failed the concept test: I only missed the k! term in one part. If he wants to flunk me for that, fine, but he can’t deprive me of what I learned.

Outside of the concept test I messed up in my understanding of the Liouville theorem, no question about that.

I posted my response to Mike’s concept test in:

Concept test attempt 1: Basic Statistical Mechanics

29. stcordova: If you don’t want me posting here anymore I won’t. I think I’ve been posting in the spirit of what TSZ is about; feel free to tell me if anything I write or say is in violation of the TSZ mission.

You are fine Sal, just carry on. There are some posts sailing near the guano wind, but I’m trying to keep a light touch right now.

30. A BEAUTIFUL THING

So, picture someone at a park picnic table, open laptop in front of them, putting the final touches on a paper which is the work of a lifetime and sure to win The Prize. The paper is done and just as you are about to save it a child’s flying toy goes under the table and you bend to help get it. At that moment someone jumps up on the table and while invoking the illustrious names of 19th century physicists, squats and evacuates their distal alimentary reservoir on the keyboard sending your unsaved paper and the information therein to electronic never never land. The Beautiful Thing is the writer then gives a detailed presentation of the energy changes involved turning a piping hot bowl of Boeuf bourguignon into somewhat cooler cloacal throughput INSTEAD of proclaiming the anarchist for what they no doubt are!

I stand and remove my hat.

31. Sal:

Hence, the 2nd law, which inspired the notion of thermodynamic entropy has little to say about the design and evolution of the aircraft, and by way of extension it has little to say about the emergence of life on planet earth.

This, I agree with. Granville Sewell is indeed incorrect.

So the major point of Part II is that a designed object’s thermodynamic entropy often increases with the increasing complexity of the design for the simple reason that it has more parts and hence more mass. And as was shown in part 1, the Shannon entropy also tends to increase with the complexity of the design. Hence, at least two notions of entropy (Shannon and thermodynamic) can increase with increased complexity of a design (be it man-made design, evolution made design, or ….)

Sal, do you really think there is a positive correlation between complexity and mass? Is Michelangelo’s David more complex than the block it was carved from? Does it have greater mass? Is a tiny iPhone more complex than the house brick that was my first phone? Does it have more mass? Is a living cell more complex than a watch? Does it have more mass? Do suspension bridges have more mass than masonry bridges? Are they more complex to design?

And if there is no positive correlation between mass and design complexity, then why should there be any positive correlation between thermodynamic entropy and design complexity, if you are simply assuming that thermodynamic entropy scales with mass (which seems odd to me! Don’t physicists prorate?)

What we can say is that to design and fabricate a complex thing, people do work and thus increase thermodynamic entropy. What they make doesn’t usually contain any less entropy than the ingredients they put in there, although it can (when we make cement, for instance). But cement isn’t terribly complex. On the other hand, a computer is, but it probably has more entropy, not less, than the things that went into it, at least I hope it doesn’t have too little, because I want the components to be reasonably inert and last a long time.

I do think this whole entropy/design thing is a dead duck!

32. Sal, while Shannon entropy and thermodynamic entropy are closely related mathematically, they are very different concepts. In some ways they are diametrically opposed.

The mathematical expressions are identical. The difference is that thermodynamic entropy is about energy and the second law, and that makes a huge difference. A string of ASCII characters can be assigned a Shannon entropy, but it has no thermodynamic entropy.
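As an illustration of that point, the Shannon entropy of an ASCII string can be computed purely from symbol frequencies, with no reference to energy at all. A minimal sketch (my own example, not from the thread):

```python
import math
from collections import Counter

def shannon_entropy(s: str) -> float:
    """Shannon entropy, in bits per symbol, of a string's character distribution."""
    counts = Counter(s)
    n = len(s)
    # sum of p * log2(1/p) over the observed symbols
    return sum((c / n) * math.log2(n / c) for c in counts.values())

print(shannon_entropy("aaaa"))  # 0.0 — a single repeated symbol carries no surprise
print(shannon_entropy("abcd"))  # 2.0 — four equiprobable symbols need 2 bits each
```

Nothing in this calculation involves temperature, heat, or energy; that is exactly the sense in which the two entropies are different concepts despite the identical mathematical form.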

Sal continues to protest, but he cannot grasp that fact as long as he cannot do that elementary high-school-level calculation and other similar elementary physics calculations.

The implications of that simple calculation are not trivial. Students in introductory physics are often given this calculation in order to get them to appreciate the differences between the gravitational and electrical energies of interaction. Understanding interactions is crucial in later physics courses. Neither Sal nor Sewell has this most basic understanding.

I’ve set the task for Sal. It is not a difficult calculation; most high school physics/chemistry students can do it. The fact that Sal doesn’t even understand the question is highly significant, especially given his claims that he has studied “advanced” topics in physics. I’ve seen this problem with ID/creationist “arguments” many times before. There is nothing more I need to repeat or add. I will just observe.

33. if you are simply assuming that thermodynamic entropy scales with mass (which seems odd to me! Don’t physicists prorate?)

Physicists distinguish between extensive properties, which scale with the size of the system, and intensive properties, which don’t. Entropy is an extensive property, and will be correlated with the mass of a system.
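That extensivity can be seen directly in the Sackur-Tetrode equation for a monatomic ideal gas: double the amount of gas at the same temperature and pressure (so volume doubles too) and the entropy exactly doubles. A sketch (my own illustration; helium at room temperature, SI units):

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
h = 6.62607015e-34    # Planck constant, J*s

def sackur_tetrode(N: float, V: float, T: float, m: float) -> float:
    """Entropy (J/K) of N monatomic ideal-gas atoms of mass m in volume V at temperature T."""
    lam = h / math.sqrt(2 * math.pi * m * k_B * T)  # thermal de Broglie wavelength
    return N * k_B * (math.log(V / (N * lam**3)) + 2.5)

m_He = 6.6464731e-27  # mass of a helium atom, kg

# Double N and V together (same T and P): V/N is unchanged, so S scales with N.
S1 = sackur_tetrode(1e23, 0.004, 300.0, m_He)
S2 = sackur_tetrode(2e23, 0.008, 300.0, m_He)
print(S2 / S1)  # 2.0 — entropy is extensive
```

Temperature, by contrast, is intensive: doubling the system leaves it unchanged.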

34. It seems to me you have snipped out the key part of the quotation.

And if there is no positive correlation between mass and design complexity, then why should there be any positive correlation between thermodynamic entropy and design complexity,

This is somewhat confusing, but I don’t think Lizzie was questioning the correlation between mass and entropy, so much as she was questioning the correlation between mass and complexity.

35. Yup. That was my point. There’s certainly a correlation between Shannon Entropy and thermodynamic entropy, but it’s negative.

So if complexity is defined as Shannon Entropy, Sal is wrong by 180 degrees. But if Complexity is something like Design, I don’t think there’s any correlation at all, because Design seems to me utterly uncorrelated with mass. We design things by taking stuff away at least as often as by adding stuff.

In fact, maybe there’s a slight negative correlation there too, as lightness is usually a design goal.

But thanks, Steve, for the clarification. I was surprised by that.

36. In order for a biological system to have more biological complexity, it often requires a substantial increase in thermodynamic entropy, not a reduction of it…

I still haven’t seen any demonstration of this. Perhaps Sal could demonstrate his thesis by calculating the thermodynamic entropy of an amoeba and a human zygote and comparing the results.

37. “Sal, do you really think there is a positive correlation between complexity and mass? Is ”

Not always, hence I revised the original title of my post. The word “often” is good enough.

Positive correlation means this holds on average, which might be impossible to demonstrate and is in the end not important. As long as there exist pointed counterexamples, like a 747 versus a small pile of junked coke cans, the creationist insistence that more entropy necessarily means more disorganization is false. The 747 is such an example.
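The 747-versus-cans counterexample can be made concrete with a back-of-the-envelope calculation. The numbers below are illustrative assumptions of mine, not from the thread: both objects are treated as pure aluminum at 298 K, and the empty 747 airframe is taken as roughly 180,000 kg:

```python
# Deliberate oversimplification: standard thermodynamic entropy by mass alone,
# treating both objects as pure aluminum at 298 K.
S_MOLAR_AL = 28.3      # standard molar entropy of Al, J/(mol*K)
M_MOLAR_AL = 0.02698   # molar mass of Al, kg/mol

def entropy_of_aluminum(mass_kg: float) -> float:
    """Standard entropy (J/K) of a given mass of aluminum at 298 K."""
    return (mass_kg / M_MOLAR_AL) * S_MOLAR_AL

S_747 = entropy_of_aluminum(180_000)  # assumed order-of-magnitude airframe mass
S_cans = entropy_of_aluminum(1.0)     # a small pile of crushed coke cans

print(S_747 > S_cans)  # True: the highly organized 747 has vastly more entropy
```

The organized, designed object carries far more thermodynamic entropy simply because it has far more matter, which is the point being made: more entropy does not mean more disorganization.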

38. “So if complexity is defined as Shannon Entropy, Sal is wrong by 180 degrees.”

Complexity (as in algorithmic or Kolmogorov complexity) is not the same as Shannon entropy, hence Sal is not wrong by 180 degrees.

Complexity (as in CSI) is defined as Shannon entropy, but CSI complexity is not algorithmic or Kolmogorov complexity, hence Sal is not wrong by 180 degrees. 🙂
