The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:
The randomness or disorder of the components of a chemical system is expressed as entropy,
Nelson, David L.; Cox, Michael M.. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.
Gag!
And from the textbook written by our very own Larry Moran:
enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.
Principles of Biochemistry, 5th Edition, Glossary
Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto
Choke!
Thankfully Dan Styer almost comes to the rescue as he criticized ID proponent Granville Sewell:
On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.
…
“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell
Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (i.e., small shoe size suggests we’re dealing with a toddler, therefore non-reader). Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.
Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:
Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.
Mike Elzinga
2lot trouble
Hear, hear.
But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?
“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”
Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.
Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:
That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?
Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”
There is no basis in physical science for interpreting entropy change as involving order and disorder.
So, in slight defense of Granville Sewell and the numerous other creationists, from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus, who equate entropy with disorder: when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!
Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu
April 2014
The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..
Lehninger and Moran’s books aren’t on that list. Their books however did make the list of biochem books judged by some Australians as decreasing in fitness:
The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham
…
Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002
…
A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate
Here is chemistry professor Frank Lambert’s informal definition of entropy:
Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.
This is the correct Ludwig Boltzmann entropy as written by Planck:
S = k log W
where
S = entropy
k = Boltzmann’s constant
W = number of microstates
Also there is Clausius:
delta-S = Integral (dq/T)
where
delta-S = change in entropy
dq = inexact differential of q (heat)
T = absolute temperature
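For readers who like to see the formulas run, both definitions reduce to one-liners in Python. A sketch: the constants are standard, but the microstate count and heat values are arbitrary illustrations, not from the OP.

```python
import math

k = 1.380649e-23   # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    """S = k ln W: entropy from a count of microstates."""
    return k * math.log(W)

def clausius_delta_s(q_rev, T):
    """delta-S = q_rev/T: the Clausius integral when heat q_rev (J)
    is added reversibly at a fixed temperature T (K)."""
    return q_rev / T

S = boltzmann_entropy(2**10)         # a toy system with 1024 microstates
dS = clausius_delta_s(100.0, 300.0)  # 100 J absorbed at 300 K
```

Note the Clausius integral only collapses to q/T when T is constant; otherwise the integral must actually be evaluated.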
As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”
My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.
Textbook authors have been discussing this issue for decades. Most of us understand that there’s a statistical mechanistic definition of entropy and we understand that the simple metaphor of disorder/randomness is not perfect.
However, we have all decided that it is not our place to teach thermodynamics in a biochemistry textbook. We have also decided that the simple metaphor will suffice for getting across the basic concepts we want to teach.
I’ve talked about this with Cox and Nelson (the authors of the Lehninger textbook) and with Don and Judy Voet (the authors of another popular textbook). Sal may not like our decision but it was a very deliberate decision to dumb down a complex subject that doesn’t belong in biochemistry textbooks. It’s not because we don’t know any better.
There are a dozen such issues in my textbook … cases where difficult terms are deliberately over-simplified. On the other hand, there are some terms in my book that I refused to simplify even if other authors did so.
Larry Moran,
Professor Moran
I am delighted you chimed in! Frankly, Sal’s inchoate efforts at cutting and pasting are never worth the effort to unscramble.
I am hoping you can correct some naïve misconceptions on my part in order to make me a better teacher in the classroom.
I ask my students to imagine a bathtub full of water and then to imagine tossing in a 100 cm³ pebble. Of course, the level of the bathtub goes up a bit as the volume increases.
Now imagine a different scenario: instead of tossing in a pebble, toss instead 100 mL of ink into one corner of the bathtub. What happens?
Ans: the level of the bathtub goes up as before & the ink will gradually “spread”.
The kinetic theory of matter (particle theory) says that all matter consists of many, very small particles which are constantly and randomly moving or in a continual state of motion.
Emphasis on “RANDOM”.
If ink particles are moving randomly, then after a period of time the ink particles will become more randomly distributed than they were at first. I have always told my students that another way of saying that in “science-talk” is that “Entropy” increases. It is sometimes useful to think of an increase in Entropy as an increase in “randomness” or “disorder”.
What am I getting wrong?
I go on to sketch a quick outline of the Second Law. Energy will not flow spontaneously from a low temperature object to a higher temperature object. Similarly, solutes will not diffuse spontaneously from lower concentration to greater concentration.
I clarify that in class by suggesting to my students that to understand this concept we must return to the example of ink in the bathtub. The likelihood that all the diffused ink particles in the bathtub, as a result of their random motions, will return to where they started is equivalent to the likelihood that nearly all the coins in a set of 10^23 tosses will land on heads. It’s very, VERY unlikely! If you tossed that many coins over and over again for the lifetime of the universe (14 billion years) the odds that you would see all heads is still minuscule — totally ignorable. This extremely low probability is what transforms a “probability statement” into a “physical law.” Theoretically, it is not impossible to contravene the Second Law of Thermodynamics, just that the probability of doing so is so infinitesimally small, we can ignore it.
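The arithmetic behind that claim is easy to check. Since 2^(-10^23) underflows any floating-point type, a sketch has to work with logarithms (the toss count is taken from the example above):

```python
import math

# Probability of all heads in 10**23 fair tosses is 2**(-10**23),
# far too small for a float, so work with the base-10 logarithm:
N = 10**23
log10_p = -N * math.log10(2)   # roughly -3.0e22

# Even retrying every nanosecond for 14 billion years (~4.4e26 attempts)
# leaves the expected number of successes indistinguishable from zero.
```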
I hope I am not messing up and confusing my students.
I go on to explain that Potential Energy can be stored as a concentration gradient across membranes, which represents “potential energy” that potentially (ahem) can do “work”. Just as the osmotic lifting of a column of water during Osmosis is “Entropy-driven”, so too Chemiosmotic coupling (that couples the electron transport chain to ATP synthesis) is Entropy-driven.
I guess my real question is: why can we not consider an increase in entropy as I have described it so far, an increase in randomness and disorder?
I thank you in advance for your patience and your indulgence.
Best regards
I don’t think anyone (here?) is singling out creationists for this, it is technically wrong whoever does it. I’ve seen others conflate the two, I’ve had this misconception myself in the past. It is a common misconception.
Dr. Moran,
Greatest respect for you and your discipline and your work. I’m studying biochemistry right now at the FAES grad school at the NIH part-time and in the evening. Lehninger Principles is the class text.
Because my training was in Applied Physics, I learned the formalism of entropy in a grad level statistical mechanics class and Dr. Mike Elzinga here and another professor of physics took me and Granville Sewell to task for using the word disorder to describe entropy.
I had less at stake personally than Granville since I didn’t frame my ID arguments in terms of the 2nd law, so it was easier for me to change my position on the topic of how to define thermodynamic entropy.
As a result I have become highly critical of the use of the 2nd law to frame ID arguments. It was the ID debate over the 2nd law that spawned my interest to take a grad school class in statistical mechanics and thermodynamics.
As an aside, I’ve become mildly critical of some ID information theory arguments as well.
I managed to get the publisher-provided free 35-page sample of your book through Amazon. Though some Amazon reviewers were critical of your work, what I’ve read from it looks excellent. I can’t imagine anyone giving it a low rating.
Unfortunately, I have no background in Organic Chemistry and had to get special instructor permission to study biochemistry. I’m going to have to pick up bits and pieces of Organic Chemistry on the side, but thankfully I have David Klein’s Organic Chemistry book. I’ve done well on assignments and quizzes so far.
I’m just a part-time evening student, and I’m likely having to teach myself Organic Chemistry on the side since I only have one evening free a week to go up to the NIH.
It was your Monday molecules on your blog that sparked my interest to reconsider setting aside time to study your discipline. Ironically all the discussion of the NIH ENCODE project also sparked my interest and I started attending the ENCODE related seminars to try to learn more. In the process of mingling with the researchers (many of whom were like me since many ENCODErs are computer scientists and engineers, not just biochemists and medical doctors), I realized it would be beneficial to my personal development to learn biochemistry.
So, though we disagree intensely about ID and creation, I should take this time to thank you for sparking my interest in your field of specialty. Thank you also for the years you’ve spent teaching good science to your students (I think biochemistry is an excellent field and it’s unfortunate that so many science students don’t get around to studying it).
Btw, I really liked some of the illustrations in your book.
Larry Moran,
I just reread your reply to Sal after hitting “send”
Most of us understand that there’s a statistical mechanistic definition of entropy and we understand that the simple metaphor of disorder/randomness is not perfect.
I realize now what you mean and how silly Sal just was. Entropy does not represent some force of Nature (ein Drang nach Disorder), and Sal was just being Sal again.
It’s all good. ITMT I just remembered I forgot to credit you for the assistance you provided me on a pH worksheet which I shared with colleagues.
I will correct that oversight promptly
Best
I find Einstein’s take on Boltzmann’s equation interesting
Well, it’s nice to see Salvador finally making the same arguments about entropy that I have been making for years. 🙂
But, imo, Granville doesn’t provide any more clarity as he also confuses entropy with disorder.
Personally, it doesn’t bother me if people use “entropy” for disorder — as long as they do not claim that they are talking about the second law of thermodynamics. If they claim to be talking about 2LoT, they should stick to the proper mathematical definition.
Hi Tom,
If there is some small probability of it happening, then if it does happen, there has been no “contravention” of the Second Law. There is no law that states that the highly improbable cannot happen. Hope that helps.
And the mathematics can be stated in terms of information theory, as has been known now for decades.
Which is why Salvador is still wrong when he says:
Glad to see Sal visiting entropysite, another one I introduced to him.
Notwithstanding the good comments I have for Lehninger and Dr. Moran’s book, I do have issue with this claim.
How hard is it to use Frank Lambert’s alternate qualitative definition of entropy?
That’s a much better definition than randomness or disorder. This is easily shown by the entropy change in a system composed of a hot brick and a cold brick just put in contact. They are not in thermal equilibrium.
The entropy is some number before the bricks reach equilibrium. As heat energy from the hot brick is dispersed into the cold brick, the kinetic internal energy of the hot brick becomes more dispersed as some of it goes to the cold brick. The change in entropy is the change in the dispersal of energy. Simple.
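The brick example can be made quantitative. A sketch, assuming two identical bricks whose combined heat capacity m·c I set to an arbitrary 1000 J/K for illustration:

```python
import math

# Two identical bricks (heat capacity m*c each) at Th and Tc reach
# the common final temperature Tf = (Th + Tc)/2 when put in contact.
m_c = 1000.0            # m*c in J/K (illustrative value, not measured)
Th, Tc = 400.0, 300.0   # hot and cold brick temperatures, K
Tf = (Th + Tc) / 2.0

dS_hot = m_c * math.log(Tf / Th)    # negative: the hot brick loses entropy
dS_cold = m_c * math.log(Tf / Tc)   # positive, and larger in magnitude
dS_total = dS_hot + dS_cold         # net entropy increase, ~ +20.6 J/K
```

The cold brick gains more entropy than the hot brick loses, so the total always increases: the dispersal of energy in action.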
Clausius used the notion of ‘transformation-energy’ to define entropy.
Many of the reactions in biochemistry involve changes in the biologically relevant Gibbs free energy delta-G (or delta-G prime, whatever):
https://en.wikipedia.org/wiki/Gibbs_free_energy
G(p,T) = H – TS
where
G = Gibbs free energy
H = enthalpy
T = temperature
S = entropy
for isothermal processes
deltaG = deltaH – T deltaS
deltaG and deltaH are expressed in Joules (energy)
T deltaS is therefore also expressed in terms of energy. The units of entropy are Joules/Kelvin, that is, energy per degree Kelvin. That is the transformation energy. There is no need to add confusing concepts of disorder or randomness. Reference to energy will suffice.
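The bookkeeping is a one-line computation. The numbers below are arbitrary illustrative values (chosen only so the units work out), not measured data for any particular reaction:

```python
# Isothermal Gibbs free energy change: deltaG = deltaH - T*deltaS.
# All values below are illustrative, not from any table:
delta_H = -20000.0   # enthalpy change, J/mol
T = 310.0            # temperature, K (roughly body temperature)
delta_S = 34.0       # entropy change, J/(mol*K)

# T*deltaS has units (K)*(J/(mol*K)) = J/mol: energy, as stated above.
delta_G = delta_H - T * delta_S   # J/mol; negative means spontaneous
```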
I don’t see why, if a chemist like Frank Lambert can say this, other chemists can’t do the same:
Lambert’s description will help the process of learning, imho.
PS
See the hot brick problem here:
http://web.mit.edu/16.unified/www/SPRING/propulsion/notes/node40.html
As did I, over at UD, on numerous occasions. So happy to hear it has finally sunk in. All ID needs is another creationist that misrepresents entropy!
Now if you’ll just study a bit more on the relationship between information and entropy.
Hi Larry,
Do you think it is ok for Creationists to quote-mine your misrepresentation of entropy?
Do you offer any disclaimer* in your textbook?
*Disorder is only a metaphor!
No?
Thanks for this OP, Sal. Interesting and informative stuff for the scientifically ignorant (like me). I was not aware of this common conflation.
Mung
You are quote-mining and misrepresenting what Professor Moran really said:
Professor Moran did NOT say that
“*Disorder is only a metaphor! ”
What he really said was that …
“…the simple metaphor of disorder/randomness is not perfect.”
Sal has betrayed his lack of understanding by setting up another straw-man!
Check out Fick’s laws of diffusion
https://en.wikipedia.org/wiki/Fick%27s_laws_of_diffusion
1 – from a microscopic POV, a single molecule moves around randomly,
2 – with more molecules, there is a clear trend (according to what Professor Moran correctly referred to as “statistical mechanics”) where the solute fills the container more and more uniformly, even though individual molecules persist in moving around randomly,
3 – and finally with an enormous number of solute molecules, randomness becomes undetectable (even though randomness still persists at a microscopic individual particle level). The solute appears to move smoothly and systematically from high-concentration areas to low-concentration areas according to the principles of probability and statistics governing random motion of MANY particles.
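Points 1–3 can be watched happening in a toy simulation (the particle count and step count are arbitrary choices of mine):

```python
import random

# Toy model of ink dispersing: many independent particles on a line,
# each taking a random +1/-1 step per tick.
random.seed(1)
particles = [0] * 5000          # all "ink" particles start at one spot
for _ in range(400):            # 400 time steps
    particles = [x + random.choice((-1, 1)) for x in particles]

# Each particle is random, but the ensemble is predictable:
# the mean stays near 0 while the spread (variance ~ number of steps,
# here ~400) grows steadily -- random motion of MANY particles
# producing a smooth, systematic spreading.
mean = sum(particles) / len(particles)
var = sum((x - mean) ** 2 for x in particles) / len(particles)
```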
The SIMPLE METAPHOR that scientists object to would be the suggestion that there is some TELEOLOGICAL FORCE OF NATURE operating here – which there is NOT. For example, Osmosis is not a “force” per se (as understood in scientific terminology) although the end result of Osmosis can result in a force that lifts a column of water (for example).
I have found your (and Sal’s) confusion on this matter very illustrative and useful for future lesson plans on my part.
Thanks for the quote!
For what it’s worth, that equation can be taken to mean almost anything you want. If we use natural units (aka God-made units) vs. man-made units, then k = 1, which simplifies the Boltzmann (Planck) equation to:
S = ln W
Let W describe anything you want (the size of the national debt, the number of petals on a flower, the number of wives and concubines of Solomon, etc.) S is just the logarithm of it!
For physics, W is the number of position/momentum microstates. It takes a LOT of work to understand the definition of position/momentum microstates formally. But when W is the number of position/momentum microstates, then S becomes the formal notion of thermodynamic entropy.
For the computer world W is the number of possible states of memory (be it RAM or a disc, or whatever). For 10 bits of RAM, there are 2^10 possible memory states of those 10 bits, so W=2^10.
Instead of natural log (ln), in information theory we use log2 (log base 2) to state the logarithm in Shannon bits, but the number of Shannon bits can be easily transformed to natural (aka God-made) bits by multiplying Shannon bits by 0.6931 (the natural log of 2).
So if we have 10 bits of RAM, therefore
W = 2^10
therefore the Shannon entropy of 10bits of RAM is:
S_shannon_entropy = log2(W) = log2(2^10) = 10 Shannon bits
We can convert this to God-made units by multiplying by the scaling factor of 0.6931
S_God-made_bits = S_shannon_entropy x 0.6931 = 10 x 0.6931 = 6.931 God-made_bits 🙂
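The conversion is trivial to sketch in Python. The RAM example follows the comment above; `shannon_bits` and `nats` are my own hypothetical helper names:

```python
import math

def shannon_bits(W):
    """Entropy in Shannon bits for W equally likely states: log2(W)."""
    return math.log2(W)

def nats(bits):
    """Convert Shannon bits to natural units ("God-made" bits, i.e.
    nats) by multiplying by ln 2 ~ 0.6931."""
    return bits * math.log(2)

W = 2**10                  # 10 bits of RAM => 1024 possible states
bits = shannon_bits(W)     # 10.0 Shannon bits
natural = nats(bits)       # ~6.931 nats
```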
The moral of the story:
Agree mostly out of necessity since the meaning of the word has evolved in popular culture.
Thanks for the kind words.
I’m not a scientist, but as an occasional student of science, somehow hearing creationists relating the science of steam engines (aka science of thermodynamic entropy) to “God did it” struck me as a horrific non-sequitur. I had to agonize through much formal training to find out for myself whether I was being told the truth or not. I realized I was not being told the truth by creationists (who were mistaken).
I concluded that the steam-engine science of entropy does not imply God created life.
It was personally difficult for me to see people like myself who professed creation being misled. You can see the willful ignorance on the topic of thermodynamic entropy playing out by some at UncommonDescent. It’s an interesting case of psychology how people would rather save face than admit mistakes and actually learn. The behavior of KairosFocus on the topics of thermodynamics and transfinite numbers (thanks to Keiths and Shallit) are cases in point. I had to admit mistakes and misunderstandings along the way, take a little humiliation, so that I could learn which ideas are closer to the truth.
The creationist entropy argument was pioneered by guys like Duane Gish and Henry Morris and recently by Granville Sewell.
After learning the truth about thermodynamic entropy (aka steam-engine science), my eagerness to correct the misunderstandings of my fellow creationists damaged my standing in the community to some extent, since Granville Sewell and Duane Gish are revered in the community and I’m not; I’m just a provocateur by comparison.
The flaw of the creationist entropy argument is rooted in the conflation of entropy with disorder, but then to my horror, I started seeing the error in respected university-level chemistry textbooks! The treatment of entropy in Lehninger and Moran’s textbooks is not the treatment of entropy one will see in grad-level physics textbooks like the one I learned from, nor in many mechanical engineering textbooks teaching how to make modern day steam engines (aka nuclear and coal-fired power plants) or “inverse steam engines” (aka refrigerators).
I’m grateful one of the chemistry book authors, Larry Moran, pointed out that the notion of “entropy is disorder” is not completely accurate; it is his “dumbed down” version.
I protested by quoting one of his peers, Frank Lambert, who also has a “dumbed down” version, but at least Lambert’s dumbed down version is far more accurate than Dr. Moran’s dumbed down version!
Lambert’s dumbed down version can be better related to the famous equations of entropy in the OP than Larry Moran or Lehninger’s dumbed down versions. I’m not trying to diss Larry’s illustrious career as a professor of biochemistry, but he has other dumbed down versions of entropy he can use if he’s willing to adopt the convention of one of his peers, namely, Frank Lambert.
When we have a hot brick and a cold brick put in contact, the heat energy from the hot brick flows into the cold brick. The heat energy thus disperses. Entropy is a formal quantification of this dispersal.
This notion of dispersal or dissipation was important to steam engine scientists like Clausius, who were trying to determine how much heat energy could be converted to work (as in a steam engine).
Unfortunately, a passing remark by the famous pioneer of entropy science, Boltzmann, has resulted in over a century of misunderstanding that is still lingering in chemistry textbooks and the creation/evolution debate.
Regarding colewd’s citation, some further remarks came to mind:
Yes, by itself
S = k ln W
is not much! It’s just taking the logarithm of one number and generating another number.
The significance however is that prior to Boltzmann the notion of S was defined differently. In fact it was defined indirectly by Clausius as delta-S.
The magic was connecting Clausius’s definition with Boltzmann’s definition. That discovery by Boltzmann was like what E=mc^2 was to Einstein. I related the Clausius version to the Boltzmann version here:
http://creationevolutionuniversity.com/forum/viewtopic.php?f=4&t=72
I want to thank you for the work you put into your linked posting “Entropy examples connecting Clausius, Boltzmann, Dembski”. I am learning a lot from it.
It contains the following statement, where the items 1 and 2 seem to be quoted from somewhere:
“The general method to infer design (including man-made designs), is that the object:
1. has High Shannon Entropy (high improbability)
2. conforms to an independent (non-postdictive) specification”
Was this from Dembski’s work?
On the face of it, I interpret #1 as meaning the object is not common in nature, and #2 as meaning the object must match a pattern of something we know is designed, from documentation, cultural knowledge, or actually having the designer speak up and explain how he did it.
This is what has been suggested by many people (and I agree with them) trying to counter the argument that one can infer design from some inherent quality of the object.
Thank you very much for taking the time to read it.
I agree with you. I’m 85% supportive of Bill Dembski’s writings, 15% critical. By way of comparison I’m almost 98% critical of the use of the 2nd law for ID and creationism, and 2% supportive with lots of qualifications.
The purpose of the essay on Clausius, Boltzmann and Dembski was to help add clarity to the claims of ID proponents. To show weaknesses and strengths in IDists using Boltzmann and Shannon to support ID.
I have of late been almost 80-90% negative on IDists and Creationists trying to use Boltzmann and Shannon’s math. It’s not necessary. All the mathiness doesn’t strengthen IDists’ claims; it adds confusion and the appearance of obfuscation.
Instead, I feel arguments from biochemistry and molecular biology will be far more fruitful, hence, I’ve decided to visit the NIH weekly to learn more about these things. It was in the process of this I ran into Lehninger’s book and was horrified to see the erroneous treatment of entropy which I now suspect is in a lot of Chemistry books.
Ironically, one does not need the right conception of entropy to do science with entropy as long as one gets the right amounts of energy in the experiments. That’s probably why the conceptual inaccuracies have persisted for over a hundred years: they did not result in experimental or observational inaccuracies (in the measurement of the number of Joules/Kelvin of entropy).
This is a paraphrase of Bill Dembski’s earlier works, Design Inference and No Free Lunch. “Non-postdictive” is my bad choice of words, it should be “not-after-the-fact”.
Also, these formalisms add confusion and involve creation of definitions outside mainstream science and engineering. I’ve gotten more negative over time about resorting to Bill’s framing of the design argument in these terms. Even supposing he is right, the procedures of analysis are too cumbersome and hence less believable and convincing.
Also, I don’t think one can formally demonstrate intelligent design mathematically! One might, with some providence, formally demonstrate an object is the result of exceptional events. Whether those exceptional events are indicative of intelligence is a matter that doesn’t have formal resolution, but can only be accepted depending on one’s philosophical viewpoint. There is a right answer, of course, if something is intelligently designed, but I don’t think it can ever formally be proven for the simple fact we can’t formally prove anyone else is a conscious intelligent being. We only accept sentience in others as a reasonable starting assumption, not because we actually have hope of formal proof…
Yes. But that doesn’t mean I fully agree with it. I should have made a distinction in my essay between Dembski’s work and my belief about design. At the time I wrote the essay, I was more favorable to his work than I am today, so I wasn’t as careful to make the distinctions I’m making now.
You point out things that are problematic for detecting designs in biology. Some of the pattern matching is valid for the man-made world, but for the God-made world other approaches are necessary, and I don’t feel Bill’s work explored the area of detecting God-made designs adequately. His work went into extreme mathiness that wasn’t really accessible and was borderline relevant, imho.
There are easier ways to make the case for God-made designs than laid out in Bill’s work, with the caveat that there is never a formal proof that the method works, it must have some element of reasonable faith (like faith that I’m not the only sentient being in the universe).
Bill got the discussion going, and some of his ideas I borrow and try to refine. A few parts I’ve had to disassociate myself from.
I will hopefully post my alternate views of the design inference, which use neither the 2nd law of thermodynamics, nor information theory, nor Bill’s conception of specified complexity.
I may revise my essay a bit in light of your questions.
There’s no need for a “dumbed down” version in such textbooks. A child can understand the concept of entropy.
I’m still waiting for Salvador to acknowledge that there is a connection to be made between Shannon and Boltzmann.
A Farewell To Entropy: Statistical Thermodynamics Based on Information
ETA: Fixed the link. HT: keiths
Sal,
I appreciate Lambert’s efforts to expose the “entropy is disorder” misconception, but the idea he’s offering in its place — “entropy is energy dispersal” — also appears to be a misconception.
Energy is not dispersed when you take volumes of two gases at the same temperature and allow them to mix, but entropy most certainly increases.
I should add that if you substitute the idea that entropy is a measure of what you don’t know about a system — an idea advocated by Arieh Ben-Naim, the author of the book Mung mentions above — then the gas-mixing case makes perfect sense. Entropy increases without energy dispersal.
ETA: Mung’s link is broken at the moment. The book, A Farewell to Entropy, can be found here.
Well, as long as I’m shilling for amazon…
From wiki:
https://en.wikipedia.org/wiki/Entropy_(energy_dispersal)
The energy being dispersed is associated with distinguishable particles. Since the distinguishable particles are more spread out, the energy associated with them is spread out. Of course, in Lambert’s short description, this nuance is left out of what it means to say energy is dispersed. But as I said, I think this dumbed down version is superior to Lehninger or Larry Moran’s dumbed down version.
The entropy of mixing for gases can be expressed as the sum of entropy of expansions of the pure substances as described here (unfortunately this is from a chemistry site that equates entropy with disorder):
http://rkt.chem.ox.ac.uk/lectures/entropy.html
Entropy of expansion causes a dispersal of energy in a larger volume. See the last equation here:
https://en.wikipedia.org/wiki/Free_expansion
As long as the calculations and experimental results are the same, there usually isn’t a problem even if the conceptual understanding is flawed. But for myself “entropy is disorder” hurt my learning of the subject. If I were told Lambert’s version, I would have understood stuff a lot faster.
Whatever flaw you might find in Lambert’s version, if there is any, it is a lot less than what is in most chem books I see.
IIRC, all the physics and mechanical engineering books I’ve studied from avoid the word disorder to describe entropy. The “entropy is disorder” description is found mostly in creationist and chemistry texts.
Dispersal can be related to uncertainty in position of particles having kinetic energy. Increasing the volume a gas occupies increases uncertainty of the position of each particle. Thus dispersal can be expressed in terms of uncertainty.
Entropy can be described in terms of uncertainty and hence we can describe the uncertainty in terms of bits. Uncertainty is probably the most formally correct description, but that is not a dumbed down version of entropy.
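That bits framing can be made concrete with a small sketch (the particle count and volume ratio below are arbitrary illustrations, not values from this thread): doubling the volume available to a particle adds exactly one bit of positional uncertainty, and multiplying the same count by Boltzmann’s constant converts it back into thermodynamic units.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def expansion_entropy_bits(n_particles, v_ratio):
    """Gain in positional uncertainty, in bits, when each of
    n_particles can now be anywhere in v_ratio times the old volume."""
    return n_particles * math.log2(v_ratio)

def expansion_entropy_thermo(n_particles, v_ratio):
    """The same entropy change in J/K, via S = k ln W."""
    return n_particles * k_B * math.log(v_ratio)

# Doubling the volume available to one particle = 1 bit of uncertainty.
print(expansion_entropy_bits(1, 2))  # 1.0
```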
Dispersal is good enough, and
S = k ln W (Boltzmann)
is the most accurate. But in practice
delta-S = Integral (dq/T) (Clausius)
is what is measured in the lab since dq and T can be actually measured.
Counting microstates (Boltzmann) is mostly a thought exercise, since merely heating water isothermally with a 1000-watt heater for 100 seconds multiplies the number of microstates by a factor on the order of 10^(8.43 x 10^24). No one on this planet can possibly count that many microstates one at a time.
We could instead just state the entropy change as 268 J/K by using thermometers and electrical multimeters.
I doubt such a count could be done even with automated machines and supercomputers, and furthermore it is unnecessary for practical applications. Measuring temperature and the heat added or subtracted is sufficient in many practical cases, so Clausius’ indirect definition of entropy suffices, and so does the energy dispersal definition.
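For concreteness, the heater example can be worked through in a few lines (a sketch; it assumes the water is held isothermally at its boiling point, 373.15 K, which reproduces the 268 J/K figure):

```python
import math

k_B = 1.380649e-23       # Boltzmann constant, J/K

Q = 1000.0 * 100.0       # 1000 W heater running for 100 s, in joules
T = 373.15               # assumed: water boiling isothermally, K

delta_S = Q / T          # Clausius: delta-S = q/T for an isothermal process
print(round(delta_S))    # 268 (J/K)

# Boltzmann's side of the ledger: delta-S = k ln(W_after / W_before), so
# even this everyday process multiplies the microstate count by 10 raised
# to an astronomically large exponent.
log10_W_ratio = delta_S / (k_B * math.log(10))
print(log10_W_ratio)     # ~8.43e24, i.e. W grows by a factor of 10**(8.43e24)
```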
Sal,
So is disorder, if you don’t care about correctness. If you do care about correctness, then you ought to reject both — particularly when a superior alternative is available: entropy as a measure of what is not known.
(Ironically, the idea you’re rejecting — entropy as disorder — actually works in the gas-mixing case, while the idea you’re promoting — entropy as energy dispersal — does not.)
No one on this planet needs to count them one at a time. Scientists are smarter than that, Sal. Do you think Avogadro counted from 1 to 6.022 x 10^23?
Dispersal of energy is not the same thing as dispersal of molecules. In the gas-mixing case, molecules of A disperse into the volume occupied by molecules of B, and vice-versa, but energy remains evenly dispersed throughout both volumes.
Dispersal is far more accurate than disorder. Kinetic energy is defined with reference to particles; if the particles are dispersed, the kinetic energy associated with each particle is likewise dispersed (more uncertain in position). 🙄
That’s what entropy of expansion entails, the dispersal of energy because of the dispersal of particles. It’s way more correct than you give it credit for.
It’s the energy associated with distinguishable particles, therefore the energy associated with distinguishable particles is dispersed.
Furthermore, if we have lots more of one species of gas than another (say helium and neon), one would not say the energy is evenly dispersed between the helium and neon species, would we? We have to associate the energy with the species of particles in question: X joules of energy with the helium species and Y joules with the neon species. The energy in the helium species (after mixing) is more spread out (which is the same as saying the position of the helium atoms is more uncertain).
Sal,
Entropy as missing information is more accurate than as either energy dispersal or disorder. Why reject one misconception — disorder — only to replace it with another — energy dispersal? It’s irrational.
The amount of dispersal is a measure of uncertainty in position. It is not a misconception, it is qualitatively accurate.
Sal,
The energy dispersal doesn’t change during the mixing process. Energy is evenly distributed throughout both volumes both before and after mixing.
The dispersal of each gas increases, and so does the entropy, but the dispersal of energy does not increase.
Entropy as “dispersal of energy” is a misconception. Let it go, Sal.
Energy dispersal changes for each species of substance as described by entropy of expansion for that species.
No dice.
The microstates of a system in Boltzmann’s famous formula are defined by the set of positions and momenta of each particle. If the volume available to a species of gas is expanded, the number of position/momentum microstates increases because the number of possible positions increases. We can alternatively say that an increase in volume is an increase in uncertainty in position. Or we can say, qualitatively, that the gas is more dispersed in the bigger volume than in the smaller volume. If the gas is more dispersed, the greater dispersal creates more uncertainty in the position of each particle, and therefore more position/momentum microstates, which is the same as saying W (omega, the number of microstates) gets bigger.
S = k ln W
therefore, in the case of entropy of expansion, Lambert is right.
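The expansion case can be sketched numerically (assuming an ideal gas expanding isothermally; the mole count and volumes are illustrative):

```python
import math

R = 8.314  # gas constant, J/(K mol)

def expansion_entropy(n_mol, v_initial, v_final):
    """Entropy of isothermal ideal-gas expansion: delta-S = n R ln(V2/V1),
    the same form as the last equation on the free-expansion page above."""
    return n_mol * R * math.log(v_final / v_initial)

# One mole doubling its volume:
print(expansion_entropy(1.0, 1.0, 2.0))  # ~5.76 J/K, i.e. R ln 2
```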
But the energy is not necessarily evenly dispersed between the total molecules of each species (like, say, in the mixing of 1000 liters of helium and 1 liter of neon); only the average energy of each molecule (independent of species) is even. Do you not understand the difference between the total energy of an entire species vs. the average energy of the molecules in a species?
It is between distinguishable species that entropy of mixing makes sense. Btw, this is the topic of the Mixing Paradox
We put X joules of energy with the helium species and Y joules of energy with the neon species. The energy in the helium species (after mixing) is more spread out because the helium atoms are more spread out (which is the same as saying the position of the helium atoms is more uncertain in a larger volume). Same with the neon.
The change in entropy of the helium species during mixing is described by the entropy of expansion of the helium species. The change in entropy of neon species in the mixing is described by the entropy of expansion for the neon species. The entropy of mixing is the sum of the entropy of expansion of the helium species and the entropy of expansion of the neon species.
delta-S_mixing = delta-S_helium_expansion + delta-S_neon_expansion
or dare I say
delta-S_mixing = delta-S_helium_dispersal + delta-S_neon_dispersal
See a similar expression here except they use the example of
delta-S_system = delta-S_air + delta-S_argon
http://web.mit.edu/16.unified/www/FALL/thermodynamics/notes/node54.html
Since the dispersal description accurately describes the entropy of expansion, and since the entropy of mixing is the sum of the entropies of expansion of the distinguishable species, the dispersal description can accurately be used to describe the entropy of mixing, unless one is belligerent like you and determined to save face rather than admit one might not have been right.
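That additivity is easy to check numerically. A minimal sketch (assuming 1 mol of each ideal gas, each starting in 1 cubic meter and expanding into the combined 2 cubic meters):

```python
import math

R = 8.314  # gas constant, J/(K mol)

def expansion_entropy(n_mol, v_ratio):
    """Isothermal ideal-gas expansion entropy: delta-S = n R ln(V2/V1)."""
    return n_mol * R * math.log(v_ratio)

# Each species expands from 1 m^3 into the shared 2 m^3:
dS_helium = expansion_entropy(1.0, 2.0)
dS_neon = expansion_entropy(1.0, 2.0)

# Entropy of mixing = sum of the two expansion entropies.
dS_mixing = dS_helium + dS_neon
print(dS_mixing)  # ~11.53 J/K, i.e. 2 R ln 2
```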
keiths:
Sal:
But nothing.
You endorsed Lambert’s claim that
The distribution of energy does not change in the gas-mixing scenario. Energy is evenly distributed throughout both volumes both before and after mixing.
Entropy does change, however. It increases.
If the dispersal of energy doesn’t increase, but the entropy does increase, then they cannot be the same thing. It’s simple logic, Sal.
It appears that you see your mistake, though you won’t admit it, because you are now pushing a different definition:
Did you think I wouldn’t notice the switch?
We can certainly go on to discuss your new definition, but first I’d like you to acknowledge that your original definition — Lambert’s — doesn’t work:
That’s incorrect. Agreed?
Sal
Keiths is correct here (assuming a closed system).
The spatial distribution of energy for each species changes after mixing, even if the average energy per unit volume before and after mixing does not change.
Lambert also explains what he means about energy dispersal in the case of mixing and referencing other chemists, not just himself:
Energy for the species A becomes more spatially dispersed. Energy for species B becomes more spatially dispersed. This is true even if the total energy for the entire system doesn’t change and even if the average energy per unit volume doesn’t change.
You keep ignoring the expansion entropy.
You want to re-interpret the sense of what Lambert means by dispersal, so you can win an argument, that’s up to you, but that’s not what Lambert means. You’re knocking down strawmen.
But for the readers’ benefit, let keiths show his willful misreading by trying to answer these questions.
Let there be two gas species that are initially separated by a barrier and at the same temperature: Species A and Species B.
Now after the barrier is removed and Species A and Species B mix,
does the total amount of energy in Species A change? (Answer No.) Does the total amount of energy in Species B change? (Answer No.)
Does the average number of joules/cubic meter in the system change? (Answer No).
Do the particles in Species A become more spatially dispersed? (Answer Yes).
Do the particles in Species B become more spatially dispersed? (Answer Yes)
So every particle of Species A has a kinetic energy associated with it? (Answer Yes, 1/2 mv^2).
Is the sum total of energy in Species A distributed to every particle in Species A? (Answer Yes)
So the energy of Species A is more spatially dispersed (in joules/cubic meter) after mixing, since the particles of Species A are more spatially dispersed? (Answer Yes)
So the energy of Species B is more spatially dispersed (in joules/cubic meter) after mixing, since the particles of Species B are more spatially dispersed? (Answer Yes)
So the energy of each individual species can be more spatially dispersed even if the average energy per unit volume in the system before and after mixing stays the same? (Answer Yes).
Keiths is wrong and refuses to recant as usual.
I’ve referenced web sources that show the math. I’ve referenced Lambert describing how dispersal applies to mixing. If Keiths wants to misread what an author said, that’s up to him, but the author himself covers the case of mixing. I’ve explained what Lambert and others mean by energy dispersal in the case of mixing of distinguishable species. The energy of each species is more dispersed.
I could work through examples with actual numbers.
Before going to that trouble, could you just confirm whether you think the overall energy of the system has changed from before to after mixing?
I’m 37.5% enjoying your percentage estimates. My confidence in them is down in the teens, though.
Below is a depiction of two gas species before they mix.
If Gas A is helium and Gas B is neon, does the entropy increase after they mix? Answer: Yes.
Is there a change in average energy per unit volume in the system even though there is a change in entropy? Answer: No.
If Gas A is helium and Gas B is also helium, does the entropy increase after they mix? Answer: No!
Is there a change in average energy per unit volume in the system even though there is no change in entropy? Answer: No.
If Gas A is helium, why is there an entropy change when Gas B is neon, but not if Gas B is helium? Entropy change can happen only if Gas A and Gas B are distinguishable (like Gas A being helium and Gas B being neon). The distinguishability of the gas molecules allows statements to be made about the spatial distribution of energy residing in specific species of molecules (be they helium molecules or neon molecules). I will show this with some numerical examples to help the reader.
The importance of the gases being distinguishable is highlighted here in the mixing paradox:
https://en.wikipedia.org/wiki/Gibbs_paradox#The_mixing_paradox
But Sal, you originally asked:
Why, then, does the entropy change critically depend on whether the two species are distinguishable? You claim:
Even if the gases are indistinguishable, you can still “make statements” about the spatial distribution of energy residing in molecules that came from each original subcontainer.
But you’ll get the wrong answer, because keiths is correct.
I’ll get me coat.
stcordova,
Just ask yourself, when looking at your diagram, what difference it makes if, instead of gas B, there is the same amount (an equal number of molecules/atoms) of gas A.
Addendum:
Keiths’ folly can be better borne out by letting Gas A be a monoatomic ideal gas (something like helium), and Gas B a diatomic ideal gas (something like O2).
Refer to this description for the internal energy of volumes of Ideal gases:
https://en.wikipedia.org/wiki/Ideal_gas#Classical_thermodynamic_ideal_gas
Suppose we have 1 mol of Gas A (monoatomic) in 1 cubic meter on the left side of the system and 1 mol of Gas B (diatomic) in 1 cubic meter on the right side of the system before mixing, with both gases at 300 K.
According to the formula for ideal monoatomic gases, the total internal energy U of Gas A on the left side is defined by
U = cv n RT
since Gas A is monoatomic, cv = 3/2, the internal energy of Gas A is:
(3/2) (1 mol) (8.314 J/(K mol)) (300 K) = 3741.3 J
According to the formula for diatomic ideal gases, since Gas B is diatomic, cv = 5/2, the internal energy of Gas B is:
(5/2) (1 mol) (8.314 J/(K mol)) (300 K) = 6235.5 J
The energy per unit volume is 3741.3 J/cubic meter on the left side (Gas A) and 6235.5 J/cubic meter on the right side (Gas B) before mixing.
After mixing, the energy per unit volume due to the molecules of Gas A is
(3741.3 / 2) J/cubic meter = 1870.65 J/cubic meter
for Gas B
(6235.5 / 2) J/cubic meter = 3117.75 J/ cubic meter
The total energy of the system is the sum of the energy in the gases:
3741.3 J + 6235.5 J = 9976.8 J
So after mixing, the average energy per volume is
(9976.8 /2 ) J / cubic meter = 4988.4 J/ cubic meter
But that figure makes no sense before mixing because clearly the energy is NOT evenly distributed before mixing even though we’re dealing with the same number of molecules (measured in moles) of each species of gas at the same temperature!
Before mixing, the energy per unit volume on the left side is 3741.3 J/cubic meter and on the right side is 6235.5 J/cubic meter. Clearly the internal energy before mixing is not evenly distributed!
This describes how the energy of Gas A becomes more spatially dispersed after mixing resulting in a volume kinetic energy density change from 3741.3 J/cubic meter to 1870.65 J/cubic meter. Likewise the energy of Gas B becomes more spatially dispersed after mixing resulting in a volume kinetic energy density change from 6235.5 J/cubic meter to 3117.75 J/ cubic meter.
Thus Keiths is clearly wrong on so many levels.
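The arithmetic above can be reproduced in a few lines (a check of the figures using the same cv values and U = cv n R T, not anyone’s official code):

```python
R = 8.314   # gas constant, J/(K mol)
T = 300.0   # K
n = 1.0     # mol of each gas, each initially occupying 1 m^3

# Ideal-gas internal energy: U = cv * n * R * T
U_mono = 1.5 * n * R * T   # Gas A, monoatomic: cv = 3/2
U_di = 2.5 * n * R * T     # Gas B, diatomic: cv = 5/2
print(U_mono, U_di)        # ~3741.3 J and ~6235.5 J

# After mixing, each species occupies 2 m^3, so its energy density halves:
print(U_mono / 2.0, U_di / 2.0)   # ~1870.65 and ~3117.75 J/cubic meter

# The system-wide average density, meaningful only after mixing:
print((U_mono + U_di) / 2.0)      # ~4988.4 J/cubic meter
```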
Sal,
Introducing a diatomic gas can’t help you when you’re already wrong about the monatomic case that you yourself specified:
Take a look at your second sentence. You are making my point for me.
Entropy can’t be a measure of energy dispersal if entropy increases while energy dispersal remains the same.
You’re conflating the average energy of the solution with the average energy of the species!
If you want to misread what Lambert is trying to say, that’s up to you, but that’s a strawman, and his colleagues would call you out on it because they (like Levine) agree with his definition, and Levine even specifically referenced the mixing situation.
The energy dispersal is with respect to the energy of the species not the energy dispersal of the mix of species.
I can do the numbers with monoatomic ideal gases too. Unlikely it will click for you, but it might click for the readers.
Suppose we have 1 mol of Gas A (monoatomic) in 1 cubic meter on the left side of the system and 1 mol of Gas B (monoatomic) in 1 cubic meter on the right side of the system before mixing, with both gases at 300 K.
According to the formula for ideal monoatomic gases, the total internal energy U of Gas A on the left side is defined by
U = cv n RT
since Gas A is monoatomic, cv = 3/2, the internal energy of Gas A is:
(3/2) (1 mol) (8.314 J/(K mol)) (300 K) = 3741.3 J
According to the formula for monoatomic ideal gases, since Gas B is monoatomic, cv = 3/2, so the internal energy of Gas B is:
(3/2) (1 mol) (8.314 J/(K mol)) (300 K) = 3741.3 J
The energy per unit volume is 3741.3 J/cubic meter on the left side (Gas A) and 3741.3 J/cubic meter on the right side (Gas B) before mixing.
After mixing, the energy per unit volume due to the molecules of Gas A is
(3741.3 / 2) J/cubic meter = 1870.65 J/cubic meter
for Gas B
(3741.3 / 2) J/cubic meter = 1870.65 J/ cubic meter
The total energy of the system is the sum of the energy in the gases:
3741.3 J + 3741.3 J = 7482.6 J
So before and after mixing, the average energy per volume is
(7482.6 / 2) J/cubic meter = 3741.3 J/cubic meter
This describes how the energy of Gas A becomes more spatially dispersed after mixing resulting in a volume kinetic energy density change from 3741.3 J/cubic meter to 1870.65 J/cubic meter. Likewise the energy of Gas B becomes more spatially dispersed after mixing resulting in a volume kinetic energy density change from 3741.3 J/cubic meter to 1870.65 J/ cubic meter.
Thus even in the monoatomic case, Keiths is shown to be clearly wrong, because the volume kinetic energy density for each species of gas drops from 3741.3 J/cubic meter to 1870.65 J/cubic meter. This happens because the energy of each species becomes more spatially dispersed, just as Lambert said.
In the case of Gas A being identical to Gas B (like Gas A helium and Gas B helium), because the species are not distinguishable and thus interchangeable, the volume kinetic energy density for each species of gas remains at 3741.3 J/cubic meter before and after mixing, in agreement with the mixing paradox.
https://en.wikipedia.org/wiki/Gibbs_paradox#The_mixing_paradox
You disagree with me; here is a chemistry tutorial that conveys basically the same approach I used:
http://www.chem1.com/acad/webtext/thermeq/TE4B.html
and
Now let us repeat the experiment, but starting this time with “red” molecules in one half of the container and “blue” ones in the right half. Because we now have two gases expanding into twice their initial volumes, the changes in S and G will be twice as great:
ΔS = 2 R ln 2, ΔG = –2 RT ln 2. But notice that although each gas underwent an expansion, the overall process in this case is equivalent to mixing.
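The quoted figures check out numerically (a sketch; room temperature, 298 K, is assumed for the ΔG value since the excerpt does not specify a temperature):

```python
import math

R = 8.314   # gas constant, J/(K mol)
T = 298.0   # assumed room temperature, K

# Two gases each doubling in volume, per the "red"/"blue" experiment:
dS = 2.0 * R * math.log(2)        # delta-S = 2 R ln 2
dG = -2.0 * R * T * math.log(2)   # delta-G = -2 R T ln 2 (isothermal)
print(dS)   # ~11.53 J/K
print(dG)   # ~-3435 J
```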
So this shows other chemists saying generally what I just said. If you want to misread and not realize the dispersal refers to the energy in the individual species during the mixing process, that’s up to you, but I was able to figure out how Lambert was using his terms and so have others.
You want to define for Lambert what he meant by what he said. Not cool. That’s because you’re keiths, and you can’t ever let yourself admit publicly at TSZ that you might be wrong about anything.
So says the guy who has problems understanding 1.5% does not equal 0.1%:
But regarding your question, and you obviously weren’t bothering to read the links I was giving Keiths:
https://en.wikipedia.org/wiki/Gibbs_paradox#The_mixing_paradox
Read and weep.
Oh, I read the links before you cited them, Sal.
Precisely. Thank you for making my point for me. But you failed to answer my question: WHY, under your “dispersal of energy” definition, does the entropy of mixing depend on whether the gases can be distinguished?
Under keiths’s definition, the difference between the two experiments follows inexorably, as demonstrated by your “red” vs “blue” thought experiment. Under your definition, there’s no difference (in energy dispersal) between the red vs blue case and the ‘left-side’ vs ‘right-side’ case. You are reduced to creating an ad hoc “but they’re not different, so I apply the equations differently” get-out-of-jail card.
You have shown no sign whatsoever that you understand the basis for this get-out-of-jail card.
ETA: I did enjoy re-reading that SINES thread. Thank you for the reminder 🙂
Obviously not. He is still counting, and will be for some time yet.
Coins then?