The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It's a well-regarded college textbook. But there's a minor problem regarding the book's Granville-Sewell-like description of entropy:
The randomness or disorder of the components of a chemical system is expressed as entropy,
Nelson, David L.; Cox, Michael M. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.
Gag!
And from the textbook written by our very own Larry Moran:
enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.
Principles of Biochemistry, 5th Edition, Glossary
Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto
Choke!
Thankfully, Dan Styer almost comes to the rescue in his criticism of ID proponent Granville Sewell:
On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.
…
“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.
Is Entropy Disorder?
Dan Styer criticizing Granville Sewell
Almost well said. Entropy is often correlated with disorder, but it isn't the same thing as disorder. Shoe size is correlated with reading ability (a small shoe size suggests we're dealing with a toddler, and therefore a non-reader), but shoe size is not reading ability. Perhaps "analogy" is too generous a way to describe the mistake of saying entropy is disorder. But it's a step in the right direction.
Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:
Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.
Mike Elzinga
2lot trouble
Hear, hear.
But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger's torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?
“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”
Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.
Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:
That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?
Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”
There is no basis in physical science for interpreting entropy change as involving order and disorder.
So, in slight defense of Granville Sewell and the numerous other creationists, from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus, who equate entropy with disorder: when we ask "where did they get those distorted notions of entropy?", we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!
Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu
April 2014
The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..
Lehninger's and Moran's books aren't on that list. Their books, however, did make the list of biochem books judged by some Australians to be decreasing in fitness:
The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham
…
Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002
…
A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate
Here is chemistry professor Frank Lambert’s informal definition of entropy:
Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.
This is the correct Ludwig Boltzmann entropy as written by Planck:
S = k log W
where
S = entropy
k = Boltzmann's constant
W = number of microstates
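To make this concrete, here is a minimal Python sketch of the Boltzmann/Planck formula. The microstate count W below is a made-up number purely for illustration; the "log" in S = k log W is the natural logarithm.

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(num_microstates):
    # S = k * ln(W) for a system of W equally probable microstates
    return K_B * math.log(num_microstates)

# Hypothetical example: a tiny system with W = 1e6 accessible microstates
print(boltzmann_entropy(1e6))  # ~1.9e-22 J/K
```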
Also there is Clausius:
ΔS = ∫ dQ/T
where
ΔS = change in entropy
dQ = inexact differential of heat
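Since the melting ice cube comes up repeatedly later in this thread, here is a minimal Python sketch applying the Clausius form to ice melting at constant temperature. The latent heat (about 334 J/g) and melting point (273.15 K) are standard handbook values; the 10 g mass is arbitrary.

```python
def entropy_of_melting(mass_g, latent_heat_j_per_g=334.0, temp_k=273.15):
    # Clausius entropy change for melting at constant T: delta_S = Q / T
    q = mass_g * latent_heat_j_per_g  # heat absorbed reversibly, in joules
    return q / temp_k                 # entropy change in J/K

# A 10 g ice cube melting at 0 degrees C
print(entropy_of_melting(10.0))  # ~12.2 J/K
```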
As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”
My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.
This seems to be yet another one of Sal’s mindless assertions, based on his apparent ignorance of anything beyond High School physics and chemistry concepts. Clausius / Carnot is used to introduce high school students to entropy using dQ/T to discuss the theoretical efficiency of steam engines (well, ideal gases in pistons), and to explain to teenage chemistry students why endothermic reactions may be favored.
Professional chemical engineers and mechanical engineers are more likely to be concerned with the economic efficiency of a process rather than the thermodynamic efficiency (e.g. ammonia is synthesized at high temperature: the equilibrium position is worse, but you get there much quicker) so activation energy is what drives the choice here. Engineers who design heat exchangers care about rates of heat transfer, not the entropy.
And once we start talking about scientists, then dQ/T is rarely informative: chemists talk (sometimes somewhat loosely) about ‘freezing out’ degrees of freedom (chelate effect) to make binding less entropically disfavored, or the benefits of a conformationally constrained ligand. That’s counting available microstates all the way; there’s no talk of dQ/T…
I could ask my physicist and materials scientist buddies what they use, but I somehow doubt that would change anyone’s opinions…
There is only one way to define entropy correctly: minus rho log(rho).
If only they could put the entropy in a bottle and sell it.
For Sal.
walto,
To say that the information is missing means that an observer lacks it. (The information isn’t missing from the world, after all.) Another observer may lack a different amount of information. A third observer may lack none at all.
Entropy is a measure of missing information: the gap between the information contained in the macrostate and the information required to pin down the exact microstate. The choice of experiments/measurements determines the macrostate, and the entropy is a function of the macrostate.
Yolanda measures isotopic concentrations, while Xavier does not. They end up with different macrostates and therefore different entropies. The difference is not in the system — they’re observing the same one.
The difference is in the macrostates, which are observer-dependent. Entropy is observer-dependent.
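A toy sketch of keiths' point, not his own example: enumerate the microstates compatible with each observer's macrostate and take the log. The four-particle setup below is invented for illustration; Yolanda's finer measurement (resolving the two isotopes) leaves fewer compatible microstates, hence less missing information and lower entropy, even though the physical system is the same.

```python
from itertools import product
from math import log2

# Toy system: four labelled particles, each in the Left or Right half of a box.
# Particles 0 and 1 are isotope A; particles 2 and 3 are isotope B.
microstates = list(product("LR", repeat=4))   # all 2**4 = 16 microstates

def entropy_bits(compatible):
    # Missing information: log2 of the number of equally probable compatible microstates
    return log2(len(compatible))

# Xavier's macrostate: he measures only the total number of particles on the Left (he finds 2)
xavier = [m for m in microstates if m.count("L") == 2]

# Yolanda's macrostate: she also resolves isotopes, finding one A and one B on the Left
yolanda = [m for m in microstates
           if m[0:2].count("L") == 1 and m[2:4].count("L") == 1]

print(len(xavier), entropy_bits(xavier))    # 6 microstates, ~2.58 bits
print(len(yolanda), entropy_bits(yolanda))  # 4 microstates, 2.00 bits
```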
keiths,
I mostly agree with that post. I’d put two things a bit differently, though. And, as said, this is a tempest in a teapot.
I suppose it's OK to put it that the macrostates are different as between the assessments of the two observers, but only if one remembers that, if one uses that sort of locution, "macrostates" are maps, not territories. I prefer to think of macrostates as states of the world, supervening on microstates, and the world doesn't change from Xavier's observation to Yolanda's. The entropies do, however, because the entropies they calculate are a function of the information each observer has and lacks. One has more, and so calculates a lower entropy. So I think it's less misleading to think of the entropies as maps than as territories, because the world is as it is.
Recognize both sides of this coin: on one side, the necessity of the physical properties and the delimitation of the relevant ones; on the other, that the calculation is a matter of measuring what is not known about the microstates from this or that perspective. Then this whole matter is little more than a quibble. Exciting for those who like them (or like to fight), but, IMO, likely to be of little interest either to philosophers, or to working scientists.
Quoted by walto. Glad to respond because Walto brought it to my attention.
I took graduate-level statistical mechanics and thermodynamics from Johns Hopkins University Whiting School of Engineering, in addition to undergraduate thermodynamics for mechanical engineers, which I took as an elective as a student of electrical engineering.
You’re welcome to remain in your world of self delusions about my educational attainment in these topics.
I provided three different ways to calculate entropy from numerous examples.
I studied under physicists and material scientists at the Applied Physics Lab of JHU.
Bloviate away, DNA_jock. How about you provide for the readers examples of your skill in computing the entropy change of a melting ice cube using your knowledge of Boltzmann microstates or Shannon entropy to the exclusion of the Clausius definition that involves heat and temperature.
Here are entropy measurements at the molecular level being done on protein-ligand interactions. Do you know why we do things empirically with Clausius vs. Shannon? We have to confirm our theoretical models, and complex folded molecules like proteins are pretty hard to model and compute the number of Boltzmann microstates involved.
Show for the readers how you do this measurement with your non-existent "missing information" meters or your Boltzmann microstate counters that you and Keiths swear by.
This is a good example of why you're on my ignore list. You insist I don't know something; you delude yourself that I only know high-school-level statistical mechanics and thermodynamics because you refuse to take my word for it that I actually have more academic training than that. Wallow in your self-delusions, but I'm not eager to spend time responding unless Walto or colewd request a response from me regarding your charge.
The paper referenced above talked about Differential Scanning Calorimetry which was used to measure(infer) entropy change.
In contrast, you're welcome to apprise the readers of your non-existent "missing information" meters or microstate counters since you present yourself as having expert knowledge in these matters. Go ahead, DNA_jock, or how about you just buzz off this discussion since you're not adding much to it.
How do we measure the information contained in the macrostate?
Tell that to Dr. Mike Elzinga who is a physicist and professional in thermodynamics!
So how do you count the microstates DNA_jock? LOL!
How do you confirm you actually counted them right? Like, ahem, using calorimeters and thermometers.
Baloney! I just gave an example of analysis of ligand and protein binding and measuring entropy; it was done with calorimeters and thermometers, the old-fashioned Clausius way, not using your non-existent microstate counters.
How do you know what the physical degrees of freedom are for proteins? Have you solved all protein folding problems? LOL!
https://www.ncbi.nlm.nih.gov/pubmed/9541869
You better bust out those non-existent “missing information” meters and those non-existent microstate counters, DNA_jock, because it may not be so easy to calculate the microstates of proteins from first principles.
Tell that to Dr. Mike.
All that education and Salvador still can’t calculate the missing information.
Yeah DNA_Jock. Have you solved all protein folding problems? Have you? So there!
Oh, Sal,
You appear to have shot yourself in the foot again. Another indication that you took all those fancy classes, but still don’t seem to have learnt anything.
In your attempt to refute my statement that
you cite Karplus and Janin, 1999, quoting them thus:
Reading your quote, the word “purports” jumps out at me. What are the authors actually saying here? Well it’s only a one page commentary, so I encourage readers here to look for themselves…
Here’s the paragraph preceding the one that Sal quote-xxxxx:
It’s all about degrees of freedom, and nary a mention of dQ/T…
Karplus and Janin go on to explain that Tamura and Privalov did the wrong experiment.
Sal has, once again, conflated an experimental technique (Differential Scanning Calorimetry) that is used in an attempt to measure entropy changes (in this case, unsuccessfully) with the underlying understanding about what is actually going on: the freezing out of degrees of freedom, including the hydrophobic effect. You may have missed my mentioning the latter, what with the fingers in the ears an’ all.
You are making my point for me. Yet again.
Oh, thermo courses for engineers.
My father and brother got their engineering degrees from a university that, almost uniquely, requires its students to understand the equations they plug values into.
Either JHU falls in the “just plug in the numbers” category, or Sal is not representative of their engineering graduates. Or both.
From Lambert's website.
Lambert could have gone into the complexities of microstates and degrees of freedom, but he gave a common sense explanation.
And I highlight again this simple echoing of Clausius by Lambert:
Let DNA_jock, Keiths, and Mung explain entropy like this with "missing information"; let them explain the differences in entropy of graphite and diamond with "missing information" in as clear and lucid a way as Lambert did.
Sal,
You’ve also described yourself as “a mediocre student of science and engineering at best”. Based on your performance in this thread, I think we can all agree on that.
The question is, are you capable of learning from those who are not mediocre students?
From wiki entry on heat exchangers:
http://www.sciencedirect.com/science/article/pii/S0360544213008074
Maybe I should undo my ignore button so I can see the latest ways DNA_jock embarrasses himself. Hahaha!
walto,
From that comment, it appears that you now concede that entropy is observer-dependent.
Just to be clear, do you now accept that both Xavier and Yolanda correctly calculate the entropy, and that your earlier claim — that Yolanda’s entropy value is more accurate than Xavier’s — is incorrect?
Mung,
You don’t have to. You can go straight to the difference — the entropy.
Take the “odds before evens” macrostate for the deck of five cards:
There are only 12 microstates that are compatible with the macrostate. Each of them is equally probable. The entropy is therefore log(12).
That is, the entropy is the difference between knowing the exact microstate and knowing that it is one of 12 equally probable possibilities.
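A brute-force check of that count, assuming the five cards are numbered 1 through 5 and "odds before evens" means every odd card precedes every even card (assumptions of mine, not spelled out above):

```python
from itertools import permutations
from math import log2

cards = [1, 2, 3, 4, 5]

def odds_before_evens(arrangement):
    # True if every odd card comes before every even card
    last_odd = max(i for i, c in enumerate(arrangement) if c % 2 == 1)
    first_even = min(i for i, c in enumerate(arrangement) if c % 2 == 0)
    return last_odd < first_even

compatible = [p for p in permutations(cards) if odds_before_evens(p)]
print(len(compatible))        # 12 = 3! orderings of the odds times 2! orderings of the evens
print(log2(len(compatible)))  # ~3.58 bits of missing information (log base 2 is a choice of units)
```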
stcordova,
Wow, just wow!
The statement you highlight
is wrong. Whether it is stable or not is not the point; a (reasonably stable) state is chosen by convention and must be specified. His use of “dispersed” is rather gratuitous, too. Energy “absorbed” would do just fine. Is he trying to argue in favor of a particular view, ya think?
Then the howler:
No, it doesn't "require some energy". Just think, man! The mere fact that the atoms CAN slide over each other cannot require more energy. Only if they DO so move is any energy involved. It's an additional degree of freedom, and therefore there are more microstates, and there's higher entropy in graphite than in diamond.
Yikes.
Regarding heat exchangers, I’m guessing that Hajmohammadi et al. added that bloviation about entropy in order to get their goofy paper published in the Journal “Energy”. Color me underwhelmed by the result that a 180° bend causes a greater pressure drop than separated 90° bends. It’s the pressure drop that matters.
I mean the paper begins:
Say what? Not since 1923, they aren’t.
Yeah. Someone could make an Excel spreadsheet that pumps out the results of calculations. Oh. Wait.
HEY!
My suggestion is that you undo your Ignore button and just try to have a normal discussion.
keiths: Entropy is a measure of missing information: the gap between the information contained in the macrostate and the information required to pin down the exact microstate.
Mung: How do we measure the information contained in the macrostate?
keiths: You don’t have to. You can go straight to the difference — the entropy.
Entropy is a measure of the information gap. That is what you wrote. If I don’t know where I started, and where I ended up, then how do I know what the difference was?
Information contained in the macrostate.
Information contained in the microstate.
Measure the difference
Call it Entropy.
How do you decide the amount of information contained in the macrostate? If you can’t say, I’ll understand completely. All along in this thread I think you speak too loosely and fail to make important distinctions.
Let me ask a little differently. How do you quantify “the amount of information” present in the macrostate? That quantity would seem to me to be very important to any who thinks entropy is qualitative.
So all substances are energy dispersal sinks, else they would not exist in any stable form? I readily confess my immeasurable ignorance here.
So in the “energy dispersal” view energy is dispersed from the environment (whatever that means) and into substances? How does that work for an isolated system?
Salvador, can you explain why the authors of the textbook Molecular Driving Forces: Statistical Thermodynamics in Biology, Chemistry, Physics, and Nanoscience are mistaken?
Widely adopted in its First Edition, Molecular Driving Forces is regarded by teachers and students as an accessible textbook that illuminates underlying principles and concepts.
What Does a Partition Function Tell You?
The partition function is the connection between macroscopic thermodynamic properties and microscopic models. It is a sum of Boltzmann factors that specify how particles are partitioned throughout accessible states.
– Molecular Driving Forces, p. 176
LoL.
The Boltzmann distribution says that more particles will have low energies and fewer particles will have high energies. Why? Particles don’t have an intrinsic preference for lower energy levels. Fundamentally, all energy levels are equivalent. Rather, there are more arrangements of the system that way.
– Molecular Driving Forces, p. 173
Sorry Sal.
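For what it's worth, the idea in those two quotes can be sketched in a few lines of Python: the partition function is just a sum of Boltzmann factors, and the occupation probabilities fall out of it. The three energy levels below are made-up values with spacing of order kT, purely for illustration.

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_probabilities(energies_j, temp_k):
    # p_i = exp(-E_i / kT) / Z, where Z is the partition function
    factors = [math.exp(-e / (K_B * temp_k)) for e in energies_j]
    z = sum(factors)                # the partition function: a sum of Boltzmann factors
    return [f / z for f in factors]

# Hypothetical three-level system at 298 K (level spacing ~ kT); lower levels get more particles
levels = [0.0, 2.0e-21, 4.0e-21]    # joules, invented for illustration
print(boltzmann_probabilities(levels, 298.0))  # ~[0.50, 0.31, 0.19]
```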
Moved a comment to Guano. Please address the ideas and not perceived proclivities of particular participants.
ETA: More alliteration.
It is very difficult to get an exact calculation of microstates, because most mathematical models are idealizations to make them tractable. Sometimes, if the system is simple, we can get a good approximation and calculate microstates, but the empirical measurement has the final say about the accuracy of the model; that's why this is a silly statement:
If we knew specific heats exactly via computation, we could compute microstates to the Nth decimal, but we can't; our math models are idealizations. Where are DNA_jock's microstate-counting machines?
Hence one cannot count microstates "all the way" as DNA_jock asserted. The only way I counted microstates for the case of a melting ice cube was to first compute dQ/T, so I really didn't count them, did I?
DNA_jock has yet to show he can actually count the microstates of a simple melting ice cube accurately without dQ/T!
When double Nobel prize winner Pauling computed the entropy of ice (aka counted microstates), it wasn’t counting all the way. Pauling was pretty accurate, but not exact. Even Pauling had to reference the experimentally measured values to confirm his count of microstates was reasonably accurate!
http://www.uni-leipzig.de/~biophy09/Biophysik-Vorlesung_2009-2010_DATA/QUELLEN/LIT/A/B/3/Pauling_1935_structure_entropy_ice.pdf
Of course we know more now than back in 1935, but it is not so easy to actually count microstates! We use the empirical data to see how accurate our mathematical idealizations or computational numerical models are. That's why this is a ridiculous statement:
Tell that to Dr. Mike, tell that to Linus Pauling.
The empirical measurements of entropy supported the hypothesized molecular models of ice structure, not the other way around. We sometimes don't have an accurate model of the molecular structure first that allows us to count microstates; we have to make measurements of entropy, and then that informs us whether our math model is correct or not. We compute the count of microstates from the math model, and then we compare it to the empirical measurements of entropy to see if we counted correctly and hence had the right model. DNA_jock's comment is a mess.
ROTFL! How do you think Linus Pauling concluded the 4-hydrogen-bond model of ice was likely correct? Er, he referenced the actual measurements of internal energy (from which we can compute entropy).
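For readers who want to check, Pauling's estimate of the residual entropy of ice reduces to S ≈ R ln(3/2) per mole, which can be compared with the calorimetric value of roughly 0.8 cal/(mol·K). A short sketch, with the comparison value quoted approximately rather than from the cited paper:

```python
import math

R = 8.314462618   # gas constant, J/(mol*K)
CAL = 4.184       # joules per calorie

s_residual = R * math.log(1.5)   # Pauling: (3/2) hydrogen-bond configurations per molecule
print(s_residual)                # ~3.37 J/(mol*K)
print(s_residual / CAL)          # ~0.81 cal/(mol*K), close to the measured residual entropy (~0.8)
```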
If we could actually count microstates exactly, or compute specific heats exactly, what would be the need for the Differential Scanning Calorimeters used by chemists, physicists, materials scientists, etc.?
In addition to differential scanning calorimetry we have other techniques to measure dQ/T.
Take a relatively simple phase of matter: gases. We have idealizations of monoatomic, diatomic, and polyatomic gases. From specific heats we can compute entropy, but note, the specific heats derived from idealized math models of the ideal gases do not agree exactly (albeit pretty closely) with empirical measurements.
See for yourself that there is a slight difference between the theoretical specific heat (and hence entropy) and the actual specific heat. The Sackur-Tetrode equation is only an approximation and will definitely fail when the gases cool and liquefy or solidify.
You can see the estimates of the specific heat (and thus the resulting entropy figures) aren't exact: not too bad for monoatomic and diatomic gases, but starting to get awful for polyatomic gases. Things will generally get worse for the solid state, and nightmarish for the liquid state of substances.
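To illustrate how close (and how approximate) the theoretical numbers are, here is a sketch of the Sackur-Tetrode entropy for one mole of helium at 298.15 K and 1 bar. The tabulated standard molar entropy of helium is about 126 J/(mol·K); the equation lands very close for this easy monoatomic case, which is the point being made above.

```python
import math

# Physical constants, SI units
K_B = 1.380649e-23      # J/K
H   = 6.62607015e-34    # J*s
N_A = 6.02214076e23     # 1/mol
AMU = 1.66053907e-27    # kg

def sackur_tetrode_molar(mass_amu, temp_k, pressure_pa):
    # Standard molar entropy of a monoatomic ideal gas, J/(mol*K)
    m = mass_amu * AMU
    v_per_particle = K_B * temp_k / pressure_pa                        # ideal-gas volume per particle
    inv_lambda_cubed = (2 * math.pi * m * K_B * temp_k / H**2) ** 1.5  # 1 / (thermal wavelength)^3
    return N_A * K_B * (math.log(v_per_particle * inv_lambda_cubed) + 2.5)

# Helium at 298.15 K and 1 bar; tabulated value is about 126 J/(mol*K)
print(sackur_tetrode_molar(4.0026, 298.15, 1.0e5))  # ~126 J/(mol*K)
```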
Bottom line, DNA_jock's comment is stupid. There is a reason we have Differential Scanning Calorimeters and other fancy ways to measure heat and temperature: because we don't count microstates all the way.
Patrick:
The Guanoed comment can be found here.
The following paper makes no mention of microstates; despite this, DNA_jock says entropy is deduced by professionals by "counting available microstates all the way; there's no talk of dQ/T".
I've pointed out that thermometers and calorimeters are the ways entropy is measured by chemists. Entropy is not measured by non-existent microstate counters, or by inaccurate microstate counting via the idealized (and therefore inaccurate) math/physics models that DNA_jock swears by. Neither is it measured by the "missing information" meters that Keiths swears by.
Chemists often measure entropy the old fashioned way (energy and temperature) and use
dS = dQ/T
or some variant thereof like:
TΔS = ΔH − ΔG
which is rooted in the Clausius definition of entropy
dS = dQ/T
as shown in the derivation here eq. 1:
https://en.wikipedia.org/wiki/Gibbs_free_energy
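Here is a minimal sketch of the bookkeeping implied by that relation, as it shows up in binding studies: calorimetry supplies ΔH, a measured association constant supplies ΔG = −RT ln(K), and TΔS = ΔH − ΔG gives the entropy change. The numbers below are invented placeholders, not values from the paper cited next.

```python
import math

R = 8.314462618   # gas constant, J/(mol*K)

def binding_entropy(delta_h_j_per_mol, k_assoc, temp_k):
    # Solve T*dS = dH - dG, with dG = -R*T*ln(K) from a measured association constant
    delta_g = -R * temp_k * math.log(k_assoc)
    delta_s = (delta_h_j_per_mol - delta_g) / temp_k
    return delta_g, delta_s

# Hypothetical binding reaction: dH = -40 kJ/mol, K_assoc = 1e6, at 298.15 K
dG, dS = binding_entropy(-40_000.0, 1.0e6, 298.15)
print(dG)  # ~-34,200 J/mol
print(dS)  # ~-19 J/(mol*K): enthalpy-driven, entropically unfavorable binding
```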
In any case, here is a paper by professional chemists that implicitly uses the definition of entropy in terms of the Clausius dQ/T.
despite these facts
To embarrass DNA_jock further, there are more examples like the above paper in the biophysics, biochemistry, and chemistry literature.
Uh, DNA_jock, maybe I need to connect the dots for you. The above paper mentions T for temperature as well as heat changes (dQ is implicitly inferred from the change in Gibbs free energy and the change in enthalpy using calorimeters, etc.). That means the paper has data for dQ and T. It doesn't measure entropy by "counting microstates all the way".
This dQ and T data can be plugged into the Gibbs free energy equation to derive the entropy.
dS = dQ/T is embedded in the formulas of the paper cited (like TΔS = ΔH − ΔG), but apparently you need to have this spoon-fed to you, since you make idiotic comments like this:
ROTFL!
On the contrary, dQ/T gives us insights into how the molecules may move and connect, since many times we can't see them directly. We build models of how we think they connect and move, and these models give us counts of microstates. If the entropy computed from the microstates of the model matches up with the measured entropy using dQ/T, then our confidence in the structural details of the molecular system is supported.
In any case, this has to be a classic dumb statement by DNA_jock:
Hahaha! I’m going to put that one in my trophy case.
Oh goody. A quote mine.
I’ll see yours and raise you one of my own:
DNA_Jock:
Sal’s strategy seems to be…
1) to avoid addressing the reasons why entropy cannot be a measure of energy dispersal by…
2) putting people (or pretending to put them) on “ignore”…
3) while trying to pass off the Clausius quantity dQ/T as “energy dispersal”, when it clearly isn’t, and…
4) overlooking the fact that the Clausius equation simply doesn’t work in many cases, such as the gas-mixing cases we’ve been discussing throughout the thread.
Not very impressive. Where did he get the idea that he was competent to explain this stuff to his fellow creationists and IDers?
And yet:
Molecular Driving Forces p. 109
Better yet, to correct them. To insist that they are wrong and he is right.
But, he does have a spreadsheet. Do you have a spreadsheet?
Yup.
You keep making my point for me. Your continued failure to recognize the difference between a measurement technique and our understanding of the underlying system is noted.
If only you took your fingers out of your ears long enough to read my posts, you might, just might, stop shooting yourself in the foot.
A point of interest for colewd: Notice how Lambert defines the entropy of a “substance” to be zero at 0K,
which carefully avoids the problem (for his energy dispersal approach) that the entropy of a mixture at 0K is not zero.
Hope this helps.
Nuff said. Hahaha!
It’s also worth reiterating that Sal’s own example — of a single helium molecule in a container — shows that entropy is not a measure of energy dispersal.
The energy of the system is concentrated in a single molecule. Change the size of the container and the entropy changes. What about energy dispersal? Exactly the same as before; all the energy is concentrated in one molecule.
Entropy changed, but energy dispersal did not. Entropy is not a measure of energy dispersal.
Unlike keiths, I don’t think it makes any sense to speak of the change in entropy of a thermodynamic system composed of a single molecule. But that is what happens when you conflate the Shannon measure of information (SMI) with thermodynamic entropy.
Mung,
Why not?
If one molecule isn’t enough, how many are required, and why?
From the DNA_Jock school of thermodynamics:
🙂 🙂 🙂 🙂 🙂
Statistical Thermodynamics, by Juraj Fedor (2013) says that one molecule IS enough:
But there is more than one that is possible. Since, according to Fedor, entropy is the number of microstates having the same specified amount of energy, it makes sense to talk about the entropy of a single molecule.
I note that Fedor also says that
Yes, absolutely. You are a genius, Walto!
What were your textbooks?
But keiths isn’t talking about the entropy of the molecule.
keiths: Change the size of the container and the entropy has changed.
The entropy of what?
FWIW,
I discussed single particle entropy here:
As an addendum, it is harder to establish dQ/T for single-particle entropy unless one appeals to AVERAGE behavior over many trials; one thought experiment is too small a sample size, and one needs many hypothetical thought experiments to get the law of large numbers to work for the Clausius definition.
Single-particle entropy works nicely, however, for the Boltzmann definition.
You can also plug single particles into the Sackur-Tetrode quantum mechanical version of the Boltzmann equation for idealized monoatomic gases.
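A hedged sketch of that suggestion: plugging N = 1 into the Sackur-Tetrode form (which strictly assumes large N, so treat the numbers as illustrative only) shows the single-particle entropy shifting by k ln 2 when the container volume is doubled, even though the lone atom's kinetic energy is unchanged. The helium atom, 300 K temperature, and 1-liter volume are arbitrary choices.

```python
import math

K_B = 1.380649e-23      # J/K
H   = 6.62607015e-34    # J*s
AMU = 1.66053907e-27    # kg

def single_particle_entropy(mass_amu, temp_k, volume_m3):
    # Sackur-Tetrode form with N = 1 (illustrative; the equation assumes large N)
    m = mass_amu * AMU
    inv_lambda_cubed = (2 * math.pi * m * K_B * temp_k / H**2) ** 1.5  # 1 / (thermal wavelength)^3
    return K_B * (math.log(volume_m3 * inv_lambda_cubed) + 2.5)

# One helium atom at 300 K: doubling the container volume raises S by k*ln(2),
# even though the kinetic energy of the lone atom is unchanged.
s1 = single_particle_entropy(4.0026, 300.0, 1.0e-3)
s2 = single_particle_entropy(4.0026, 300.0, 2.0e-3)
print(s2 - s1, K_B * math.log(2))   # both ~9.6e-24 J/K
```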
LoL.
Oh OK. Should have stayed out of it. Sorry.
No, you gave an interesting quote, and Sal chimed in which led to even more interesting quotes. It’s all good. 🙂