The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:
The randomness or disorder of the components of a chemical system is expressed as entropy,
Nelson, David L.; Cox, Michael M. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.
Gag!
And from the textbook written by our very own Larry Moran:
enthalpy (H). A thermodynamic state function that describes the heat content of a system.
entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.
Principles of Biochemistry, 5th Edition, Glossary
Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto
Choke!
Thankfully, Dan Styer almost comes to the rescue with his criticism of ID proponent Granville Sewell:
On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.
…
“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.
Is Entropy Disorder?
Dan Styer criticizing Granville Sewell
Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (i.e., a small shoe size suggests we’re dealing with a toddler, and therefore a non-reader). Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.
Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:
Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.
Mike Elzinga
2lot trouble
Hear, hear.
But why single out creationists for equating entropy with disorder? How about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?
“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”
Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.
Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:
That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?
Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”
There is no basis in physical science for interpreting entropy change as involving order and disorder.
So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!
Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu
April 2014
The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..
Lehninger and Moran’s books aren’t on that list. Their books, however, did make the list of biochem books judged by some Australians as decreasing in fitness:
The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham
…
Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002
…
A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate
Here is chemistry professor Frank Lambert’s informal definition of entropy:
Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.
This is the correct Ludwig Boltzmann entropy as written by Planck:
S = k log W
where
S = entropy
k = Boltzmann’s constant
W = number of microstates
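As an aside, nothing about “disorder” appears anywhere in that formula. Here is a minimal Python sketch (my own illustration, not from any of the sources quoted here) of Planck’s form, using a toy system of N two-state particles with W = 2^N microstates:

import math

k_B = 1.380649e-23  # Boltzmann's constant in J/K (exact value in the 2019 SI)

def boltzmann_entropy(W):
    """Planck's form of the Boltzmann entropy: S = k ln W."""
    return k_B * math.log(W)

# Toy example: N two-state particles ("coins") have W = 2**N microstates.
N = 100
print(boltzmann_entropy(2 ** N))  # ~9.57e-22 J/K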
Also there is Clausius:
ΔS = Integral(dq/T)
where
ΔS = change in entropy
dq = inexact differential of q (heat)
T = absolute temperature
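To make the Clausius definition concrete, here is a small numerical illustration (mine, with made-up but plausible values): reversibly heat 100 g of water at constant pressure, so dq = m·c_p·dT, and compare the summed integral against the closed form m·c_p·ln(T_f/T_i):

import math

m, c_p = 0.100, 4184.0   # 100 g of water; specific heat in J/(kg·K), an illustrative value
T_i, T_f = 280.0, 300.0  # initial and final temperatures in K

# Numerical Clausius integral: ΔS = Integral(dq/T) with dq = m * c_p * dT
n = 100_000
dT = (T_f - T_i) / n
dS = sum(m * c_p * dT / (T_i + (i + 0.5) * dT) for i in range(n))

print(dS)                             # ≈ 28.87 J/K
print(m * c_p * math.log(T_f / T_i))  # closed form agrees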
As Dr. Mike asked rhetorically: “Where are the order/disorder and ‘information’ in that?”
My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.
Let Mung and Keiths show their procedure for computing increase in “missing information” entropy when a 20 gram cube of ice melts.
Go on, guys. You claim it’s a practical definition, so show how you compute this missing information.
In contrast, if you do an experiment to get the enthalpy of fusion like this high school experiment:
https://www.greenwichschools.org/uploaded/faculty/andrew_bramante/GHS_Honors_Chem_Heat_of_Fusion_Experiment.doc
you should get a value of about 333.55 J/gram
https://en.wikipedia.org/wiki/Enthalpy_of_fusion
If you have a 20 gram ice cube, simply divide the enthalpy of fusion energy by the temperature of melting (273.15 K) to get the entropy change according to Clausius:
dS = dQ/T
thus the entropy change ΔS is:
ΔS = Integral(dQ/T) = Q/T for the constant temperature of 273.15 K
Q = (20 grams x 333.55 J/gram) = 6671 J
thus,
ΔS = Q/T = 6671 J / 273.15 K = 24.42 J/K
which is (minus some rounding errors) the figure I’ve shown repeatedly. (Frazier University used 333 J/g as the enthalpy of fusion, hence a slightly smaller number of 24.39 J/K.)
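The arithmetic above is simple enough to script. A minimal sketch (mine) of the Clausius calculation for the melting ice cube, run with both enthalpy-of-fusion values just mentioned:

def melting_entropy(mass_g, dH_fus_J_per_g, T_melt_K=273.15):
    """Clausius for an isothermal phase change: ΔS = Q/T with Q = m * ΔH_fus."""
    return mass_g * dH_fus_J_per_g / T_melt_K

print(melting_entropy(20, 333.55))  # 24.42 J/K
print(melting_entropy(20, 333.0))   # ≈ 24.38 J/K (24.39 after the rounding noted above)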
Mung and Keiths have yet to show how they compute “missing information”.
I mean, how do you guys measure your own ignorance about the details of the melting of a 20 gram ice cube? Not so easy, eh? That’s because your “missing information” approach offers few practical methods; it’s all handwaving.
Go on, show the readers how you figure out how ignorant you are about melting ice cubes. Once you put a number on your amount of ignorance (aka “missing information”), translate that into thermodynamic entropy. At least get in the ballpark of:
24.39 J/K to 24.42 J/K
But you guys are nowhere to be seen when it comes to simple questions like this high school example. Pathetic!
keiths:
Sal:
The “entropy as disorder” and “entropy as energy dispersal” definitions are incorrect, Sal. I reject them because they’re wrong.
It’s that simple.
Here’s the elephant in the room, Sal. If you want to defend the energy dispersal interpretation of entropy, you need to address this comment:
Let’s see your refutation, Sal.
Also, I’m astonished that you still don’t understand the meaning of ‘dispersal’.
You keep referring to Clausius as if dQ/T were a measure of energy dispersal:
It isn’t. For energy to disperse, it needs to spread out over a larger volume. dQ/T is energy divided by temperature. Are you going to argue that the energy is spread out over a large number of kelvins, and that counts as dispersal? It’s inane.
If entropy were a measure of energy dispersal, it would have the units of energy dispersal: energy per volume.
Keiths,
Too funny that you’re quoting Dr. Mike, since you said his “energy spreading” definition of the 2nd law of thermodynamics was wrong.
Keiths, Mung, and DNA_jock:
Show the readers how you figure out how your ignorance about the details of a 20 gram ice cube increases after it melts, quantify that amount of ignorance (aka “missing information”) in bits, and then show how that relates to the standard answer, which should be in the ballpark of:
24.42 J/K
Still no answers after some 1,500 comments? I showed the energy dispersal answer; now it’s your turn, guys, to show the answer with your “missing information” procedure.
🙂 🙂 🙂 🙂 🙂
And don’t forget what you learned in the DNA_Jock school of thermodynamics:
So show all those students of science how it really should be done instead of calculating entropy change the Old School way with thermometers and calorimeters.
I take it that “energy dispersal” is considered by the prevailing theory to be equivalent to dQ/T though, isn’t it? Based on the prevailing theory of heat, you find one, you get the other.
Now, you disagree with this equivalency claim, I know, because you have mentioned it about 730 times, but so fucking what? It’s a quibble. Find something useful to do, like maybe answer my questions, do Sal’s calculations, or Mung’s programming request, or patrick’s challenge.
Everyone knows your quibbling fucking view on this matter. It’s uninteresting, and not only because completely trivial and useless. There’s also the fact that you’ve repeated it over 700 times now. It’s an old, useless, repetitive quibble which you obviously cannot defend, but will just repeat until everyone goes away.
Firstly, he doesn’t have an argument, just an assertion.
People use the Clausius equations because they are a reasonable approximation to reality under most circumstances and the math is easier.
There are two problems with Sal’s “practical” application of Clausius entropy concepts. 1) Clausius does NOTHING to explain WHY “heat cannot, of itself, pass from…” 2) Under certain circumstances, Clausius gives the wrong answer: you need to understand the Law of Large Numbers reasons that underpin entropy in order to be able to spot when the simple math could be leading you astray — the analogy to Newton’s Laws is apt — good enough for Apollo, but not for GPS, and you can bet your bottom dollar someone at NASA calculated γ for the moonshot, just to make sure.
3) It’s a fallacy to judge a theory based on…
Ugh!
I’ll come in again.
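To put a number on that γ remark (my own back-of-envelope, assuming a translunar-injection speed of roughly 11 km/s): the relativistic correction is about one part in a billion, which is why Newton sufficed for Apollo but not for GPS clocks.

import math

c = 299_792_458.0  # speed of light, m/s

def gamma(v):
    """Lorentz factor: γ = 1 / sqrt(1 - v²/c²)."""
    return 1.0 / math.sqrt(1.0 - (v / c) ** 2)

v_apollo = 11_000.0           # m/s, assumed translunar-injection speed
print(gamma(v_apollo) - 1.0)  # ~6.7e-10, i.e., negligible for a moonshot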
stcordova,
Cool calculation, bro.
But what if the temperature is, say, 250 K?
(This is an important question in meteorology.)
You could
A) assume the entropy change is temperature-independent, thus ΔH will vary linearly with T.
Or
B) you could apply Kirchhoff’s Law of Thermochemistry (that is: ∂ΔH/∂T = ΔCp, the difference in heat capacities, FYI).
Or
C) you could measure it with one of those fancy DSC machines.
How many different answers will you get and, most importantly, WHY?
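For readers who want to see how options A and B can disagree, here is a numerical sketch (mine, with textbook-approximate heat capacities for water and ice, so treat the numbers as illustrative) for the 20 g ice cube at 250 K:

# Anchor point: 20 g of ice melting at the normal melting point
T1, dH1 = 273.15, 6671.0  # K, J (from the calculation earlier in the thread)
T2 = 250.0                # K, the supercooled case being asked about

# Option A: assume ΔS is temperature-independent, so ΔH varies linearly with T
dS = dH1 / T1             # 24.42 J/K
dH_optionA = dS * T2      # ≈ 6106 J

# Option B: Kirchhoff's law, ΔH(T2) = ΔH(T1) + ΔCp * (T2 - T1), with an
# assumed constant ΔCp: Cp(water) ≈ 4.18 and Cp(ice) ≈ 2.09 J/(g·K)
dCp = (4.18 - 2.09) * 20.0          # ≈ 41.8 J/K for 20 g
dH_optionB = dH1 + dCp * (T2 - T1)  # ≈ 5703 J

print(dH_optionA, dH_optionB)  # the two options disagree by roughly 400 J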
I don’t wish to get in the middle of your
flame warconversation, but this is an over-simplification.The ‘dispersalists’ need to come up with fancy-schmancy definitions of “dispersal” to explain entropy of mixing. Clausians have to back into the fact that there must be an entropy of mixing, because you have to expend “useful work” to unmix stuff.
DNA_Jock,
I can see that it might be hard to define ‘dispersion’ in such a way that it does everything the traditional Clausian might want, and that there’s no real point to doing so, or to staking much on defending that view. But I must say that this fight seems silly to me.
The pointlessness (and this fucking election) is my excuse for being so flamey. Sal is not going to respond to your corrections, Mung will continue to take pot shots at Sal, and keiths is not going to respond to anything, and the entire issue seems so trivial that it could be sent to the BBB for mediation and be resolved in 8 minutes — if only the participants had the slightest interest in any meeting of the minds. The prevailing attitudes distress me.
So, in a word, yes, I over-simplified.
and from Walto:
From Lambert’s writings:
I’ve used capital “Q” to mean q_rev (q-reversible).
Contrast the words of a retired professional chemist and chemistry professor like Lambert with DNA_Jock’s school of thermodynamics:
So, let’s take Lambert’s own words and apply them to the 20 gram melting ice cube.
ΔH = change in enthalpy
From wiki:
In the case of condensed matter like liquid water and ice, the pressure and volume can be mostly neglected in the computation of enthalpy of fusion (even though there is a slight change in volume when ice melts).
The enthalpy of fusion per gram is 333.55 J/g as attested here:
https://en.wikipedia.org/wiki/Enthalpy_of_fusion
Thus for a 20 gram ice cube melting at 273.15 K, the enthalpy of fusion is
ΔH = 20 g x 333.55 J/g = 6671 J
Since this is an isothermal, isobaric (and, as noted above, very nearly isochoric) process
ΔH = ΔQ = 6671 J
thus the change of entropy at 273.15 K is:
ΔS = Integral(dQ/T) = ΔQ/T = 6671 J / 273.15 K = 24.42 J/K
I’ve thus shown one can take the sense of what Lambert actually said (versus Keiths’ misreading, misinterpretation, and otherwise uncharitable manglings) and actually apply it to solve a practical question of entropy.
Keiths could try to explain how he computes the change in his level of ignorance (aka “missing information”) when the ice melts. But shockingly he doesn’t seem capable without first resorting to what Lambert described as energy dispersal:
Also, from a retired chemistry professor (Ernest Z) who taught organic chemistry for 33 years:
https://socratic.org/questions/what-is-the-difference-between-entropy-and-enthalpy
ΔS = q_rev/T = (ΔH)/T
Walto:
In deference to you Walto, I will respond to DNA_Jock’s “corrections”.
But in deference to you I will, despite the fact that something as idiotic as what DNA_Jock says doesn’t really deserve many responses:
The way energy dispersal is defined by Lambert I already cited, but I’ll cite again. I can apply Lambert’s definition both to melting ice cubes (above) and to mixing scenarios (done a buzzillion times already):
Sal,
The elephant is still in the room.
See what I mean, Jock?
Thanks, Sal, but to adequately respond to Jock, I think you’ll need to take him off ignore and answer the specific queries in his prior couple of posts. They seem to me quite marginal complaints, calling for minor elaborations or constraints, but I’m hardly an expert.
walto,
Concepts matter in science, even when angry old insurance regulators disagree. There’s a reason scientists didn’t split the difference with the phlogiston theorists and “go have some pbjs”. The phlogiston idea was wrong, so they got rid of it. Completely. Even if the angry old insurance regulators of the time disagreed.
Entropy is not a measure of energy dispersal. You have a hard time understanding why that matters. That’s understandable, because you describe yourself, accurately, as someone who is “scientifically ignorant” and who finds this stuff “complicated” and “difficult”.
Pedant got it right:
You may not understand why the concept of entropy matters — why it’s not just a quibble — but that’s to be expected from someone who is out of his depth.
Exactly what one expects from you here, keiths. Nothing less!
walto,
I quite understand. My aim is merely to demonstrate that Sal shows no inkling that his understanding of these concepts goes beyond a high school introduction to entropy. My seemingly ‘marginal complaints’ are designed to guide him towards a better understanding, were he to actually engage: what is the get-out-of-jail card that allows dispersalists to square the almost-identical-gas mixing paradox; and experimentally determined heat capacities are a poor basis for understanding anything.
In this context, drawing an equivalence between dQ/T and dispersalism is letting him off the hook.
And yes, I recognize that
p(hell freezes over) > p(Sal engages) ~= p(keiths admits…)
I think I left my copy of Don Quixote in my coat pocket…
Ok, in deference to Walto, I’ll respond a little, since some readers want to see loose ends tied up and since DNA_Jock, unlike Keiths and Mung, has a chemistry background.
If ∂ΔH/∂T = ΔCp = non-zero, A and B are not the same scenario, so why would you expect the entropies to be the same?
Thermochemistry laws apply to reactions with products and reactants, which really doesn’t apply to H2O ice at 250 K that stays H2O ice at 250 K!
Clearly wrong, since in this case if “ΔH will vary linearly with T”
∂ΔH/∂T = ΔCp = 0 (another way of saying the second derivative is 0)
thus
∂H/∂T = Cp = constant
and entropy change in this case is
ΔS = Cp Integral(dT/T) = Cp ln(T_final/T_initial)
Now consider….
vs.
Clearly ΔS_1 is not equal to ΔS_2.
Even though in both scenarios ΔH and Cp are the same, ΔS is not; therefore ΔS is temperature-dependent, which contradicts your premise, and therefore your premise is nonsensical!
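The inequality being leaned on here is a one-line check (my sketch): for the same one-kelvin temperature rise, ΔS/Cp = ln(T_final/T_initial) differs at 200 K and at 249 K.

import math

# ΔS/Cp = ln(T_final/T_initial) for a one-kelvin rise at two starting temperatures
print(math.log(201 / 200))  # ≈ 0.004988
print(math.log(250 / 249))  # ≈ 0.004008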
Because of the math, only a few of the readers might have realized you just made a nonsensical statement. But you know it now, don’t you?
🙂 🙂
So now’s your turn to do the calculation Keiths and Mung can’t do with “missing information” procedures.
Compute entropy change of 20gram ice cube melting. And don’t use dQ/T since you said it is rarely informative and because it isn’t defined in terms of missing information.
For walto’s and colewd’s benefit, I will try to walk through Sal’s mangled prose.
No, Sal. You seem to have confused the symbols ∂ and Δ.
If you (Sal) assume that the entropy change (ΔS) is temperature-independent, then “ΔH will vary linearly with T”. Another way of saying this would be to note that “∂ΔH/∂T = ΔCp”, where ΔCp is constant, but non-zero. This is high school math: if ∂y/∂x is a non-zero constant, then we say “y varies linearly with x”; if ∂y/∂x = 0 then “y is independent of x”. Second derivatives don’t enter into it: ΔH is NOT a derivative, it’s the difference in enthalpies between reactants and products (water and ice in this case). Egads!
You are correct that A and B are not the same scenario; they are two different ways of using Clausius to calculate ΔH for the same scenario: a phase transition that occurs at a non-standard temperature. And they differ. This much you got right. But Kirchhoff is merely a specific case of Hess’s Law, which most certainly applies to phase transitions; it’s a restatement of the First Law.
Not sure how much point there is in Fisking the rest of your comment, but I will note
You haven’t even managed to get to the point where scenarios A and B differ, which is
ΔCp is a function of temperature.
More importantly, you entirely misconstrued my comment. I was illustrating an example that is problematic for your simple “plug and play” “∂Q/T is all that matters” Clausian view of entropy, and I was asking you (Sal) how you (Sal) would handle this problem using your Clausian approach: you could do A, or B or C.
And yes, you (Sal) would get “non-sensical” results.
As I asked in my previous post :
How many different answers will you get and, most importantly, WHY?
P.S. Have you figured out the get-out-of-jail card that allows dispersalists to square the almost-identical-gas mixing paradox yet? I don’t think you’re going to like the answer.
P.P.S. Why does a mixture have entropy at absolute zero?
Thank you, Jock.
In the limit of small change, Δ becomes d (as in a differential), and the partial derivative (∂) of a derivative is a second derivative, just as I said.
Your math is in error, and I showed it: the entropy change is not the same at all temperatures, therefore the change of entropy is not independent of temperature, contrary to your nonsense.
Show for the readers the change of entropy for the same ΔH at 200K and then at 249K.
vs.
Do I have to spoon-feed you the fact that
ln(201/200) does not equal ln(250/249) ??
I just showed this can’t be true in general.
Do I have to spoon-feed you elementary math too? Get it through your head: even with the same ΔH (one that varies linearly with T):
ln(201/200) does not equal ln(250/249)
or shorthand
ln(201/200) != ln(250/249)
So how can entropy change be invariant with temperature with the same amount of heat added to each scenario? It can’t. You got your math wrong.
Go ahead, show entropy change at various temperatures for the same amount of Enthalpy Change ΔH, you won’t get the same ΔS.
You don’t like the temperatures I’ve chosen, pick your own, but make sure ΔH is the same for each scenario, and you’ll still be in error.
Don’t bother asking more stupid questions until you fix the incoherence in your questions.
Sal, read to the end of my comment before responding…
Oh dear.
DNA_Jock,
I see your analogy and have studied General Relativity. General Relativity has been experimentally validated, and the model can be used to calculate predictive values of spacetime curvature. How can Boltzmann’s equations be used so we can better understand the thermodynamics of a melting ice cube?
I just don’t get the value of measuring ignorance.
colewd:
Do you understand why the concept of entropy is valuable?
BULL!
for B, I pointed out:
ΔCp = non-zero
For A, “ΔH will vary linearly with T”
∂ΔH/∂T = ΔCp = 0
which implies
Cp = constant
I got the point; you just can’t bring yourself to see that’s what I wrote!
In scenario A, Cp is constant, but that won’t work because this leads to absurdities like saying
ln(201/200) = ln(250/249)
which is false. Your math is in error.
Sal, I have the sense that you and Jock are so close at this point that if you went to lunch and brought a tablet and pen you could iron your differences out in minutes.
I think most of the trouble is the medium. It encourages cryptic ticklers like Jock’s “get-out-of-jail card.” And there’s the endless repetition (which may or may not be for the benefit of those imagined to have just awakened from a three-month doze). There are language issues too, and, of course there is the bad blood/history, which, presumably, is the reason for most of Mung’s pot-shotting. It’s too bad.
Hence, lunch!
As the analogy shows, you are asking the wrong question.
Boltzmann/Gibbs has been experimentally validated (this paper is a particularly fun example of the failure of naive dispersalism)
In the analogy,
Clausius = Newton
Boltzmann/Gibbs = Einstein
Clausius is a satisfactory approximation for melting ice (at Standard Temperature and Pressure, heh), and Newton is a satisfactory approximation for Apollo 11 or the flight of terrestrial cannon-balls.
However, if you are designing a GPS system or studying dipole traps, then Newton or Clausius will set you wrong.
Asking “How can Boltzman’s equations be used so we can better understand the thermodynamics of a melting ice cube?” is analogous to asking
“How can Einstein’s equations be used so we can better understand the flight of a terrestrial cannon-ball?”
The answer in both cases is “With great difficulty. So what?”
Are you still talking to me while you have me on Ignore?
walto,
‘Fraid not, walto. I’m never going to be on the same planet as someone who can repeat
∂ΔH/∂T is non-zero, and equal to ΔCp (i.e. the difference in heat capacities between the reactants and products).
Heat capacity is a function of temperature (Sal does not appear to have ever learnt this…). This means that ΔCp may or may not be a function of temperature
(imagine that the reactants and products have heat capacities of {r + a.lnT} and {p + a.lnT} respectively; they both vary with temperature, but their difference does not).
Under the assumption that ΔCp is constant, then “ΔH will vary linearly with T”. Sal is failing high school math…if ∂y/∂x is a non-zero constant, then we say “y varies linearly with x”.
He actually is confusing the symbols ∂ and Δ.
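DNA_Jock’s parenthetical example is easy to verify numerically (a sketch of mine, with arbitrary values for r, p, and a): each heat capacity varies with temperature, yet their difference does not.

import math

r, p, a = 30.0, 75.0, 5.0  # arbitrary illustrative constants

def Cp_reactants(T):
    return r + a * math.log(T)

def Cp_products(T):
    return p + a * math.log(T)

for T in (200.0, 273.15, 300.0):
    # Both heat capacities change with T, but ΔCp is always p - r = 45.0
    print(T, Cp_products(T) - Cp_reactants(T))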
seconded.
Careful Jock. You’ll end up back on Ignore.
I like this guy!
Entropy, S, is a measure of the disorder or randomness of a system.
An ordered system has low entropy. A disordered system has high entropy.
I was brought up to believe that there is no such thing as a stupid question.
On the other hand, there are plenty of curious idiots.
What depresses me is the high number of incurious idiots.
Today of all days. Sigh.
Curious Idiot. I like it!
Meh, I think if you explained this to him in person he’d get it. And you’d probably have a better sense of where he’s coming from.
Even people who’ve gotten stuff wrong on high school math tests need not be thought to be from other planets. I have a daughter in high school, and some of that stuff is really hard!
DNA_Jock,
See the derivation of dS here:
https://engineering.ucsb.edu/~shell/che110a/heatcapacitycalculations.pdf
dS = Cp dT/T
which implies
dS/dT = Cp/T
which means entropy change is dependent on temperature if ΔH changes linearly with T.
dS/dT = Cp/T
falsifies this absurdity by you:
You got caught making a mistake, and you won’t back down. I’ll keep reminding you of it.
No point since your first option was wrong, namely:
look at dH/T = Cp dT/T
which implies
dH = Cp dT
this will be a linear relationship if Cp is a constant, and the letter “C” often means constant, and Cp is in fact:
∂H/∂T = Cp
as stated here
https://en.wikipedia.org/wiki/Heat_capacity
dH = Cp dT
implies the linear relationship
ΔH = Cp ΔT
which looks a lot like
y = mx + b
where
y is ΔH
m = Cp
x = T
b= 0
given all these facts,
dS/dT = Cp/T
which falsifies your claim. Go ahead, DNA_Jock, do a Keiths and keep repeating yourself; it won’t change the fact that your math is wrong and this statement is absurd:
Wrong. Entropy change for “ΔH will vary linearly with T” implies entropy is temperature-DEPENDENT, not independent. It’s right here:
dS/dT = Cp/T
The change in entropy (dS) with respect to change in temperature (dT) is dependent on temperature (T).
No amount of verbosity on your part will change that. I’ve provided ample links supporting my derivation, and you only reference yourself for the most part.
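For what it’s worth, the temperature dependence being asserted here is trivial to evaluate (my sketch, assuming a constant Cp roughly that of liquid water): dS/dT = Cp/T plainly takes different values at different temperatures.

Cp = 75.3  # J/(mol·K), approximately liquid water's molar heat capacity (illustrative)

def dS_dT(T):
    """dS/dT = Cp/T for constant-pressure heating with constant Cp."""
    return Cp / T

print(dS_dT(200.0))  # ≈ 0.377 J/(mol·K²)
print(dS_dT(250.0))  # ≈ 0.301 J/(mol·K²): the slope depends on T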
***************************************************
Hmmmmm.
IS vs. May Or May Not Be
I think it may be time to procure seconds and start talking about how many paces until one may fire one’s pistol.
Sorry walto,
But Sal still hasn’t figured out that my three options were options that Sal could potentially use to do his Sal-calculation, nor has he fathomed the difference between ∂H/∂T and ∂ΔH/∂T, or that Cp is not a constant. Apparently he thinks it is constant, because it starts with the letter “C”.
You really couldn’t make this stuff up.
If Cp is a constant, what is ΔCp?
It is if ΔH varies linearly with T.
Linear variation is described by:
y = mx + b
where m is a constant
let y = ΔH, m = Cp, x = T, and b = 0.
Thus,
y = mx + b
becomes
ΔH = CpT
therefore, Cp must be a constant. If Cp is a constant, ΔCp = 0, just as I said.
It is under option A where DNA_Jock said:
I showed linear variation of ΔH implies Cp is constant and thus ΔCp = 0.
ΔCp = non-zero for option B. DNA_Jock is conflating all 3 options, when I’ve provided ΔCp for option B as
and for option A as
What is the entropy of a tempest in a teapot?
Well, thanks for the positive thoughts anyway. To quote Rodney King during the LA riots in his honor, “Can’t we all just get along?”
To which I respond, “but this is TSZ”.
Anyone with requisite knowledge can examine the math and computations. I’ve tried to be verbose and give many examples to drive home my points.
This is an example of a tiresome misrepresentation:
Cp is constant if ΔH varies linearly with T (option A). That is his premise for option A, not mine.
Cp is not constant otherwise (as would be the case in general with option B).
DNA_Jock isn’t correcting me; he’s attributing to me claims that I did not make nor intended to make. As long as he’s willing and eager to do that, we won’t cooperate.
Option A and Option B were not two ways to calculate entropy but entirely different thermodynamic systems. I provided the derivations above demonstrating that Option A implies a constant Cp.
walto:
A comment worth repeating:
Does it contain melting ice?
Mung,
While the Sal/Jock sideshow continues, perhaps we can make some progress in our discussion of the dice. Do you understand why you need an epistemic probability distribution over the microstates, not over the sums, in order to do the calculation you are attempting?
Another demonstration that Cp is constant for Option A.
” ΔH varies linearly with T”
follows the form
y = mx + b
where m is a constant. To emphasize m is constant, I will say m_constant.
let
ΔH = m_constant T + b
which, if we adjust b, becomes
H = m_constant T + b
now the definition of Cp is
∂H/∂T
from
https://en.wikipedia.org/wiki/Heat_capacity
taking the partial derivative with respect to T
Cp = ∂H/∂T = m_constant
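That differentiation can be checked symbolically (a sketch of mine using sympy, which is not part of the thread):

import sympy as sp

T, m_constant, b = sp.symbols('T m_constant b')

H = m_constant * T + b  # H assumed linear in T
Cp = sp.diff(H, T)      # Cp = ∂H/∂T
print(Cp)               # prints m_constant: a constant, independent of T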
keiths,
As already explained (twice? thrice?), if one didn’t understand that this was a quibble (as you apparently still don’t) and one wanted to insist on the thermodynamic sense of your absurd Damon entropy calculations, one WOULD have to rewrite all the books to make them useless.
So your choice is clear–start rewriting or start to understand the entirely quibbling nature of your thousand posts on this thread.
Sorry, but it’s your blind dice-thrower event that’s the sideshow.
Let ‘er rip!
That’s a pitiful rationalization, even by walto standards.