In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann

The basic biochemistry textbook I study from is Lehninger Principles of Biochemistry. It’s a well-regarded college textbook. But there’s a minor problem with the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy,

Nelson, David L.; Cox, Michael M. Lehninger Principles of Biochemistry (Page 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.

entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto
H. Robert Horton, North Carolina State University
K. Gray Scrimgeour, University of Toronto
Marc D. Perry, University of Toronto

Choke!

Thankfully, Dan Styer almost comes to the rescue in his criticism of ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?
Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder, just as shoe size is correlated with reading ability (a small shoe size suggests a toddler, and therefore a non-reader) without being a measure of it. Perhaps “analogy” is too generous a word for the mistake of saying entropy is disorder, but Styer’s correction is a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga
2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Ludwig Boltzmann

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

Boltzmann

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:
EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy
…..

Lehninger’s and Moran’s books aren’t on that list. Their books did, however, make a list of biochemistry texts judged by some Australian researchers to be decreasing in fitness:

Is the evolution of biochemistry texts decreasing fitness? A case study of pedagogical error in bioenergetics

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.
..
Lehninger, Nelson and Cox
….
Garrett and Grisham

Moran and Scrimgeour
….
Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

A few creationists got it right, like Gange:
Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

Entropy
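
To see “dispersal of energy” as a calculation rather than a slogan, here is a minimal sketch (Python; the amounts are hypothetical) for the isothermal expansion of an ideal gas into twice its volume:

import math

R = 8.314           # J/(mol*K), gas constant
n = 1.0             # mol of ideal gas (hypothetical)
V1, V2 = 1.0, 2.0   # arbitrary volume units; only the ratio matters

# The gas's energy disperses over twice the volume; for an isothermal
# ideal-gas expansion the entropy change is delta-S = n R ln(V2/V1).
delta_S = n * R * math.log(V2 / V1)
print(delta_S)      # ~5.76 J/K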

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where
S = entropy
k = Boltzmann’s constant
W = number of microstates
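
As a minimal numeric sketch of the formula (Python; the particle count is hypothetical and far too small to be thermodynamic, but it shows how W is counted; log here is the natural logarithm, as in Planck’s form):

import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def boltzmann_entropy(W):
    # S = k ln W, with W the number of microstates consistent with the macrostate
    return k_B * math.log(W)

# Hypothetical example: 100 two-state particles with exactly 50 in the "up" state.
# The number of microstates is the binomial coefficient C(100, 50).
W = math.comb(100, 50)
print(boltzmann_entropy(W))   # ~9e-22 J/K -- tiny, because N is tiny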

There is also Clausius’s definition:

delta-S = Integral( dq_rev / T )   (taken over a reversible path)

where
delta-S = change in entropy
dq_rev = inexact differential of q (the heat exchanged reversibly)
T = absolute temperature
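
For a concrete number, here is a minimal sketch (Python; the mass and temperatures are hypothetical, and the specific heat is treated as constant) of the Clausius integral for reversibly heating water:

import math

m = 1.0        # kg of water (hypothetical)
c = 4186.0     # J/(kg*K), specific heat of liquid water, assumed constant
T1, T2 = 300.0, 350.0   # K

# Clausius: delta-S = integral of dq/T, with dq = m*c*dT for reversible heating,
# which integrates to m*c*ln(T2/T1).
delta_S = m * c * math.log(T2 / T1)
print(delta_S)   # ~645 J/K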

As Dr. Mike asked rhetorically: “Where are the order/disorder and ‘information’ in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

1,720 thoughts on “In Slight Defense of Granville Sewell: A. Lehninger, Larry Moran, L. Boltzmann”

  1. walto,

    Given all of the above, can you see why this…

    Again, it’s like two weight measurements. They may both be “correct” as far as they go, but one is more ACCURATE than the other; thus, in another sense closer to being “correct.”

    When it comes to thermodynamics, Yolanda is “getting there” as compared with Xavier; Damon may have passed the mark. The correct measure is of OUR ignorance, science’s–not yours or mine–up to some ideal, unachievable limit (which is not Damon’s, because he’s left thermodynamics and entered Laplaceville).

    …is wrong and also why it clashes with what Jaynes wrote?

  2. colewd:

    Can you reconcile this with the second law of thermodynamics?

    keiths:

    Yes. The second law simply says that the entropy of an isolated system must either remain the same or increase. That’s true for Xavier, Yolanda, and even Damon.

    For Damon the entropy is always zero, so it falls into the “remains the same” case.

    colewd:

    Can you state this with an example without using the word entropy?

    What would be the point? The Second Law is about entropy.

  3. petrushka, to walto:

    What non-operational definition? Thought experiments are not definitions. In the cases I’m discussing, they are extrapolations of formulas to extreme conditions. A concept that doesn’t work at extreme conditions may be useful, but incomplete.

    Exactly. I’m not sure why walto is having such trouble with this.

  4. walto:

    It [keiths’s view of entropy] leaves the term “information” unconstrained and so, useless.

    That’s confused. What’s relevant to an entropy calculation is any information that helps us narrow down the exact microstate of the system.

    Thus, your earlier examples of “how much Beverly enjoys some particular macrostate, or how much that state resembles some macrostate on Jupiter” are not relevant when calculating the entropy.

  5. Statistical mechanics, as pointed out in Chapter I, is a systematic way of guessing, making use of incomplete information.

    – Katz, Amnon. Principles of Statistical Mechanics: The Information Theory Approach. 1967. p 39.

    Now does Salvador understand “where is the information in that”?

  6. keiths:
    walto:

    That’s confused. What’s relevant to an entropy calculation is any information that helps us narrow down the exact microstate of the system.

    Thus, your earlier examples of “how much Beverly enjoys some particular macrostate, or how much that state resembles some macrostate on Jupiter” are not relevant when calculating the entropy.

    You actually have no idea whether those constrain the microstates from Damon’s point of view. But you are coming around to my way of thinking if you want to exclude them as irrelevant to thermodynamics. That’s the point. Welcome!

  7. Equilibrium is a situation in which only constants of the motion are known. As already remarked (Chapter V, Section 4), all predictions in this situation are time-independent. Equilibrium is thus a stationary situation in which neither information nor predictions vary with time.

    – Katz. p. 59

  8. keiths, to walto:

    You’re placing an undue amount of importance on that last .00…01, don’t you think?

    Mung:

    Is that a picobit?

    It depends on the number of zeros. See the ellipsis?

    I wasn’t aware that there could be fractions of a bit. Learn something new every day.

    When you log reduce probabilities, you’ll get fractional units. Interconverting among bits, trits, nats, hartleys, etc., has the same consequence.
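
    For example, a minimal sketch (Python; the probability is hypothetical) of how fractional bits arise, and how the same quantity converts among bits, nats, and hartleys:

    import math

    p = 0.9   # hypothetical probability of an event

    info_bits     = -math.log2(p)    # ~0.152 bits -- a fraction of a bit
    info_nats     = -math.log(p)     # ~0.105 nats
    info_hartleys = -math.log10(p)   # ~0.046 hartleys

    # One nat = 1/ln(2) ~ 1.443 bits, so conversions also give non-integer values.
    print(info_bits, info_nats, info_hartleys)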

  9. keiths:

    Thus, your earlier examples of “how much Beverly enjoys some particular macrostate, or how much that state resembles some macrostate on Jupiter” are not relevant when calculating the entropy.

    walto:

    You actually have no idea whether those constrain the microstates from Damon’s point of view.

    Damon already knows the exact microstate, so any further constraints are irrelevant to him.

    But you are coming around to my way of thinking if you want to exclude them as irrelevant to Thermodynamics. That’s the point. Welcome!

    You’re welcoming me to a position I’ve occupied since before the thread began? Um, thanks, I guess.

  10. keiths, I don’t believe 1/10th of a bit anything you say. 😉

    But you still score far better than Salvador, lol.

  11. keiths:

    walto:

    Damon already knows the exact microstate, so any further constraints are irrelevant to him.

    You’re welcoming me to a position I’ve occupied since before the thread began? Um, thanks, I guess.

    Ok then say it with me loudly so we’ll all know you mean it:

    Not all information is relevant to thermodynamics (whether 1, 2, n, or the conjunction of all of them)! What is thermodynamic information depends on the science and what is being investigated! Damon’s knowledge, which is not so constrained and thus always produces an entropy of zero, is therefore irrelevant to thermodynamics, which is always so constrained! Damon’s calculations thus produce correct thermodynamic entropies only by accident!

    Again!

    Louder!!

  12. walto,

    When you get worked up like this, you just compound your confusion. Try to calm yourself down so you can think things through.

    Damon’s knowledge of the exact thermodynamic microstate is thermodynamic information — obviously. There is no gap between the thermodynamic information associated with Damon’s macrostate and the thermodynamic information associated with the system’s microstate. No gap means no missing thermodynamic information, and no missing thermodynamic information means that the thermodynamic entropy is zero.

    It’s not that hard.
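
    For concreteness, a minimal sketch of that bookkeeping (Python; the microstate counts are made up): each observer’s entropy is k ln W, where W is the number of microstates that observer cannot rule out, so Damon’s W = 1 gives exactly zero:

    import math

    k_B = 1.380649e-23   # J/K

    def entropy_from_knowledge(compatible_microstates):
        # Thermodynamic entropy as missing information: S = k ln W,
        # where W counts the microstates the observer cannot rule out.
        return k_B * math.log(compatible_microstates)

    # Hypothetical numbers of microstates each observer cannot rule out:
    xavier  = entropy_from_knowledge(10**24)   # coarse macrostate
    yolanda = entropy_from_knowledge(10**20)   # also knows isotopic concentrations
    damon   = entropy_from_knowledge(1)        # knows the exact microstate -> S = 0
    print(xavier, yolanda, damon)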

  13. In classical thermodynamics the function whose differential is dQ/T is called “entropy.” In Eq. (12.10) this function is the missing information I. Thus, at equilibrium, the missing information is identical with the thermodynamic entropy.

    – Katz. p. 84

  14. The information theory approach is not a miracle device to arrive at a solution of any statistical problem. It is only a general way to formulate the problem itself consistently. It most often happens that the “best guess” coincides with the educated guess of those who practice guessing as an art. The mathematical problem that results is therefore also the same. For this reason it may appear to a person concentrating on any one problem in statistical mechanics that the information theory approach contributes nothing toward the solution of the problem. What it does contribute is the unification of all statistical mechanics into one consistent and well-defined theory.

    Katz, Amnon. Principles of Statistical Mechanics: The Information Theory Approach. 1967. p. 13.

  15. keiths,

    I figured you didn’t actually get it. Or if you did, you lost your grasp of it after a second. Ah well, we can all enjoy thinking that you understood what everybody has been telling you all thread, even if it was only knowing*, and only for a split second.

  16. walto: To put it in Jock’s terms: Maxwell’s demon, yes; Laplace’s demon (i.e., yours), no.

    We will see that the whole idea of Maxwell’s demon cannot be discussed within thermodynamics.

    – Ben-Naim (2015)

  17. keiths,

    Can you state this with an example without using the word entropy?

    What would be the point? The Second Law is about entropy.

    Does everyone agree that the second law is about entropy? Entropy defined by whom? What is the experimental support for the second law?

  18. colewd: Does everyone agree that the second law is about entropy?

    Yes. Do you have a formulation of the second law in which the second law is not about entropy?

    Entropy defined by whom?

    By the people who formulated the second law.

    What is the experimental support for the second law?

    Start with the observations of the pioneers and move on from there.

  19. As we have seen for the zeroth and first laws, the formulation and interpretation of a law of thermodynamics leads us to introduce a thermodynamic property of the system: the temperature, T, springs from the zeroth law and the internal energy, U, from the first law. Likewise, the second law implies the existence of another thermodynamic property, the entropy (symbol S). To fix our ideas in the concrete at an early stage it will be helpful throughout this account to bear in mind that whereas U is a measure of the quantity of energy that a system possesses, S is a measure of the quality of that energy: low entropy means high quality; high entropy means low quality. We shall elaborate this interpretation and show its consequences in the rest of the chapter. At the end of it, with the existence and properties of T, U, and S established, we shall have completed the foundations of classical thermodynamics in the sense that the whole of the subject is based on these three properties.

    A final point in this connection, one that will pervade this chapter, is that power in science springs from abstraction. Thus, although a feature of nature may be established by close observation of a concrete system, the scope of its application is extended enormously by expressing the observation in abstract terms. Indeed, we shall see in this chapter that although the second law was established by observations on the lumbering cast-iron reality of a steam engine, when expressed in abstract terms it applies to all change. To put it another way, a steam engine encapsulates the nature of change whatever the concrete (or cast-iron) realization of that change. All our actions, from digestion to artistic creation, are at heart captured by the essence of the operation of a steam engine.

    Peter Atkins. The Laws of Thermodynamics: A Very Short Introduction. Kindle Edition.

  20. The fourth of the laws (the ‘third law’) has a more technical role, but rounds out the structure of the subject and both enables and foils its applications. Although the third law establishes a barrier that prevents us from reaching the absolute zero of temperature, of becoming absolutely cold, we shall see that there is a bizarre and attainable mirror world that lies below zero.

    Peter Atkins. The Laws of Thermodynamics: A Very Short Introduction. Kindle Edition.

    Interesting. Apparently Damon’s knowledge of the exact microstate disappears at zero degrees. Now if that’s not counter-intuitive, what would be?

  21. Mung,

    Interesting. Apparently Damon’s knowledge of the exact microstate disappears at zero degrees.

    How did you get that idea from the Atkins quote?

  22. Earlier, at walto’s request, I drew up a list of points upon which he and I have disagreed during this thread. It may be a useful reference for people who are following the discussion, so I’ll reproduce it here.

    I assert that:

    1. Entropy is not a measure of energy dispersal.

    2. Entropy is a measure of missing information regarding the microstate.

    3. Entropy is observer-dependent.

    4. Entropy is not a function of the system alone.

    5. Entropy is a function of the macrostate.

    6. The entropy is zero for an observer who knows the exact microstate.

    7. The second law is not violated for such an observer.

    8. Stipulating an observer-independent entropy forces it to be zero for everyone, rendering it useless for entropy comparisons.

    9. The ‘missing information’ interpretation of entropy works in all cases.

    10. Any information that helps us narrow down the exact microstate is “admissible” when calculating entropy.

  23. keiths: How did you get that idea from the Atkins quote?

    Atkins: …the third law establishes a barrier that prevents us from reaching the absolute zero of temperature, of becoming absolutely cold…

    Let’s assume he’s speaking of a physical barrier and not a psychological barrier. Is that a reasonable assumption?

    Let’s assume that this barrier is not simply a matter of what is more or less probable, that the third law doesn’t merely state that it’s just extremely unlikely.

    Then we are left to suppose that Damon has knowledge of a state that is not physically possible. Likewise, a state that is not thermodynamically possible.

    This state, were he to have knowledge of it, would not be a thermodynamic state. Damon’s information would not be “thermodynamic information.” For the very laws of thermodynamics prohibit just such a state.

    Where is the flaw in that line of reasoning?

  24. keiths: Earlier, at walto’s request, I drew up a list of points upon which he and I have disagreed during this thread. It may be a useful reference for people who are following the discussion, so I’ll reproduce it here.

    It would perhaps be helpful if you could make it explicit just where you are talking about thermodynamic entropy.

    For example, if you were to assert that Shannon entropy is not a measure of energy dispersal, I can’t imagine that anyone would disagree.

  25. keiths: 2. Entropy is a measure of missing information regarding the microstate.

    Apply this to the universe. What is the microstate of the universe? Calculate the “missing information regarding the microstate” of the universe.

    3. Entropy is observer-dependent.

    Apply this to the universe. Why will different observers come up with different values for “the entropy of the universe”?

    4. Entropy is not a function of the system alone.

    Apply this to the universe. What other system is there?

    5. Entropy is a function of the macrostate.

    Apply this to the universe. What is the macrostate of the universe? What is the “function of the macrostate” that you believe is applicable to the universe?

    keiths, your concept of entropy is meaningless when applied to the universe. I hope you can see that now.

  26. keiths: 9. The ‘missing information’ interpretation of entropy works in all cases.

    Where’s the mystery in that? By definition, the entropy is the missing information.

  27. keiths: 10. Any information that helps us narrow down the exact microstate is “admissible” when calculating entropy.

    Given that three different “observers” can come up with three different values for the entropy, and the values they come up with are all correct, who cares?

    Assume an infinite number of observers, each with an infinitesimal amount of different information that helps them narrow down the exact microstate, all of which is equally admissible when calculating the entropy.

    A veritable infinity of entropies! How does that help “narrow down” anything?

  28. hi walto,

    I think I understand now.

    Damon_1 knows one microstate that is not the exact microstate. Damon_2 knows two microstates that are not the exact microstate. Damon_3 knows three microstates that are not the exact microstate. Damon_n knows n microstates that are not the exact microstate.

    All this information is “admissible.”

  29. Poor Mung. You’re helpless when you aren’t just cribbing from Ben-Naim or some other author.

    Suppose I weren’t here to help you. How would you go about testing the validity of your claim below?

    Apparently Damon’s knowledge of the exact microstate disappears at zero degrees.

  30. keiths: Suppose I weren’t here to help you.

    Suppose I weren’t here to help you!

    1. You might still think some thermodynamic processes aren’t actually thermodynamic processes.

    2. You might still think Shannon Information is a measure of colloquial information, or worse, that Shannon Information is no different from colloquial information.

    3. You might still think one can speak meaningfully about “the entropy of the universe.”

    4. You might still think entropy is a function of time.

    Keep it up and I may have my own “top ten” list.

  31. Don’t worry, Mung. I’m not going to abandon you.

    But I do want you to at least try. Suppose I weren’t here to help you. How would you go about testing the validity of your claim below?

    Apparently Damon’s knowledge of the exact microstate disappears at zero degrees.

    Think about the explanation you gave, and then look at your claim again.

  32. keiths: How would you go about testing the validity of your claim below?

    I wouldn’t go about testing the validity of that claim, because it’s pretty much irrelevant to the point I was making. The point I wanted to make can be found in the actual argument that I presented, which, by the way, was not cribbed from Ben-Naim or anyone else.

    I suppose you think that if the laws of thermodynamics prohibit such a state, Damon’s knowledge of the state is in fact “thermodynamic information.”

    Did I get that right?

  33. I wouldn’t go about testing the validity of that claim, because it’s pretty much irrelevant to the point I was making.

    Mung, everyone can see when you’re deflecting. 🙂

  34. keiths: Mung, everyone can see when you’re deflecting.

    You can refute the argument or you can’t. If it’s your position that “it’s all thermodynamics” just say so.

  35. Mung,

    Okay, you’re clearly not going to figure this out on your own, so I’ll explain it.

    You said:

    Apparently Damon’s knowledge of the exact microstate disappears at zero degrees.

    Then you said that absolute zero can’t be reached.

    So by your own logic, Damon’s knowledge of the exact microstate doesn’t disappear, because the conditions under which it would disappear are never met.

  36. keiths: So by your own logic, Damon’s knowledge of the exact microstate doesn’t disappear, because the conditions under which it would disappear are never met.

    Conversely, the conditions under which it would appear are never met.

    Damon’s knowledge of the exact microstate doesn’t appear, because the conditions under which it would appear are never met.

  37. keiths: Damon always knows the exact microstate, Mung.

    Of course. What was I thinking? Is there anything Damon doesn’t know? Is there anything Damon cannot know?

  38. Mung,

    Of course. What was I thinking?

    Hell if I know. You’re lost without someone to crib from.

  39. walto,

    Given all of the above, do you finally accept that entropy is observer-dependent? Setting Damon aside for the moment, do you understand that both Xavier and Yolanda are correct about the entropy of the system, and that it is false to say that

    They may both be “correct” as far as they go, but one is more ACCURATE than the other; thus, in another sense closer to being “correct.”

    …?

  40. No. I disagree with your position on that. I’ve said why a number of times. And I’ve heard your “reasons” many more times–perhaps 50 by now. You can post them again if you like, but they’ll still be wrong and largely question-begging, and I won’t be reading them anymore.

  41. walto,

    Your position is inconsistent. You’ve written:

    I think there are, as Jaynes says, many different correct values for entropy. This depends on what variables are considered relevant, what is being looked for.

    That fits nicely with the Xavier/Yolanda situation. Isotopic concentration is relevant for her but not for him. They see different correct entropy values.

    You contradict that by writing:

    They may both be “correct” as far as they go, but one is more ACCURATE than the other; thus, in another sense closer to being “correct.”

    When it comes to thermodynamics, Yolanda is “getting there” as compared with Xavier; Damon may have passed the mark.

    Which is it?
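
    For concreteness, a minimal sketch of how the isotopic information changes the computed number (Python; the mole fractions are hypothetical): Yolanda’s macrostate includes an ideal entropy-of-mixing term that Xavier’s coarser macrostate omits:

    import math

    R = 8.314             # J/(mol*K)
    n = 1.0               # mol of gas, hypothetical
    x_A, x_B = 0.7, 0.3   # hypothetical mole fractions of two isotopes

    # Ideal entropy of mixing, visible only to an observer who distinguishes the isotopes:
    S_mixing = -n * R * (x_A * math.log(x_A) + x_B * math.log(x_B))

    S_xavier_extra  = 0.0        # isotopes indistinguishable to Xavier: no mixing term
    S_yolanda_extra = S_mixing   # ~5.08 J/K more than Xavier's figure
    print(S_xavier_extra, S_yolanda_extra)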

  42. keiths: You’re lost without someone to crib from.

    I seem to have selected some rather good sources though. I suppose you could just attribute that to pure dumb luck.

    🙂

  43. Mung,

    Second Law: Heat Engines

    Second Law of Thermodynamics: It is impossible to extract an amount of heat Q_H from a hot reservoir and use it all to do work W. Some amount of heat Q_C must be exhausted to a cold reservoir. This precludes a perfect heat engine.

    This is sometimes called the “first form” of the second law, and is referred to as the Kelvin-Planck statement of the second law.

    Here is a description that does not involve the word entropy. The issue with the word entropy is that it has multiple definitions. The above description can be tested.
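
    A minimal numeric sketch of that statement (Python; the reservoir temperatures and heat input are hypothetical): even an ideal Carnot engine must exhaust some heat Q_C, because its efficiency is capped at 1 - T_C/T_H, which is always less than 1:

    T_H = 500.0    # K, hot reservoir (hypothetical)
    T_C = 300.0    # K, cold reservoir (hypothetical)
    Q_H = 1000.0   # J of heat drawn from the hot reservoir

    eta_max = 1.0 - T_C / T_H   # Carnot limit = 0.4
    W_max   = eta_max * Q_H     # at most 400 J of work
    Q_C_min = Q_H - W_max       # at least 600 J must be exhausted -- never zero
    print(eta_max, W_max, Q_C_min)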

  44. Mung,

    Here are more definitions of entropy, some of which this group does not agree with.

    Entropy: a state variable whose change is defined for a reversible process at temperature T, where Q is the heat absorbed (delta-S = Q/T).

    Entropy: a measure of the amount of energy which is unavailable to do work.

    Entropy: a measure of the disorder of a system.

    Entropy: a measure of the multiplicity of a system.

    Since entropy gives information about the evolution of an isolated system with time, it is said to give us the direction of “time’s arrow”. If snapshots of a system at two different times show one state which is more disordered, then it could be implied that this state came later in time. For an isolated system, the natural course of events takes the system to a more disordered (higher entropy) state.

    The first question is: when you put the word entropy into the second-law description, do you have a chance of the law being clear?

  45. Obtaining the entropy of an ideal gas:

    It should be noted that this approach is superior to both Clausius’ definition and the Boltzmann definition of entropy. In contrast to Clausius’ definition, which provides only differences in entropy, the SMI approach provides the entropy function itself. Unlike the Boltzmann definition, which requires a lengthy calculation of the number of energy levels (W) of an ideal gas, then calculating the entropy, the SMI approach provides the entropy function directly, without going through the calculation of W.

    – Ben-Naim (2015)

    Sal’s complaints about calculating the entropy using the Boltzmann formula just miss the point entirely.
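
    For comparison, here is a minimal sketch (Python) of the closed-form result such approaches reproduce for a monatomic ideal gas, the Sackur-Tetrode equation, evaluated for a hypothetical case of 1 mol of argon at 298.15 K and 1 atm; this is not Ben-Naim’s SMI derivation itself, just the standard formula:

    import math

    k_B = 1.380649e-23    # J/K
    h   = 6.62607015e-34  # J*s
    N_A = 6.02214076e23   # 1/mol

    # Hypothetical case: 1 mol of argon at 298.15 K and 1 atm.
    m = 39.948e-3 / N_A   # mass of one argon atom, kg
    T = 298.15            # K
    P = 101325.0          # Pa
    N = N_A
    V = N * k_B * T / P   # ideal-gas volume

    lam = h / math.sqrt(2 * math.pi * m * k_B * T)     # thermal de Broglie wavelength
    S = N * k_B * (math.log(V / (N * lam**3)) + 2.5)   # Sackur-Tetrode entropy
    print(S)   # ~155 J/K per mole, close to the tabulated value for argon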

  46. Entropy and the Second Law:

    With the new concept of entropy one could proclaim the general overarching formulation of the Second Law. In any spontaneous process occurring in an isolated system, the entropy never decreases.

    The Second Law is unexplainable within the context of the macroscopic theory of matter.

    In modern thermodynamics one summarizes the Second Law with the statement that there exists a state function, denoted as S and referred to as entropy, which in any spontaneous process occurring in an isolated system always increases. A state function means that when the thermodynamic state is defined – say, by giving the temperature, pressure and composition – the entropy of the system is also defined.

    – Ben-Naim (2015)
