The basic biochemistry textbook I study from is *Lehninger Principles of Biochemistry*. It’s a well-regarded college textbook. But there’s a minor problem regarding the book’s Granville-Sewell-like description of entropy:

The randomness or disorder of the components of a chemical system is expressed as entropy

– Nelson, David L.; Cox, Michael M., Lehninger Principles of Biochemistry (p. 23). W.H. Freeman. Kindle Edition.

Gag!

And from the textbook written by our very own Larry Moran:

enthalpy (H). A thermodynamic state function that describes the heat content of a system.

entropy (S). A thermodynamic state function that describes the randomness or disorder of a system.

Principles of Biochemistry, 5th Edition, Glossary

Laurence A. Moran, University of Toronto

H. Robert Horton, North Carolina State University

K. Gray Scrimgeour, University of Toronto

Marc D. Perry, University of Toronto

Choke!

Thankfully Dan Styer almost comes to the rescue as he criticized ID proponent Granville Sewell:

On 27 December 2005, Professor Granville Sewell’s essay “Evolution’s Thermodynamic Failure” appeared in an opinion magazine titled The American Spectator. The second paragraph of that essay conflates “entropy”, “randomness”, “disorder”, and “uniformity”. The third paragraph claims that “Natural forces, such as corrosion, erosion, fire and explosions, do not create order, they destroy it.” Investigation of this second claim reveals not only that it is false, but also shows the first conflation to be an error.

…

“Disorder” is an analogy for entropy, not a definition for entropy. Analogies are powerful but imperfect. When I say “My love is a red, red rose”, I mean that my love shares some characteristics with a rose: both are beautiful, both are ethereal, both are desirable. But my love does not share all characteristics with a rose: a rose is susceptible to aphid infection, my love is not; a rose should be fed 6-12-6 fertilizer, which would poison my love.

Is Entropy Disorder?

Dan Styer criticizing Granville Sewell

Almost well said. Entropy is often correlated with disorder, but it isn’t the same thing as disorder. Shoe size is correlated with reading ability (i.e., a small shoe size suggests we’re dealing with a toddler, and therefore a non-reader). Perhaps “analogy” is too generous a way to describe the mistake of saying entropy is disorder. But it’s a step in the right direction.

Styer’s comment leads to this rhetorical challenge by physicist Mike Elzinga to creationists:

Where are the order/disorder and “information” in that? Creationists need to explain where Clausius’s coining of the word entropy means that everything tends to disorder and decay.

Mike Elzinga

2lot trouble

Hear, hear.

But why single out creationists for equating entropy with disorder? What about biochemists like Lehninger (or the authors carrying Lehninger’s torch), or Larry Moran, or the founder of statistical mechanics, Ludwig Boltzmann himself, who suggested entropy is disorder?

“In order to explain the fact that the calculations based on this assumption [“…that by far the largest number of possible states have the characteristic properties of the Maxwell distribution…”] correspond to actually observable processes, one must assume that an enormously complicated mechanical system represents a good picture of the world, and that all or at least most of the parts of it surrounding us are initially in a very ordered — and therefore very improbable — state. When this is the case, then whenever two or more small parts of it come into interaction with each other, the system formed by these parts is also initially in an ordered state and when left to itself it rapidly proceeds to the disordered most probable state.”

Boltzmann used a very poor choice of words. He (ahem) made a small mistake in a qualitative description, not a formal definition.

Chemistry professor Frank Lambert comments about Boltzmann’s unfortunate legacy of “disordered most probable states”:

That slight, innocent paragraph of a sincere man — but before modern understanding of q(rev)/T via knowledge of molecular behavior (Boltzmann believed that molecules perhaps could occupy only an infinitesimal volume of space), or quantum mechanics, or the Third Law — that paragraph and its similar nearby words are the foundation of all dependence on “entropy is a measure of disorder”. Because of it, uncountable thousands of scientists and non-scientists have spent endless hours in thought and argument involving ‘disorder’ and entropy in the past century. Apparently never having read its astonishingly overly-simplistic basis, they believed that somewhere there was some profound base. Somewhere. There isn’t. Boltzmann was the source and no one bothered to challenge him. Why should they?

Boltzmann’s concept of entropy change was accepted for a century primarily because skilled physicists and thermodynamicists focused on the fascinating relationships and powerful theoretical and practical conclusions arising from entropy’s relation to the behavior of matter. They were not concerned with conceptual, non-mathematical answers to the question, “What is entropy, really?” that their students occasionally had the courage to ask. Their response, because it was what had been taught to them, was “Learn how to calculate changes in entropy. Then you will understand what entropy ‘really is’.”

There is no basis in physical science for interpreting entropy change as involving order and disorder.

So in slight defense of Granville Sewell and numerous other creationists from Henry Morris to Duane Gish to Kairos Turbo Encabulator Focus for equating entropy with disorder, when we ask “where did they get those distorted notions of entropy?”, we need look no further than many biochemists and textbook authors from reputable universities, and unfortunately one of the fathers of statistical mechanics, Ludwig Boltzmann himself!

Here is a list of chemistry books that treat entropy correctly:

EntropySite.Oxy.Edu

April 2014

The 36 Science Textbooks That Have Deleted “disorder” From Their Description of the Nature of Entropy

…..

Lehninger and Moran’s books aren’t on that list. Their books however did make the list of biochem books judged by some Australians as decreasing in fitness:

The initial impetus for this research was the discovery by the authors of a variety of common and consistent errors and misconceptions in pedagogical literature on the topic of thermodynamics in Biochemistry. A systematic survey was undertaken of material on thermodynamics in Biochemistry textbooks commonly used in Australian Universities over the period from the 1920s up to 2010.

..

Lehninger, Nelson and Cox

….

Garrett and Grisham

…

Moran and Scrimgeour

….

Jeremy M. Berg, John L. Tymoczko, Lubert Stryer in 2002

…

A few creationists got it right, like Gange:

Entropy and Disorder from a Creationist Book Endorsed by a Nobel Laureate

Here is chemistry professor Frank Lambert’s informal definition of entropy:

Entropy is simply a quantitative measure of what the second law of thermodynamics describes: the dispersal of energy in a process in our material world. Entropy is not a complicated concept qualitatively. Most certainly, entropy is not disorder nor a measure of chaos — even though it is thus erroneously defined in dictionaries or pre-2002 sources.

This is the correct Ludwig Boltzmann entropy as written by Planck:

S = k log W

where

S = entropy

k = Boltzmann’s constant

W = number of microstates
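The formula above is simple enough to evaluate directly. A minimal sketch in Python — note that the “log” in the Boltzmann–Planck formula is the natural logarithm, and the value of W below is made up purely for illustration:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant in J/K (CODATA value)

def boltzmann_entropy(w: float) -> float:
    """S = k ln W: entropy of a system with W equally probable microstates."""
    return K_B * math.log(w)

print(boltzmann_entropy(1))     # 0.0 -- a single microstate means zero entropy
print(boltzmann_entropy(1e23))  # ~7.3e-22 J/K
```

The W = 1 case is worth noticing: when only one microstate is possible, the entropy is exactly zero, a point that comes up again later in the thread.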

Also there is Clausius:

delta-S = Integral (dq/T), with the integral taken along a reversible path

where

delta-S = change in entropy

dq = inexact differential of q (heat)
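As a concrete (and idealized) illustration of the Clausius definition: for reversible heating with dq = m·c·dT and a specific heat c treated as constant, the integral evaluates to m·c·ln(T2/T1). The numbers below are assumptions chosen for the example, not anything from the post:

```python
import math

def clausius_delta_s(m_kg: float, c: float, t1_k: float, t2_k: float) -> float:
    """delta-S = Integral(dq/T) for reversible heating with dq = m*c*dT,
    which integrates to m*c*ln(T2/T1) when c is constant."""
    return m_kg * c * math.log(t2_k / t1_k)

# Assumed example: heating 1 kg of water from 293 K to 373 K,
# taking c ~ 4186 J/(kg*K) as roughly constant over that range.
print(clausius_delta_s(1.0, 4186.0, 293.0, 373.0))  # ~1010 J/K
```

This is the sense in which the Clausius entropy is “conceptually accessible”: it needs nothing more than a thermometer and a heat measurement.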

As Dr. Mike asked rhetorically: “Where are the order/disorder and “information” in that?”

My answer: Nowhere! Most creationists (except guys like me and Robert Gange) don’t realize it’s nowhere.

Mung:

The entropy of the system, of course, as seen from the observer’s point of view.

Now back to your claim:

Why not?

How many are required, and why?

Why not?

https://en.wikipedia.org/wiki/Thermodynamic_system

keiths, can you tell us what the values of the thermodynamic state variables are for your thermodynamic system consisting of one molecule?

For the reader’s benefit, it may be worth reviewing why the Clausius and Boltzmann definitions are both important even though they are expressed in totally different ways. The Clausius definition is expressed in terms of heat and temperature, both of which are conceptually accessible. The Boltzmann definition, in contrast, is expressed in terms of esoteric concepts: 6-dimensional microstates or quantum mechanical microstates.

From the OP:

Taking some liberties, I can tie Boltzmann and Clausius this way:

delta-S = kB (ln W_final – ln W_initial) = Integral (dQ/T)

[fwiw, I’m now using a capital “Q” for “dQ” instead of “dq”, not that it really makes a difference in the ultimate scheme of things.]
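One textbook case where both sides of that equation can be computed independently is the isothermal doubling of an ideal gas’s volume: each molecule’s accessible volume doubles, so W_final/W_initial = 2^N, while the Clausius route gives n·R·ln(V2/V1). A sketch, with the one-mole scenario being my own choice of example:

```python
import math

K_B = 1.380649e-23   # Boltzmann's constant, J/K
N_A = 6.02214076e23  # Avogadro's number, molecules per mole
R = K_B * N_A        # gas constant, ~8.314 J/(mol*K)

n = 1.0          # moles of ideal gas (assumed)
N = n * N_A      # number of molecules

# Boltzmann side: delta-S = kB*(ln W_final - ln W_initial) = kB * N * ln 2,
# since W_final/W_initial = 2^N when each molecule's volume doubles
dS_boltzmann = K_B * N * math.log(2)

# Clausius side: reversible isothermal expansion, delta-S = n*R*ln(V2/V1)
dS_clausius = n * R * math.log(2)

print(dS_boltzmann, dS_clausius)  # both ~5.76 J/K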

What this enables us to do is use thermometers and calorimeters to probe how molecules may or may not connect and how they move at the molecular level.

Linus Pauling was able to use the Clausius entropy to confirm the validity of a particular molecular model of ice and the hydrogen bonds of ice that had an associated theoretical Boltzmann entropy. The theoretical Boltzmann entropy was reasonably close to the measured Clausius entropy and hence supported Pauling’s model (which has since been confirmed).
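For the curious, the Boltzmann side of Pauling’s ice calculation is simple enough to reproduce. His counting argument was that the “ice rules” permit roughly (3/2)^N hydrogen configurations, giving a residual entropy of R·ln(3/2) per mole; the measured calorimetric (Clausius) residual entropy is about 3.4 J/(mol·K). A sketch of that comparison:

```python
import math

R = 8.314  # gas constant, J/(mol*K)

# Pauling's counting argument: W ~ (3/2)^N allowed hydrogen configurations,
# so the residual molar entropy is R * ln(3/2)
s_pauling = R * math.log(1.5)
print(s_pauling)  # ~3.37 J/(mol*K), close to the measured ~3.4 J/(mol*K)
```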

At the time, they probably could not see individual molecules and connections. Maybe that is still the case today.

Most if not all of those diagrams of the molecular structure of ice are by inference, not usually by actual photographs of molecules in ice, and that would have definitely been true when Pauling published his paper on ice in 1935.

Diagrams like those below are not deduced from photos of ice molecules, but by inference. The Clausius integral’s equivalence to the Boltzmann entropy equation makes it possible to use heat and temperature data to investigate molecular structure. Together with other techniques like x-ray crystallography, we can use heat and temperature data to elucidate things like the molecular structure of ice.

That’s why I took exception to this ridiculous comment from the DNA_Jock school of thermodynamics:

Personally I don’t find any of them to be conceptually inaccessible and neither does Salvador, as his light bulb moment with Lambert attests.

http://entropysite.oxy.edu/microstate/

Mung,

The single-molecule system has everything you need. You know the macrostate, which is compatible with an ensemble of microstates. You don’t know the exact microstate.

The entropy is a measure of the information gap — the additional information needed to pin down the exact microstate.

Mung,

We can choose exactly the same parameters as we did before: kinetic energy, volume, and number of particles. Think about it.

It fits perfectly with the ‘missing information’ view of entropy, but it doesn’t work at all with the ‘energy dispersal’ view.

The same can be said of tossing a coin.

Mung,

I’m still interested in whether you can support your claim:

Why not? Please present an argument.

If one molecule isn’t enough, then how many are required, and why?

keiths, I have posted a number of relevant quotes and links, and asking me for a specific number of molecules is a red herring.

Liken it to you arguing over the law of large numbers or the law of averages and me asking you for the specific number where the law kicks in.

Meanwhile I’m awaiting your measurements of the thermodynamic properties of your “thermodynamic system” consisting of one single particle.

Mung,

What you haven’t posted is an argument in support of your claim. Do you have an argument, or do you concede that your claim is incorrect?

No, it’s not the same. The LLN is a tendency that becomes apparent as you increase the number of samples. Entropy is fully established with the very first molecule in the system. If you disagree, please present an argument — an actual argument — to that effect.

I already explained that the parameters are the same as what we were using earlier: kinetic energy, volume, and number of molecules. In this case the kinetic energy of the system is equal to the kinetic energy of the single molecule, because nothing else is available to carry any kinetic energy. That’s why the ‘energy dispersal’ view of entropy fails. The volume is the volume of the container, and the number of molecules is one.

There are zillions of microstates compatible with that macrostate.

W is large. Plug it into S = k log W and you have the thermodynamic entropy in J/K.

This is true. But what’s good for the goose is good for the gander. Or so they say.

If I do not have an argument, it does not follow that my claim is incorrect. Just take some time to reflect on all the times you’ve been asked for your argument.

Still waiting for you to support that claim. If your only support for it is that I haven’t presented any argument against it, well …

I disagree. I’ll try to do a better job of making it clear why I disagree, as the posting of quotes and links in support of my position seems to have no effect.

What you have not done is give the values of those parameters and how you arrived at those values. Do you understand that claiming your thermodynamic system has one molecule is insufficient to specify the system?

So you counted all the microstates and came up with “zillionz”? And you defined the macrostate, how, exactly? And you determined that the “zillionz” of microstates are compatible with that macrostate how, exactly?

What are the relevant probability distributions? Can you at least answer that?

Oh, good. Progress. I thought you had claimed that the unit of thermodynamic entropy was the bit.

A microstate has no entropy – despite the fact that it has a defined energy! Entropy is solely associated with the macrostate… The concept of the microstate as such is supported by all the physical knowledge in our possession, so we can construct it in our mind. But we cannot realise it by observation, as little as nature can manifest all its possible constellations.

– Manfred Eigen

Mung, quoting Eigen:

Eigen is right. You’ve been struggling with the concepts of microstates and macrostates throughout the thread, Mung, so please focus on this. Once you understand what macrostates and microstates actually are, then you’ll be in a position to see that a system as simple as one helium molecule in a container can have an entropy.

This comment bears repeating:

Keeping the generality of “microstate” and “macrostate” in mind, we can make some observations:

1. The system is in exactly one microstate at a time.

2. We often don’t know the exact microstate. Whatever information we do have about it constitutes the macrostate.

3. The entropy is the gap between the information we actually have — the macrostate — and the information that is required to specify the exact microstate.

4. A microstate can be thought of as a single point in “microstate space”.

5. A macrostate can be thought of as a region of microstate space, encompassing all of the microstates that are compatible with what is actually known about the system’s microstate.

6. A macrostate can be as large as the entire microstate space or as small as a single microstate. It depends on what you know. Entropy is observer-dependent.

7. If the microstates in question are thermodynamic, then the entropy in question is thermodynamic. Any information that narrows down the possible thermodynamic microstates is thermodynamic information.

8. If the microstates in question are not thermodynamic, then the entropy in question is not thermodynamic. Thermodynamic entropy is just a particular kind of Shannon entropy in which the microstates are thermodynamic.

Have I now. You’re the one who introduced Damon, who knows the exact microstate.

When have I ever applied the concept of entropy to any microstate?

According to you, a simple flip of a coin has an entropy.

I have, throughout this thread, consistently attempted to maintain the distinction between Shannon entropy (aka information entropy) and thermodynamic entropy, a distinction which you have rarely, if ever, been willing to acknowledge.

Given a single molecule that can be in one of two “macrostate” locations V1 or V2, both of which are equally probable, one can calculate “the coin tossing entropy.”

As walto pointed out, a single helium molecule can have an entropy. But you’re not talking about the entropy of the molecule, you’re talking about the entropy of some system, which, as far as I can tell so far, exists only in your mind. You’re offering up as evidence for your position a “thought experiment” consisting of the location of a single molecule located somewhere in a hypothetical volume, and that’s fine as far as it goes. But then you insist that this system is a thermodynamic system, a claim that I find objectionable.

The question is not whether one can calculate “an entropy,” the question is whether the amount calculated qualifies as “thermodynamic entropy.”

You’ve imagined a single molecule of helium in an as yet unspecified volume. You remove a constraint and double the volume, and the thermodynamic entropy of the system has changed by how much, and why, and in what units?

keiths:

Mung:

Yes. Here, for example. You wrote:

Your mistakes there were that you:

1) Failed to recognize that Damon’s information about the microstate is his macrostate. As I noted above:

2) Failed to recognize that entropy — a measure of missing information — can be zero if there is no missing information. Obviously.

We never know the exact microstate and if certain theories are correct we cannot know the exact microstate.

Your claim that our knowledge of the microstate constitutes the macrostate flies in the face of your prior claims in this thread and makes them circular.

What is the microstate of a pair of dice? What is the macrostate of a pair of dice? Calculate the entropy, which, according to you, is the difference between the two.

Oh wow. So in support of his claim that I “have been struggling with the concepts of microstates and macrostates throughout the thread,” keiths offers up his Damon.

It seems to have escaped his attention that I was demanding that he recognize the distinction between the two, something he still seems to be struggling with.

My position is that the microstate is not the macrostate.

The position of keiths seems to be that for certain Damons, the microstate is the macrostate. Or, perhaps the position of keiths is that for certain Damons, the information about the microstate is the same as the information about the macrostate.

Who knows.

Finally, statistical mechanics offers us the immense intellectual satisfaction of rendering transparent what thermodynamics leaves opaque; that is, statistical mechanics casts a brilliant light of interpretation throughout most of the realm in which classical thermodynamics can afford us little more than phenomenological description.

– Nash, Leonard K., Elements of Statistical Thermodynamics

Poor Salvador.

Mung,

I’d be happy to walk you through the calculations, but I expect you to do the work. I already showed you how it’s done in the card deck scenario, but to no avail. If you do the calculations this time, the lessons might stick.

If you get stuck, don’t worry. I’ll help you out.

To make it easier for you, let’s specify that one die is red and the other die is green, and that you aren’t colorblind. A microstate can then be specified in terms of which face is up on each die — for example, a ‘4’ on the red die and a ‘3’ on the green would be one possible microstate.

How many possible microstates are there, and what are they? Show your work.

keiths, if your name is Damon, the macrostates don’t matter.

4-3 is no different from 3-4, and neither of them is different from 1-1. They are all equally likely, and all that matters is the microstate. Once Damon knows the microstate, Damon knows the macrostate (absurd, but true).

Why on earth are you trying to introduce different colors? The probability distribution of a pair of fair dice is well known, regardless of color, and even if you color the dice it does not change the fact that all the possible microstates are equally likely.

All you’ve done is give additional credence to the claim that you can’t get to the macrostate from the microstate.

Mung,

How many possible microstates are there, and what are they? Show your work.

Then we can move on to the next step.

For a pair of dice, or for a single molecule in a volume V1?

For a single die, there are 6 possible microstates. We’ll assume they are all equally likely. A question for you, what is the probability distribution?

For a pair of dice, there are 6^2 possible microstates. We’ll assume they are all equally likely. A question for you, what is the probability distribution?

Let’s assume that each of the 36 possible outcomes of a toss of a pair of dice is equally likely. Seems reasonable, yes? Given that each outcome has an equivalent probability, how does one derive the macrostate from the microstate? That’s like saying that every macrostate is equally likely, and yet we know that is wrong.

Now, how many possible microstates are there for your single molecule of helium in your as yet unspecified “thermodynamic system”? Show your work.

Mung:

Right. For each of the six possible values on the red die, there are six possible values on the green die, for a total of 36.

Now suppose the dice are fair and you are blindfolded. You give the dice a good shake and throw them onto a craps table. No one tells you anything about the result.

What is the entropy?

Mung,

Do you need a hint?

keiths,

– information entropy or

– thermodynamic entropy?

Either that or you’ll have to wait until I have time to do some more reading. 🙂

log2 36 = 5.17
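That figure is easy to verify by brute force. A quick sketch that enumerates the microstates of a red die and a green die:

```python
import math
from itertools import product

# Each microstate is a (red, green) pair of face values
microstates = list(product(range(1, 7), repeat=2))
W = len(microstates)

print(W)             # 36
print(math.log2(W))  # ~5.17 bits, when all 36 microstates are equally likely
```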

colewd,

Dice entropy. Notice how we defined the microstate:

Mung,

Right. The macrostate is “a random throw of the two dice”, and all 36 possible microstates are compatible with that macrostate, so W is 36. They are equiprobable, so we can do a Boltzmann-style entropy calculation instead of the more complicated Gibbs:

S = log2 36 ≈ 5.17 bits

Now suppose that after you throw the dice, you get some feedback from an honest and accurate observer.

What is the entropy in each of the following cases?

a) She tells you that the sum of the numbers on the two dice is greater than eight.

b) She tells you that the number on the green die is one.

c) She tells you that the sum is twelve.

Then she asks you how much her head weighs.

I hope you’re following along, walto. This is a chance for you to understand the observer-dependence of entropy, if you haven’t already quietly changed your position.

You kidding? I wouldn’t miss this show for the world!

The macrostate is the state of a macroscopic material system, depending on certain macroscopically measurable variables. In relation to the microstate it is a construction of our mind; it does not exist as such and therefore cannot be “observed”.

– Manfred Eigen

I believe the question to be misguided, but my answer is the same to all three scenarios.

The entropy is 3.27 bits/symbol. And it doesn’t depend on your honest and accurate observer, it depends on the probability distribution.

Thermodynamics is a funny subject. The first time you go through it, you don’t understand it at all. The second time you go through it, you think you understand it, except for one or two small points. The third time you go through it, you know you don’t understand it, but by that time you are so used to it, it doesn’t bother you anymore.

– Arnold Sommerfeld

Mung,

You didn’t hesitate to answer it for the first of the four scenarios, in which you got no feedback about the result of the dice throw. What makes it “misguided” when you are getting feedback?

That isn’t consistent with your original answer:

Hint: You got the correct answer because you took the log of W, the number of possible microstates compatible with the macrostate. W in this case was 36 because you received no feedback about the results of the throw. That is, the macrostate was simply “a random throw of the two dice”, and each of the 36 microstates was a live possibility.

What is W in each of the following scenarios?

a) You throw the dice and your friend tells you that the sum of the numbers on the two dice is greater than eight. How many microstates are compatible with that macrostate?

b) She tells you that the number on the green die is one. How many microstates are compatible with that macrostate?

c) She tells you that the sum is twelve. How many microstates are compatible with that macrostate?

ok, so my first answer was probably wrong. 🙂

Also, as I stated previously:

Where on earth are you getting your macrostates? Because you aren’t getting them from the possible microstates.

Here’s what you wrote earlier:

Given a pair of dice we do know the exact microstates that are possible. So I ask again, what’s the macrostate of a pair of dice?

Is there some reason I am answering your questions but you are not answering mine?

Do you understand how I came up with 3.27 bits/symbol or do you need a hint?

🙂

Mung,

Yes. You’re confused right now, and I’m trying to guide you toward understanding rather than letting you compound your confusion.

Yes, you cribbed it. That’s the danger of cribbing — you didn’t understand what you were cribbing, so you gave the wrong answer.

Whatever relevant information you have about the actual microstate constitutes the macrostate.

In the first scenario, all you know is that the dice were thrown randomly. The macrostate can be expressed as “a random throw of the two dice”.

In the second scenario, you know that the dice were thrown randomly and that the sum is greater than eight. The macrostate can be expressed as “a random throw of the two dice, with the resulting sum being greater than eight”.

There are 36 possible microstates for the pair of dice. All 36 of them are compatible with the first macrostate, but not all 36 are compatible with the second — only those in which the sum is greater than eight. Some of the 36 possible microstates have been ruled out. The missing information, and hence the entropy, has been reduced.

See if you can calculate the entropy for each of the macrostates I described. Hint: Figure out what W is in each case, then compute the entropy from that.

I can do those calculations, but I disagree with you that I am calculating the entropy of a pair of dice when I do those calculations.

The entropy is 3.27 bits/symbol. And it doesn’t depend on my being blindfolded or not, or whether someone tells me the result. Hint: it depends on the probability distribution.

Now this is simply hilarious. Even walto is going to be able to see the self-contradicting absurdity given your previous statements about entropy being observer-dependent.

Who is wrong? Yolanda, Xavier, or Damon? Why, none of them are wrong! But Mung, he is wrong. 😀

Yeah I went back and studied some more. It doesn’t mean I don’t understand it. You’re confused right now, and I’m trying to guide you toward understanding rather than letting you compound your confusion.

How did I arrive at 3.27 bits? It actually does involve a calculation, you know. One involving a pair of dice. I’m actually even writing code for it. Perhaps I’ll write an OP.

36. W doesn’t change. In particular, it doesn’t change just because someone observes a different outcome.

All the microstates are compatible with what was observed. Else the game of craps would hardly be possible.

Here’s what you want me to say:

a: 10 – (3,6), (6,3), (4,5), (5,4), (4,6), (6,4), (5,5), (5,6), (6,5), (6,6)

b: 6 – (1,1), (1,2), (1,3), (1,4), (1,5), (1,6)

c: 1 – (6,6)

I’d love to see where you’re going with this.

Do you say that the entropy of (c) is = 0 because the exact microstate is known from the macrostate? Why did she choose to SUM the number of dots rather than multiply them?

In fact, all of your examples rely on summing the values shown on the dice. Why?

I say that the summing of the values is an objective feature of the dice and “observer-dependence” has nothing to do with it.

The word arbitrary springs to mind.

I failed to make a distinction between outcome and outcome value. Observing an outcome after the fact does not change the underlying probability distribution and does not change the entropy calculated for that distribution.

The fact that (6,6) was observed after the throw of a pair of dice does not change the uncertainty (entropy) of an experiment involving tossing a pair of dice. The odds of throwing a 12 are still 1 in 36. The probability of throwing a 12 when tossing a pair of dice doesn’t change, it’s still 1/36.

The probability distribution for the throw of a pair of dice doesn’t change every time the dice are rolled, and neither does the entropy.

– Nash, Leonard K., Elements of Statistical Thermodynamics

Hardly a single particle.
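For the record, both sides’ numbers check out; the disagreement is over which distribution to use. Mung’s 3.27 bits/symbol is the Shannon entropy of the distribution over the sums of two dice, while keiths’ W values count the microstates compatible with each piece of feedback. A sketch verifying both:

```python
import math
from collections import Counter
from itertools import product

microstates = list(product(range(1, 7), repeat=2))  # (red, green)

# Mung's number: Shannon entropy of the *sum* distribution,
# a fixed property of the dice that no observation changes
sum_counts = Counter(r + g for r, g in microstates)
h_sum = -sum((n / 36) * math.log2(n / 36) for n in sum_counts.values())
print(round(h_sum, 2))  # 3.27 bits/symbol

# keiths' numbers: microstates compatible with each feedback macrostate
w_a = sum(1 for r, g in microstates if r + g > 8)    # sum greater than eight
w_b = sum(1 for r, g in microstates if g == 1)       # green die shows one
w_c = sum(1 for r, g in microstates if r + g == 12)  # sum is twelve
print(w_a, w_b, w_c)  # 10 6 1
```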

Mung,

See if you can follow my argument:

1. You’re blindfolded. You vigorously shake a standard pair of fair dice, one red and one green, and toss them onto a craps table. They come to rest in one particular configuration: there’s a six on the red die and a three on the green die. That is the exact microstate.

2. The dice have been thrown, and they landed with a six on the red die and a three on the green die. It’s over and done. The metaphysical probability of (red=6, green=3) is now one. That’s how things turned out; the other microstates are no longer live possibilities. Their metaphysical probabilities are now zero.

3. You don’t know that. No one has told you anything about the result of the throw, and you are still blindfolded. As far as you know, the dice could be in any of the 36 possible microstates, with equal epistemic probability. W for you is 36, and the entropy is log2 36 ≈ 5.17 bits.

Even though there is one exact microstate, with a metaphysical probability of one, you don’t know which microstate that is. The entropy you calculate is not a function of the metaphysical probability distribution, but of the epistemic probability distribution, which for you is currently a distribution that assigns a probability of 1/36 to each of the 36 possible microstates.

4. Now suppose your accurate, honest friend tells you that the sum of the numbers on the two die faces is greater than eight. The metaphysical probabilities haven’t changed. The metaphysical probability of (red=6, green=3) is still one, and the metaphysical probabilities of the other microstates are zero.

The epistemic probabilities have changed dramatically, however. Now you know that the microstate must be one of the following, in (red, green) format:

(4,5), (4,6),

(5,4), (5,5), (5,6),

(6,3), (6,4), (6,5), (6,6)

Before there were 36 epistemic possibilities. Now there are only 10. The entropy is now reduced to log2 10 ≈ 3.32 bits.

You gained some relevant information when she told you that the sum was greater than eight. There’s less missing information, and hence less entropy.

5. Finally she tells you that the number on the green die is a three. You’re down to one possible microstate: (6,3). The other microstates have all been ruled out.

W for you is now one, and the entropy is log2 1 = 0 bits.

The information she gave you was enough to pin down the exact microstate. There is no longer any missing information, so the entropy is zero.

Entropy is observer-dependent, and it decreases as the observer gains relevant information about the actual microstate.
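The walkthrough above can be reproduced numerically. Treating the remaining candidate microstates as epistemically equiprobable at each step, the entropy is log2 W; a sketch:

```python
import math
from itertools import product

candidates = set(product(range(1, 7), repeat=2))  # (red, green); W = 36

def entropy_bits(states):
    """Missing information in bits when all remaining candidate
    microstates are epistemically equiprobable: log2 W."""
    return math.log2(len(states))

print(entropy_bits(candidates))  # ~5.17 bits: blindfolded, no feedback

# Feedback 1: the sum is greater than eight
candidates = {s for s in candidates if s[0] + s[1] > 8}
print(len(candidates), entropy_bits(candidates))  # 10, ~3.32 bits

# Feedback 2: the green die shows a three -> only (6, 3) remains
candidates = {s for s in candidates if s[1] == 3}
print(candidates, entropy_bits(candidates))  # {(6, 3)}, 0.0 bits
```

Each piece of relevant feedback shrinks the candidate set, and the computed entropy falls with it, which is the observer-dependence being argued for.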