Of course we have a nice NetLogo model to illustrate our slot machine. This was kindly written for us by our TA,
John Balwit. If I click on reset, it shows me the slot machine, and, we have our three windows, with our fruits,
and I can set the number of pulls that I want---I'm going to set it to one, to start out with, and click on
"Pull Lever N Times". The lever is pulled once, we get a new microstate, pull it again---oh, we get three
of a kind there, very lucky! Ok, so, we can set the macrostate we're interested in. I'm going to set it
as three of the same kind, ok... and, I can ask "how many times will I be likely to see that in, say, 1000
pulls". Well, let's figure that out, and I'm going to do that by writing a note. Ok, so my note is going to be, say,
"Probability of seeing macrostate 'three of the same'"---well, we said that there were five microstates that were included
in that macrostate, that is, cherry cherry cherry, lemon lemon lemon, et cetera et cetera, and there were 125
possible microstates, so that probability is going to be equal to 5 divided by 125---put that all on one line---
five divided by 125, which is equal to .04. So that's the probability, if we pull the lever once, that we'll
see three of the same. Four percent. Therefore, the expected number of times we'll see our
macrostate in 1,000 pulls, that's going to be equal to the probability for one pull times the number of pulls,
which is equal to 40. So let's see how close we are---that's expected, of course there's some randomness
here. So I'm going to reset, and pull the lever 1,000 times. And, we can speed it up a little
bit, perhaps... and we're seeing---it slows down a little bit, when it hits a jackpot, so we can
see the jackpot---but you can see that, here's our number of times it's seeing the goal macrostate---
three of the same kind---and here's the number of times it's seeing the non-macrostate---
this is like the "win" macrostate and the "lose" macrostate---and so
we're experimentally verifying our theory, which says that we'll see it about 40 times---well, 46.
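The arithmetic above, and the 1,000-pull experiment, can be sketched in a few lines of Python (a plain stand-in for the NetLogo model, not its actual code; the transcript names cherry and lemon, so the other three fruit names here are my guesses, and the seed is arbitrary):

```python
import random
from fractions import Fraction

# Five fruits per window; only cherry and lemon are named in the lecture,
# the other three names are placeholders.
FRUITS = ["cherry", "lemon", "orange", "plum", "bell"]

# Analytic probability: 5 "three of the same" microstates out of 5^3 = 125.
p_win = Fraction(5, 5 ** 3)
print(float(p_win))       # 0.04

# Expected number of times we see the macrostate in 1,000 pulls.
expected = p_win * 1000
print(expected)           # 40

# Monte Carlo: pull the lever 1,000 times and count "three of the same".
rng = random.Random(42)   # arbitrary seed, just for repeatability
wins = sum(
    1
    for _ in range(1000)
    if len({rng.choice(FRUITS) for _ in range(3)}) == 1  # one distinct fruit
)
print(wins)  # some number near 40 (the lecture's run happened to give 46)
```

Running the simulation repeatedly and averaging the counts should converge toward the expected value of 40, just as the lecture suggests.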
Of course, if we ran it again and again and again and averaged them all,
hopefully they would come out close to 40. So you'll be able to use this
to do some of the homework problems, which involve some other kinds of macrostates,
which we'll look at a little bit later. Now let's bring all this back to our discussion of
entropy and statistical mechanics. So, remember our NetLogo model, the two-gas model,
where we had two rooms, and one room contained the slow particles, and the other room contained the
fast particles, and we opened up a gap, and they started to mix,
so we got this at the start, and this at the finish, and we said that the entropy here was lower
than the entropy here, that is, by the second law of thermodynamics, entropy
increased. Well, in our new language, of microstates and macrostates, we can say that
the microstate of the system is the position and velocity of every particle---that's kind of like our
position and identity of each of the fruits of the slot machine. Here, we have one macrostate---
all fast-moving particles are on the right, and all slow particles are on the left---and over here
we have another kind of macrostate---fast and slow particles are completely
mixed. Well, if you think about it, our macrostate on the left hand side corresponds to fewer
possible microstates than the macrostate on the right hand side---that is, there are more ways that different
particles could be arranged, in terms of position and velocity, to create a completely mixed,
fast-and-slow particle macrostate, than there are ways in which you can create this more ordered
macrostate. So here, on this side, of course, there are many different ways in which the blue particles
could be individually arranged and the red particles could be individually arranged in order for
all fast particles to be on the right and all slow particles on the left, it's just that
there's fewer such arrangements than there are arrangements in which they're
all mixed. So there are many different places this little red particle could be or this little blue
particle could be so that these are all mixed---and that's true of all of the different particles.
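This counting argument can be made concrete with a toy version of the two-gas model (my own simplification, not the NetLogo code): ignore where a particle sits within a room and just record which room each particle is in. With, say, 10 fast and 10 slow particles, we can count the room-assignments belonging to each macrostate:

```python
from math import comb

N_FAST = N_SLOW = 10  # toy particle counts, not from the lecture

# Macrostate "sorted": all fast particles on the right, all slow on the left.
# Exactly one room-assignment does this.
w_sorted = 1

# Macrostate "evenly mixed": 5 fast and 5 slow particles in each room.
# Choose which 5 of the 10 fast particles go right, and independently
# which 5 of the 10 slow particles go right.
w_mixed = comb(N_FAST, 5) * comb(N_SLOW, 5)

print(w_sorted)  # 1
print(w_mixed)   # 252 * 252 = 63504
```

Even in this tiny system, the mixed macrostate corresponds to tens of thousands of times more microstates than the sorted one; with realistic particle numbers the disparity becomes astronomically larger.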
So that is the statistical mechanics notion of higher and lower entropy, and it
corresponds very well with our intuitive notions of "more disordered" and "more ordered"
states. This gives us a new way to state the second law of thermodynamics. First, our
original way we said that in an isolated system, entropy will always increase until
it reaches a maximum value, but now, we can look at the statistical mechanics version of
the second law, which says that in an isolated system, the system will always
progress to a macrostate that corresponds to the maximum number of microstates.
Well, Boltzmann's definition of entropy is conveniently engraved on his tomb
in Vienna, so no one ever has to forget it, and his definition says, the entropy S
of a macrostate is some number k times the natural logarithm---that's this "log",
the "natural logarithm"---of the number W of microstates corresponding to that
macrostate. Well, k is called Boltzmann's constant---the constant and the logarithm are just
for putting the entropy in particular units. So, you could really look at it as
S going up with W---Boltzmann's entropy grows, in some sense, with the number
of microstates corresponding to the macrostate. So entropy is a measure of a macrostate,
and it measures how many microstates correspond to that macrostate. So the
general idea is that, the more microstates give rise to a macrostate, the more probable that
macrostate is. So our slot machine, the macrostate of "lose" was much more
probable than the macrostate of "win", and we saw that many more microstates
corresponded to the "lose" macrostate than the "win" macrostate. Intuitively, high
entropy just means a more probable macrostate. Or, given our gas example, it's
much more probable that, if the door is open here, the molecules will mix, than that they'll
just stay in, or re-arrange themselves into, this state in which all the fast ones are on the right and
all the slow ones are on the left. It's much more probable that they'll be mixed, and be in this state,
than in this state, so we say that this state has higher entropy than this state.
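Boltzmann's formula is easy to evaluate for the slot machine's two macrostates (here k is the standard Boltzmann constant in SI units; as noted above, the constant and the logarithm just set the units, so using k = 1 would change nothing qualitative):

```python
from math import log

k = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def boltzmann_entropy(W, k=k):
    """Entropy S = k * ln(W) of a macrostate with W microstates."""
    return k * log(W)

# Slot machine: 5 microstates give "win", the other 120 give "lose".
S_win = boltzmann_entropy(5)
S_lose = boltzmann_entropy(120)
print(S_lose > S_win)  # True: the more probable macrostate has higher entropy
```

The same comparison applies to the gas: the mixed macrostate, with vastly more microstates, has the higher entropy.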
We can now do a final restating of the second law of thermodynamics, using
our statistical mechanics terminology, and we can say that, in an isolated system,
the system will tend to progress to the most probable macrostate. Well, this may seem like
a tautology, but actually, it's one of the most profound ideas in all of physics, and it gives meaning
to the notion of "time". You'll find some optional readings suggested on our Course Materials page
that delve much more deeply into this idea than I have time for in this course.