Hello, everybody. In this unit I'll cover the notion of information and its relation to characterizing order and disorder in complex systems. We'll start by discussing the notion of entropy, which is a particular way of characterizing disorder. Then we'll try to make sense of the second law of thermodynamics, and we'll look at the famous paradox called Maxwell's Demon, which links physics to information. Then we'll touch on the field of statistical mechanics, which will give us the foundation for quantifying the notion of information. We'll talk about Claude Shannon's formulation of information and discuss its limits in quantifying information processing in complex systems. Finally, we'll hear from Jim Crutchfield, a physicist at the University of California and the Santa Fe Institute, about his work in applying notions from information theory to understanding complex systems. Let's get started.

In Unit 1 we saw several examples of complex systems in which self-organization appears. For example, we saw how ants are able to collectively produce organized structures such as bridges made up of their own bodies, how termites organize themselves to build intricate nest structures, how neurons become organized into functional units of cognition, how the immune system's trillions of cells organize to defend the body against intruders, and how individuals organize themselves into ecologies, social networks, cities, and complex economies. In Units 2 and 3 we briefly covered the topics of dynamics and fractals, and saw how the iteration of simple rules can lead to both simple and complex behavior, to both order and chaos. But another key to the phenomenon of self-organization is the concept of information. In order to understand self-organization, we need to understand how information is represented, communicated, and processed in complex systems. As the physicist Murray Gell-Mann put it, "Although complex systems differ widely in their physical attributes, they resemble one another in the way that they handle information. That common feature is perhaps the best starting point for exploring how they operate." In this unit, we'll look at some ways in which information can be quantified.

Historically, the mathematical study of information starts with the laws of thermodynamics. The first law simply says that in an isolated system, energy is conserved. An isolated system is one in which no energy can be added from outside the system or escape outside the system. So what do we mean by energy, exactly? Energy is defined in physics as a system's potential to do work. Work has a technical meaning here, but you can think of it in a colloquial way: work is getting stuff done. So, very informally, energy is the potential of a system to get things done. Energy can take on different forms, and it can be transformed from one form into another. Here are a few examples. Energy from sunlight can be transformed into chemical energy in plants via photosynthesis. Electrical energy can be transformed into thermal energy; that's how your electric heaters or stoves work. Chemical energy from food or fuel can be transformed into the mechanical or kinetic energy we use to move, via our muscles or vehicles. Now, according to the first law, in an isolated system, while energy can be transformed among these different forms, the total amount of energy in the system always remains the same. That is what is meant by "energy is conserved."
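To put the first law in a single formula (the symbols below are my own shorthand; they don't appear in the lecture), for an isolated system we can write:

```latex
% First law for an isolated system: energy changes form, but the total is fixed.
E_{\text{total}}(t) \;=\; E_{\text{kinetic}}(t) + E_{\text{potential}}(t)
                   + E_{\text{thermal}}(t) + E_{\text{chemical}}(t) + \cdots
\;=\; \text{constant}
```

Energy can shift between the terms on the right, for example from chemical to kinetic as you pedal a bike, but their sum never changes.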
Now, the second law of thermodynamics says that in an isolated system, entropy always increases until it reaches a maximum value. So what do we mean by entropy? Well, whenever energy is transformed from one form to another, there is nearly always a loss of energy that can't be used for work. This is sometimes called heat loss, and the technical term for it is entropy. For example, suppose you're pushing a bicycle up a steep hill. Your stored energy from calories is being transformed into mechanical or kinetic energy, that is, the motion of the bike and your body. But there is a price to pay: this transformation also results in heat loss from your body. That is energy that can't be used for work; in other words, energy that doesn't contribute to moving the bike and you up the hill. This is entropy produced by the transfer of energy. Entropy can be thought of as a measure of disorder in a system. The second law of thermodynamics then says that disorder in a system always increases until it reaches its maximum value.

Let's illustrate this with a simple example. I'll use one of the NetLogo library models, called GasLab Two Gas. So let's go to Chemistry and Physics, then GasLab, and this one called GasLab Two Gas. OK, I open it up and we click Setup. What this model does is illustrate the behavior of two rooms with gases in them, where a gas is just a collection of moving particles. Each one of these circles is a particle, like a molecule, and the model simulates the physics of a gas. I'm going to run this model with the number of molecules on the right-hand side equal to the number of molecules on the left-hand side. Cyan is the blue color here; magenta is the purple color here. The initial speed will be 30 for the right and 10 for the left, and I'll give each particle mass 8. OK, this will make it a lot simpler. We do Setup, and now these molecules start moving. I've set the initial speed of the blue ones to be faster than the initial speed of the red, or purple, ones. If we look at their speeds you can see that the blue ones have a higher speed and the red ones have a lower speed. So this is an ordered system, because we have all the reddish particles over here and all the bluish particles over here, and they are not mixing.

But what if we allow them to mix by clicking on Open up here? Let's see what happens; I'll speed it up a bit. When particles collide, they exchange energy, so individual particles speed up or slow down. And you can see that as we allow these molecules to move according to the laws of physics in this frictionless gas, very quickly the red and the blue particles start mixing, their average speeds and average energies become equal, and the whole system becomes very disordered. So, if I stop this: we started from a highly ordered, low-entropy system, and once we allow the particles to mix, the system very quickly becomes a disordered, high-entropy system. Now these particles are just moving around, bouncing off each other, and conceivably, by some strange chance, all the blue ones could come back over here onto the right side and all the red ones could come back over here onto the left side. But that is incredibly unlikely. And that's the idea of the second law of thermodynamics: in a system like this, the system will become more disordered; it will not become more ordered unless someone puts some work into making it more ordered. The second law of thermodynamics has some profound implications.
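If you'd like to see the same idea outside of NetLogo, here is a minimal Python sketch, not the GasLab model itself, and all of the names and parameter values are my own assumptions. It starts with slow "red" particles on the left and fast "blue" particles on the right, lets them move freely, and tracks a simple mixing entropy: the Shannon entropy (in bits) of a particle's color given which half of the box it is in. For simplicity the particles pass through one another, so unlike GasLab their speeds never equalize; the sketch only shows the spatial mixing, which climbs from 0 bits (perfectly sorted) toward 1 bit (fully mixed) and, in practice, never goes back.

```python
import random
import math

NUM_PER_COLOR = 200   # particles of each color (assumed value)
BOX_SIZE = 1.0        # box spans x in [0, 1); the dividing wall is "open" from the start
STEPS = 500
DT = 0.01

def make_particles():
    particles = []
    for _ in range(NUM_PER_COLOR):
        # red particles start on the left, moving slowly
        particles.append({"color": "red",
                          "x": random.uniform(0.0, 0.5),
                          "vx": random.uniform(-1.0, 1.0)})
        # blue particles start on the right, moving faster
        particles.append({"color": "blue",
                          "x": random.uniform(0.5, 1.0),
                          "vx": random.uniform(-3.0, 3.0)})
    return particles

def step(particles):
    for p in particles:
        p["x"] += p["vx"] * DT
        # bounce elastically off the outer walls
        if p["x"] < 0.0:
            p["x"], p["vx"] = -p["x"], -p["vx"]
        elif p["x"] > BOX_SIZE:
            p["x"], p["vx"] = 2 * BOX_SIZE - p["x"], -p["vx"]

def mixing_entropy(particles):
    """Entropy of color given side: 0 bits = sorted, ~1 bit = fully mixed."""
    total = 0.0
    for side in ("left", "right"):
        in_side = [p for p in particles if (p["x"] < 0.5) == (side == "left")]
        if not in_side:
            continue
        reds = sum(p["color"] == "red" for p in in_side)
        for count in (reds, len(in_side) - reds):
            if count:
                q = count / len(in_side)
                total -= (len(in_side) / len(particles)) * q * math.log2(q)
    return total

particles = make_particles()
for t in range(STEPS + 1):
    if t % 100 == 0:
        print(f"step {t:4d}  mixing entropy = {mixing_entropy(particles):.3f} bits")
    step(particles)
```

Running it prints the mixing entropy rising quickly toward one bit and staying there, which is exactly the "disorder increases to its maximum" behavior the GasLab demonstration shows on screen.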
First, systems are naturally disordered; they can't become organized without the input of some work. Second, perpetual motion machines are not possible. A perpetual motion machine is one that would create and feed energy back to itself and therefore always be in motion, without the input of additional energy from the outside. That would be a machine in which there is absolutely no heat loss, and the second law of thermodynamics implies this is impossible. And third, time has a direction: the direction of increasing entropy. We can see this in a striking example where we look at a movie of a glass breaking, played backwards. Let's play that again. This is something you can look at and realize very quickly that it is a movie played in reverse. And the reason you know that is because you know that, by the laws of physics, things like broken glasses don't just magically repair and reassemble themselves. You have to put intense work into repairing a broken glass, even though it's very easy to break one. So we have a sense of which direction time flows in by looking at these kinds of movies. This is a fundamental result of the second law of thermodynamics, which in some sense gives meaning to the notion of time.