In this video I want to draw out and highlight some of the key themes and realizations from the study of dynamical systems, some of the things that I think are most important and interesting. And I should mention that I don't think there's one single lesson or set of lessons that comes from dynamical systems. In particular, I think the importance of dynamical systems and the realizations that come from it will be different in different fields. Certainly what's important to a mathematician might be different from what's important to a biologist or an economist. I should also mention that there's certainly not universal agreement on what those important things are. So one of the goals of this course is to give you a mathematical grounding in the basic phenomena of chaos and strange attractors and the like, so you can form your own opinions about all this. Nevertheless, I hope these thoughts or comments are of interest, and at least give you something to think about. So, let's see. One of the things that I think is most important about the study of dynamical systems is the realization that simple dynamical systems, like the logistic equation or the Rössler equations, do not possess, or do not necessarily possess, simple dynamical properties. Another way I like to think about this is that there can be a difference between the nature of a process and the nature of the thing it produces. For example, the logistic equation is a deterministic equation. It's a simple iterative rule, about as simple and almost as boring as can be, but this simple deterministic equation produces aperiodic and random, or seemingly random, output. So there's this distinction between the characteristics of the generating process, in this case a simple deterministic equation, and its output, which in this case is an aperiodic, in a sense random, sequence. I think this has all sorts of important implications, some of which we talked about in Unit 3.
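To make this concrete, here is a minimal Python sketch of the point above. The parameter r = 4.0 and the seed x0 = 0.2 are illustrative choices; the logistic map is a one-line deterministic rule, yet its output wanders aperiodically and looks random.

```python
# The logistic map: x_{n+1} = r * x_n * (1 - x_n).
# A deterministic one-line rule; r = 4.0 and x0 = 0.2 are illustrative choices.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

def orbit(x0, n, r=4.0):
    """Return the first n+1 points of the trajectory starting at x0."""
    xs = [x0]
    for _ in range(n):
        xs.append(logistic(xs[-1], r))
    return xs

traj = orbit(0.2, 100)
# Every value stays in [0, 1], but at r = 4 the sequence never settles
# into a fixed point or a short cycle: simple rule, complicated output.
```

Printing `traj` shows values bouncing around the unit interval with no visible period, which is exactly the process/output mismatch described above.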
So, for example, if you see a process that's behaving apparently at random, with wild fluctuations or something of the sort, you might think, "Oh, there must be some external process going on. It's not following a simple rule. It's being moved about at random." And that could be the case. But it also could be the case that there are very simple intrinsic dynamics producing that apparently random behavior. You can have a system that's rule-bound and described in simple terms but behaves in ways that are seemingly opposite to that. Another example where we can have behavior that seems opposite to that which is producing it is in the study of bifurcations. When we looked at the logistic equation with harvest, we saw that there's a stable population, and as you increase harvest the stable population decreases. But there is a critical point, a critical harvest rate, and if you increase the harvest just a little bit beyond that rate, the stable population doesn't go slowly down to zero; it just disappears. It's a sudden, discontinuous change. The differential equation itself is perfectly continuous, but the behavior of the equilibrium points, or fixed points, can be discontinuous. So if you see a system that has a sudden change, say you had a steady population of 100 and a little while later the population is just gone, you might very logically expect that some external influence, and some large external influence, must have been required to move that population from 100 to 0 all of a sudden. And indeed, that could be the case. But dynamical systems also shows us that there could be an intrinsic cause for that, internal to the dynamics of the system. In particular, it could be that a very small change in a parameter leads to a very big change in the equilibrium value of the population. So a continuous system, a differential equation, can have discontinuities built into it.
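Here's a hedged sketch of that picture in code. The model and the numbers are illustrative: logistic growth with a constant harvest rate h, dx/dt = r*x*(1 - x/K) - h, whose equilibria exist only up to the critical harvest rate r*K/4 and then vanish all at once.

```python
import math

def equilibria(r, K, h):
    """Fixed points of dx/dt = r*x*(1 - x/K) - h.
    Setting the right-hand side to zero gives the quadratic
    (r/K)*x**2 - r*x + h = 0, which has real roots only while
    h <= r*K/4, the critical harvest rate."""
    disc = r * r - 4.0 * (r / K) * h
    if disc < 0:
        return []                      # past the critical rate: no equilibria at all
    s = math.sqrt(disc)
    return [(r - s) * K / (2.0 * r), (r + s) * K / (2.0 * r)]

# Illustrative numbers: r = 1, K = 100, so the critical harvest rate is 25.
below = equilibria(1.0, 100.0, 24.99)  # two equilibria, both near K/2 = 50
above = equilibria(1.0, 100.0, 25.01)  # a hair more harvest: none at all
```

Note the discontinuity: just below the critical rate the stable population sits near 50, and an arbitrarily small increase in h makes every equilibrium disappear, even though the differential equation itself is perfectly continuous.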
Again, the process generating a phenomenon is in some ways opposite in character to the phenomenon it generates. Another example of this idea I'm trying to describe is pattern formation, which we talked about in the last unit. You can have a simple dynamical system, spatially extended, but local in the sense that the future value of each point is a function only of what is going on at that point, and you can even add diffusion into it, a force, so to speak, that tends to smooth everything out. Even in the presence of all that, a local diffusive rule with some reaction terms in it as well, one can get stable spatial structures. These patterns come seemingly from nowhere, from an almost perfectly homogeneous initial condition; diffusion doesn't smooth everything out but can instead make patterns emerge. So if one sees a patterned outcome, in physics or biology or chemistry or economics, it could be that somebody made that, drew it up like one would make blueprints for a house. Or it could be that that is just what the system does: without any designer or direction from above, it self-organizes. Certain types of patterns can be formed spontaneously. So again we see an example where the process generating a phenomenon, in this case a local deterministic rule, is in some ways opposite to the outcome, this global stable pattern. I think one of the key themes, key ideas, from dynamical systems is this separation between a process's output, what that output looks like, its characteristics, and the process that generates it. That's an important realization for all of science, and certainly for the study of complex systems. Another important realization that comes from the study of dynamical systems is the idea that order and disorder are not necessarily opposites; they intermingle and coexist in a lot of interesting ways.
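A small sketch of the mechanism behind this, Turing's diffusion-driven instability. The Jacobian entries and diffusion constants below are illustrative numbers chosen to satisfy the standard Turing conditions, not taken from any particular chemical or biological model: the reaction alone is stable (uniform perturbations decay), yet once the inhibitor diffuses much faster than the activator, perturbations at some finite wavelength grow, which is how a near-homogeneous state can spontaneously develop structure.

```python
import math

# Linearized two-species reaction-diffusion system:
#   du/dt = FU*u + FV*v + DU * u_xx
#   dv/dt = GU*u + GV*v + DV * v_xx
# Illustrative coefficients satisfying the Turing conditions:
FU, FV = 1.0, -1.0    # activator: self-enhancing, suppressed by v
GU, GV = 2.0, -1.5    # inhibitor: produced by u, self-damping
DU, DV = 1.0, 10.0    # inhibitor diffuses much faster than activator

def growth_rate(k):
    """Largest real part of the eigenvalues of the linearized system
    at spatial wavenumber k (a 2x2 eigenvalue problem)."""
    a, b = FU - DU * k * k, FV
    c, d = GU, GV - DV * k * k
    tr, det = a + d, a * d - b * c
    disc = tr * tr - 4.0 * det
    if disc >= 0:
        return (tr + math.sqrt(disc)) / 2.0
    return tr / 2.0          # complex pair: real part is tr/2

uniform = growth_rate(0.0)   # k = 0: spatially uniform perturbations decay
fastest = max(growth_rate(k / 100.0) for k in range(1, 300))
# fastest > 0: some finite wavelength grows, so a pattern emerges
```

The design point is the one in the text: locally everything is stable and diffusion "should" smooth things out, yet the combination picks out a preferred wavelength and amplifies it.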
One way we've already talked about is sensitive dependence on initial conditions, or the butterfly effect, where we have a system that's deterministic but nevertheless impossible to predict over the long term because of sensitive dependence on initial conditions. In fact, it's almost as if the function is so deterministic, it depends so much on the initial condition, it obeys its rule so precisely over time, that this very precision is what makes it unpredictable: one would need to know the initial condition, the initial value, to an impossible precision in order to do long-term prediction. In some ways that's a negative result. It says that there are certain processes, predicting the weather perhaps, for which we'll never be able to do long-term prediction. But I also think it's a positive result, for a number of reasons. One, it's a positive result to know the limits of knowledge. It's certainly better to know our limits than to not know them. But also, there are certain types of chaotic systems that may not be predictable in one sense but may be predictable in an average sense. And we saw that in strange attractors, which are a really nice and vivid example of how order and disorder can coexist in the same system. On a strange attractor we have chaotic motion, sensitive dependence on initial conditions, but the structure or shape in phase space along which trajectories move, along which the butterfly effect exists, is a stable structure; it's an attractor. So if we have a lot of initial conditions far away, they'll get pulled into the attractor, and the ones on the attractor will move around chaotically, aperiodically, with sensitive dependence on initial conditions. For motions that are described by a strange attractor, we can't predict an individual trajectory, but we can make statistical predictions, precisely because the strange attractor, the overall shape that the trajectory moves along in phase space, is stable. That is predictable.
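The butterfly effect can be sketched in a few lines. Everything here is illustrative: the seed 0.3, the tolerance 0.1, and the two perturbation sizes. Shrinking the initial uncertainty by six orders of magnitude buys only a modest amount of extra prediction time, because the separation grows roughly exponentially.

```python
# Two orbits of the chaotic logistic map (r = 4) started a tiny distance
# apart: how long until they disagree by more than a fixed tolerance?
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

def diverge_time(x0, eps, tol=0.1, r=4.0):
    """First step at which two orbits started eps apart differ by more
    than tol (None if they stay close for 1000 steps)."""
    a, b = x0, x0 + eps
    for n in range(1, 1000):
        a, b = logistic(a, r), logistic(b, r)
        if abs(a - b) > tol:
            return n
    return None

t_close = diverge_time(0.3, 1e-6)    # initial error of one part in a million
t_closer = diverge_time(0.3, 1e-12)  # a million times more precise
# Vastly better knowledge of the initial condition only delays,
# and never prevents, the divergence.
```

This is the "impossible precision" point from the text in miniature: each extra digit of initial accuracy buys only a few more steps of prediction.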
So we can make very clear statements about how often the system will be in this region of phase space as opposed to that region, or whatever the situation may be. We can have a certain statistical, or long-term, global stability together with a local instability that gives rise to sensitive dependence on initial conditions. So it's an interesting mixture, this combination in one phenomenon of disorder and order. More generally, I think there's often a presumption in science, and maybe in modelling, that order and disorder are just completely different things: if you see something that behaves partly ordered and partly disordered, you'll have to understand the ordered part one way, with a nice set of equations, and for the disordered part you'll need to do statistics or model it as a stochastic process or something. That can be a fine approach, but this also says that order and disorder can exist in the same system; in a sense they can be just different sides of the same coin. So in a lot of different ways, I think dynamical systems challenges the idea that order and disorder are opposites, and it also challenges the idea that simplicity and complexity are opposites. We see lots of examples of complex behavior coming from simple equations. Another issue that comes up in dynamical systems, and elsewhere, but one we've talked about in this course and that's worth highlighting, is the different ways mathematical models are used. Very roughly speaking, one can imagine two approaches to making a model of a phenomenon. One is to try to get every detail correct and have a model that is as predictive as possible. That's a valuable approach, but it's also sometimes of interest to come up with a model that's deliberately simple: one that leaves out certain features, maybe features you don't care about, but preserves the features you're most interested in. In this sense, this type of model is more like a caricature.
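Here is a hedged illustration of a statistical prediction of exactly this kind (the seed, interval, and tolerances are illustrative). For the logistic map at r = 4 the long-run distribution of iterates is known in closed form, with density 1/(pi*sqrt(x*(1-x))), so the fraction of time an orbit spends in an interval is predictable even though the orbit itself is not.

```python
import math

def logistic(x):
    return 4.0 * x * (1.0 - x)

def time_fraction(x0, a, b, n=100_000, transient=100):
    """Fraction of n iterates landing in [a, b], after a short transient."""
    x = x0
    for _ in range(transient):
        x = logistic(x)
    hits = 0
    for _ in range(n):
        x = logistic(x)
        if a <= x <= b:
            hits += 1
    return hits / n

# Prediction from the invariant density 1/(pi*sqrt(x*(1-x))):
# the measure of [a, b] is (2/pi)*(asin(sqrt(b)) - asin(sqrt(a))).
predicted = (2.0 / math.pi) * (math.asin(math.sqrt(0.1)) - math.asin(0.0))
observed = time_fraction(0.34, 0.0, 0.1)   # illustrative seed 0.34
# observed tracks predicted (about 0.20) despite the chaos
```

So the individual trajectory is unpredictable, but the time it spends in any given region of the interval matches a clean analytic formula: local disorder, global statistical order.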
It's designed to capture some essential element of a system, not to reproduce every detail about that system. Often those types of models can lead to better understanding, or at least a different type of understanding, than the really large models that try to capture everything. These simple models can suggest that there may be simple common mechanisms behind similar features seen across a number of different phenomena. Maybe the best example of that is the idea of universality. The period-doubling route to chaos is universal in the sense that certain aspects of how it occurs, such as the ratios of the lengths of successive periodic regions, will be the same not just for mathematical systems but for physical systems as well. That is a stunning example of how a very simple mathematical model can give rise to not just qualitative understanding but numerical predictions about physical phenomena. Personally, I tend to think that universality will not be seen in complex systems like ecosystems or complex networks or economies in the same strong sense that we see it in dynamical systems. So I think that universality in the strict sense might be a little bit of an exception; we won't see it all the time. However, I do think it's the case that these simple models provide really valuable insight into the origins of complex behavior, and random behavior as well. It's not the only type of insight you'll want, but I think it's a really important realization. Lastly, there's the question of "What is chaos?", "What is dynamical systems?", and how to think about them in relation to the rest of science and the rest of math. Is it a revolution or a paradigm shift, the way quantum mechanics was? We discussed this some in the interview with Stephen Kellert, the philosopher of science from Hamline University. To me, at least, chaos and dynamical systems is not a revolution in the way that quantum mechanics was, for instance.
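The universality claim can be sketched numerically. The first two period-doubling parameter values of the logistic map are exact (r_1 = 3 and r_2 = 1 + sqrt(6)); the remaining values below are standard published approximations, not computed here. The ratios of successive bifurcation intervals approach the Feigenbaum constant, roughly 4.6692, and the remarkable point is that the same constant shows up in physical period-doubling experiments.

```python
import math

# Parameter values r_n at which the logistic map's attractor doubles
# from period 2^(n-1) to period 2^n. The first two are exact; the rest
# are standard approximate values from the literature.
r = [3.0, 1.0 + math.sqrt(6.0), 3.544090, 3.564407, 3.568759]

# Ratios of successive bifurcation intervals; these converge to the
# Feigenbaum constant delta ~= 4.6692, the same for a whole class of maps.
ratios = [(r[i] - r[i - 1]) / (r[i + 1] - r[i]) for i in range(1, len(r) - 1)]
# ratios: roughly 4.75, 4.66, 4.67, closing in on delta
```

Even with only the first few bifurcations, the ratios are already within about two percent of the limiting constant, which is the kind of quantitative prediction the text refers to.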
Quantum mechanics completely changed our picture of the universe and required not just accepting some new things but rejecting some other things. I don't think chaos quite reaches that level. It doesn't really say that Newtonian mechanics is wrong; it says that Newtonian mechanics, this rule-based idea of the universe, this mechanistic clockwork universe, is actually a lot more interesting. It turns out clocks can do all sorts of really cool things, not just be clocks. They can be chaotic and random and make patterns as well. So I don't think chaos and dynamical systems requires us to reject physical theories, or entire physical theories, but it does require us to rethink certain concepts and certain ideas. A nice description of this is in a history of science paper by David Aubin and Amy Dahan Dalmédico. They argue that rather than thinking about chaos as a revolution, a sudden change, we should take a longue durée view of history: a long arc, many different streams of thought flowing together. What emerges in this area of chaos and dynamical systems they call a vast socio-disciplinary reconfiguration. It changes the conceptual landscape. Disciplines and trains of thought flow together and come up with, perhaps, a common language or common framework that reconfigures certain categories, like order and disorder, as we talked about before: realizing those don't have to be opposites, realizing you can get aperiodic behavior from a repetitious iterated function, and the like. So it reconfigured some of those ideas, and disciplines have in a sense flowed together so that there is a common language, at least in part, across biology, economics, math, and physics. These notions of strange attractors, phase space, Lyapunov exponents, the butterfly effect, and sensitive dependence on initial conditions are all part of this new framework.
It's a way of communicating ideas and it's a way of doing certain types of mathematical modelling and maybe it's a way of looking at the world. Hopefully, through this course, you're now part of this convergence of ideas in chaos and dynamical systems as well.