Let's now look at two prototypical examples of applications of Markov chains. What I've drawn here is something called a directed graph. In a directed graph we have nodes; in this case there are 3 nodes, or possible outcomes. What we're describing with this graph are transition probabilities between various outcomes for an economy. You'll notice, in a directed graph, there are arrows pointing from one state, or one outcome, to another. For instance, we have the outcomes of our economy experiencing normal growth, mild recession, or severe recession. Between these outcomes we have arrows, each directed toward another outcome, and attached to each of those arrows is a weight, or a probability measure. For example, suppose I want to determine, based on this analysis, the probability that if our economy is in a normal growth state it will maintain normal growth. That probability is quite high: 0.971, or 97.1%. On the other hand, the probability that we transition from normal growth in any given year to a mild recession is 0.029, or 2.9%, and so forth. There aren't arrows between every pair of nodes; where an arrow is missing, the transition probability is zero. Because a mild recession represents something of an intermediate outcome, there is no direct arrow between normal growth and severe recession, or vice versa. That probability is zero: first we have to devolve into a mild recession before we hit a severe recession. These numbers are actually based on recent statistics related to the U.S.
economy. They weren't totally plucked out of thin air, in other words. Based on this directed graph, I would like to define the stochastic matrix P for this Markov chain. Then, given some initial state of our economy, we can make a prediction; we can, in other words, forecast our economic future. Here we have the associated stochastic matrix; we call it P. It once again reflects the transition probabilities shown in our directed graph above. We can interpret the numbers in column 1 as the transition probabilities for our economy if we're currently in a normal growth state. The probability, in other words, when we're in normal growth, of maintaining normal growth is 0.971, if you look at the directed graph, and we see that number here. On the other hand, the transition probability from normal growth to mild recession, normal growth in this column, mild recession in the second row, is 0.029; we're kind of moving around the graph. Say we're in severe recession: what's the transition probability to remain in severe recession? That probability is 0.492, which corresponds to the number in the lower right. In other words, if we stay in severe recession, we have 0.492. As I mentioned before, there are a couple of zero probabilities for going from one extreme case to the other. So if we're in severe recession, the probability of then going into a normal growth state is zero; there is no arrow from the severe recession node to normal growth.
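As a sketch, here is what the column-stochastic matrix P could look like in NumPy. Note an assumption: only the first and third columns (normal growth and severe recession) are determined by the values quoted above; the middle column (mild recession) is never read out, so its entries below are purely illustrative placeholders chosen to sum to 1.

```python
import numpy as np

# Column-stochastic matrix P: column j holds the probabilities of
# transitioning OUT of state j. States, in order:
# normal growth, mild recession, severe recession.
#
# Columns 1 and 3 use the values quoted in the text; the MIDDLE column
# is an illustrative placeholder (not given in the text), chosen so the
# column sums to 1.
P = np.array([
    [0.971, 0.145, 0.000],  # -> normal growth
    [0.029, 0.778, 0.508],  # -> mild recession
    [0.000, 0.077, 0.492],  # -> severe recession
])

# Sanity checks: every column of a stochastic matrix sums to 1, and the
# two "extreme jump" probabilities (growth <-> severe) are zero.
assert np.allclose(P.sum(axis=0), 1.0)
assert P[2, 0] == 0.0 and P[0, 2] == 0.0
```

The column convention matters: with probability vectors written as columns, one step of the chain is a left-multiplication by P.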
Now that we have our stochastic matrix for this particular economic model, let's go ahead and use it in the context of a Markov chain to make some predictions. Let's calculate a Markov chain for this example of economic forecasting, again recalling that here is our stochastic matrix P representing the various transition probabilities. In the first computation here I'm just going to be very optimistic in terms of where we start. So x_0, our initial state vector, I'll make (1, 0, 0), where 1 = 100%: we're "all in" for normal growth, and the 0, 0 are for the mild and severe recession states. We start off with a best-case scenario; what does this Markov chain predict for future economic outcomes? Remember, by the "Markov condition," as we called it, each subsequent state in our chain depends only on the previous state. The way we transition from state to state is that we multiply by the matrix P. Let's do that computation and find a predicted probability vector for year 1. We multiply our initial state vector (1, 0, 0) by the stochastic matrix P on the left; that gives us our state vector for the following year, call it year one: x_1 = P x_0. Notice, though, that this calculation is quite trivial: I'm essentially doing dot products when I multiply matrices together, dotting each row with (1, 0, 0). So really I just pluck out and retain the first column of my stochastic matrix. How do we interpret that state vector? Well, after our first year the outlook is as follows: we have a 97.1% chance of once again being in normal growth mode, a 2.9% chance of being in a mild recession, and a zero percent chance of severe recession. That's all well and good, but I'd really like, at this point, to peer deeper into my crystal ball.
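The year-one computation above is a single matrix-vector product; multiplying P by (1, 0, 0) just extracts P's first column. A minimal sketch, reusing the same P (whose middle column is an illustrative placeholder, since the text doesn't give it):

```python
import numpy as np

# Same P as before: columns 1 and 3 from the text; the middle column is
# an illustrative placeholder.
P = np.array([
    [0.971, 0.145, 0.000],
    [0.029, 0.778, 0.508],
    [0.000, 0.077, 0.492],
])

x0 = np.array([1.0, 0.0, 0.0])  # optimistic start: 100% normal growth

x1 = P @ x0  # state vector after year one
# Multiplying by (1, 0, 0) simply plucks out the first column of P:
assert np.allclose(x1, P[:, 0])
print(x1)  # [0.971 0.029 0.   ] -> 97.1% growth, 2.9% mild, 0% severe
```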
In other words, I'd like to make a projection about a future outcome, let's say ten to twenty years down the road, using this particular model. The way I do that with the Markov chain is to iterate the process of matrix multiplication. To get, let's say, state vector two after year two, I multiply x_1 by P. You'll notice that with each subsequent computation, to get the next state vector in my chain or sequence, I just multiply by P again. So a nice shortcut is this: if I want to know, for instance, the k-th state vector, that is, my economic outlook after k years, all I do is exponentiate, in other words take a power of my stochastic matrix, the k-th power in particular, and multiply it by the initial state vector: x_k = P^k x_0. Then I get my prediction, the forecast, for year k. Once again we see a really nice application of taking powers of a matrix. Now we can efficiently make predictions about future outcomes using this particular mathematical model. For instance, what is the predicted status ten years down the road? What is our economic health and well-being going to look like?
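That shortcut, x_k = P^k x_0, is a one-liner with `numpy.linalg.matrix_power`. The sketch below (same P, with its placeholder middle column) checks that the shortcut agrees with iterating the chain one year at a time:

```python
import numpy as np
from numpy.linalg import matrix_power

P = np.array([
    [0.971, 0.145, 0.000],  # middle column: illustrative placeholder
    [0.029, 0.778, 0.508],
    [0.000, 0.077, 0.492],
])
x0 = np.array([1.0, 0.0, 0.0])

k = 10
# Shortcut: x_k = P^k x_0 ...
xk_power = matrix_power(P, k) @ x0

# ... which must agree with iterating the chain one year at a time.
xk_iter = x0.copy()
for _ in range(k):
    xk_iter = P @ xk_iter

assert np.allclose(xk_power, xk_iter)
print(xk_power)  # predicted probability vector after 10 years
```

Note that P^k is itself a stochastic matrix, so x_k remains a probability vector: its entries stay nonnegative and sum to 1.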
I take my stochastic matrix, raise it to the tenth power, and then multiply by the initial state vector (1, 0, 0) again. That is the predicted outcome: at that point we've got an 85% chance of being in normal growth. Notice, creeping up here, the increased probabilities of the two recessions, respectively. And then, one more time, if I wanted to predict further on down the road, 20 years let's say, I raise my stochastic matrix to the 20th power, multiply again by the initial state vector, and I get the following probabilistic outcomes. So if you're curious, you might wonder: what if we had started with a different initial state vector, a different x_0 in other words? This was a very optimistic outlook indeed; can we be a little more cynical and see what the model then conjures up for us? Well, sure. For instance, suppose, and maybe it's a fair assumption honestly in this day and age, that I have a uniform probability vector to begin with. In other words, all outcomes are equally likely in my initial state vector; if times are not so good economically, we say 1/3, 1/3, 1/3 for those components. Then what happens when I turn the crank with the Markov chain and forecast 20 years down the road? In other words, I raise my stochastic matrix to the 20th power, multiplying by this different initial state vector, and I get the following outcome, which isn't drastically different, but you'll notice is a little bit more dire indeed. OK, so let's just summarize our findings between those two cases. In one case we had an initial state vector representing a cheerily optimistic economic outlook, and we used our Markov chain to make a prediction 20 years down the road. What is our state of affairs then?
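The two 20-year forecasts, optimistic start versus uniform start, use the same machinery. A sketch (again with the placeholder middle column, so the numbers it prints are illustrative rather than the ones quoted above):

```python
import numpy as np
from numpy.linalg import matrix_power

P = np.array([
    [0.971, 0.145, 0.000],  # middle column: illustrative placeholder
    [0.029, 0.778, 0.508],
    [0.000, 0.077, 0.492],
])

P20 = matrix_power(P, 20)

x20_optimistic = P20 @ np.array([1.0, 0.0, 0.0])  # "all in" on growth
x20_uniform = P20 @ np.full(3, 1.0 / 3.0)         # all outcomes equally likely

# Both results are still probability vectors, and after 20 years the two
# forecasts are already quite close to each other, with the uniform start
# leaving slightly less probability on normal growth (a bit more dire).
assert np.isclose(x20_optimistic.sum(), 1.0)
assert np.isclose(x20_uniform.sum(), 1.0)
print(x20_optimistic)
print(x20_uniform)
```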
On the other hand, in the second case, we started with a vastly different initial state vector, a uniform probability vector, and once again we used our same Markov chain to make a prediction about the outlook 20 years down the road. Now, even though our initial vectors were quite different, the final state vectors predicted by the model in the two cases were not entirely dissimilar. A good question to ask at this juncture is whether or not my Markov chain depends heavily on my initial state vector. To what extent does tweaking that vector change the outcome of our Markov chain in the long run? We're going to answer that question momentarily. And a closely related conceptual question, which is of ultimate importance for Markov chains, is the following: if I project way into the future in my sequence of state vectors defined by this Markov chain, does that sequence ever settle down? To put that in mathematical terms: do my vectors converge to a particular value? If they do, by the way, that vector is called a steady-state vector for the Markov chain. It means that my dynamic process has converged to a particular value. In other words: it's settled down!
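We can probe that convergence question numerically: iterate the chain a long way from both starting vectors and watch the two sequences approach the same limit, a steady-state vector, which satisfies P x = x (an eigenvector of P with eigenvalue 1). Same caveat as before: P's middle column here is an illustrative placeholder.

```python
import numpy as np

P = np.array([
    [0.971, 0.145, 0.000],  # middle column: illustrative placeholder
    [0.029, 0.778, 0.508],
    [0.000, 0.077, 0.492],
])

# Run the chain a long way from both starting vectors.
x = np.array([1.0, 0.0, 0.0])  # optimistic start
y = np.full(3, 1.0 / 3.0)      # uniform start
for _ in range(500):
    x, y = P @ x, P @ y

# Both sequences settle down to the same steady-state vector...
assert np.allclose(x, y, atol=1e-10)

# ...which is a probability vector fixed by P: the steady state P x = x.
assert np.allclose(P @ x, x, atol=1e-10)
assert np.isclose(x.sum(), 1.0)
print(x)
```

For this particular (placeholder) P the chain is regular, so the steady state exists, is unique, and is reached from any initial probability vector; that independence from x_0 is exactly what the two 20-year forecasts were hinting at.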