Hi. Welcome back. In this lecture, I want to flesh out a few more of the particulars of the concept of path dependence. I want to relate it to an earlier notion, the Markov process, which wasn't in any way path dependent; remember, we got a unique equilibrium in that case. I also want to relate it to chaos. And then, finally, I'd like to flesh out a little bit more why the distinction between path dependence and phat dependence is so important outside the context of the urn model. So I want to talk about it in a real-world setting, as opposed to within that simple urn model. Okay, so let's get started. Remember, when we talk about path dependence, what we're talking about is the sequence of previous events influencing not only the outcome in this period, but possibly the long-run equilibrium. So our definition of path dependence is that the outcome probabilities depend on the sequence of past outcomes. In the case of path-dependent outcomes, even the outcome in a given period depends on the path. In the case of a path-dependent equilibrium, we're saying that the long-run equilibrium depends on the path of past outcomes. Now, remember when we studied Markov processes in the previous lecture: in a Markov process we always got a unique equilibrium. And in the Markov process we made the following assumptions. We said there's a finite set of states. We said there are fixed transition probabilities between those states, that you can get from any one state to any other state, and that it isn't a simple cycle, so it doesn't just go A, B, C, A, B, C, A, B, C. And then, given those assumptions, we get something called the Markov convergence theorem, which says that given A1 through A4, a Markov process converges to an equilibrium that's unique. Now remember, that was a stochastic equilibrium, so it was still moving between the states. It was still churning, but it was a unique equilibrium. So why aren't Markov processes path dependent? Well, here's why.
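The Markov convergence theorem is easy to see numerically. Here's a minimal sketch, assuming a hypothetical two-state chain with made-up fixed transition probabilities (the specific numbers are mine, not from the lecture): no matter which state you start in, repeatedly applying the same transition matrix lands you at the same unique stochastic equilibrium.

```python
# A hypothetical two-state Markov chain with FIXED transition probabilities.
# Rows are the current state, columns are the next state.
P = [[0.9, 0.1],
     [0.3, 0.7]]

def step(dist):
    """Apply the fixed transition matrix once to a probability distribution."""
    return [sum(dist[i] * P[i][j] for i in range(2)) for j in range(2)]

def distribution_after(start, steps):
    """Iterate the chain `steps` times from a starting distribution."""
    d = list(start)
    for _ in range(steps):
        d = step(d)
    return d

# Two completely different starting points converge to the same equilibrium:
print(distribution_after([1, 0], 100))  # starting surely in state 0
print(distribution_after([0, 1], 100))  # starting surely in state 1
```

For this particular matrix both runs approach the stationary distribution [0.75, 0.25]; history washes out precisely because the probabilities never change.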
This assumption right here: fixed transition probabilities. Remember, in our urn model, we've got this urn with red and blue balls in it, and as we pick more red balls, we start adding more red balls. So the transition probabilities change. It's those fixed transition probabilities that are really underpinning why the Markov process goes to a unique equilibrium; even though it's a stochastic equilibrium, it goes to that equilibrium because we're not changing the probabilities. The history of events doesn't change the probabilities. So now, by comparing these two models, we see when history matters: history matters when it changes the transition probabilities. There are two ways of seeing the effects of history on outcomes. One is through the Markov process, by showing that history doesn't matter if the probabilities don't change. The other is through the urn models, by showing that history does matter if the outcomes change the probabilities. What I want to do next is relate this to chaos. To relate it to chaos, I've got to begin by describing recursive functions. Recursive functions were implicit in our Markov model and in our urn model, but let me make them more formal. In a recursive function, you've got an outcome at time t, and there's an outcome function, a map, and that map feeds back into itself. So it's a process that keeps moving on and on: you've got an outcome, then another outcome, and another outcome, and another outcome. One simple example would be Xt+1 = Xt + 2, which just goes one, three, five, seven, nine, eleven. In the urn models we had a recursive process where we picked out balls: X could be either blue or red, and we picked blue, red, blue, red. But what we got in each period depended on what we picked previously, and in some cases on the whole set.
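Here's a small sketch of that urn model, the standard Polya process, to make the contrast with the Markov chain concrete. The starting counts (one red, one blue) and the "add one matching ball" rule are the usual Polya setup, which I'm assuming here; the point is that the draw probabilities change with history.

```python
import random

def polya_urn(periods, seed=None):
    """Polya process: start with 1 red and 1 blue ball.  Each period, draw a
    ball at random and add another ball of the same color.  Unlike a Markov
    chain, the transition probabilities change as history unfolds."""
    rng = random.Random(seed)
    red, blue = 1, 1
    for _ in range(periods):
        if rng.random() < red / (red + blue):
            red += 1   # picked red, so add another red ball
        else:
            blue += 1  # picked blue, so add another blue ball
    return red / (red + blue)

# Different runs settle near very different long-run shares of red:
print([round(polya_urn(10_000, seed=s), 2) for s in range(5)])
```

Each run converges to some share of red balls, but which share depends on the path of early draws, which is exactly what the fixed-probability Markov chain rules out.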
So in some cases, what you get next might only depend on the previous value, or it could depend on what happened in period one, what happened in period two, and what happened in period three. So you could have that X4 is a function of periods one, two, and three. That would be a path-dependent process. In a simple recursive function, what happens in this period might only depend on the previous period: Xt+1 might only depend on Xt. We can use these recursive functions to describe processes that are chaotic. When we talk about chaos, what we mean is extreme sensitivity to initial conditions. What that means is, if I start with two points that are very, very close to one another, and I keep applying the recursive function, the paths go in very different directions. Two points that start near each other end up a long way away. So let's see an example of that. This is called the tent map. Let X be in the interval (0, 1); the round brackets mean that I don't include zero and I don't include one. The function is defined as follows: F(X) = 2X if X is less than one half, and F(X) = 2 - 2X if X is greater than one half. So here's zero, here's one, here's one half. At X = 1/2 the first branch gives 2 times 1/2, which is one, and the second branch gives 2 - 1, which is also one, so the graph rises to one and comes back down. It looks like a tent. Hence the tent map. Here's the way it works. If I start out at .21 and apply it, I get .42. If I take .42 and apply it, I get .84. Then if I take .84 and apply it, I use the second branch: 2 - 2 times .84, which is 2 - 1.68, which is .32. Then from .32 I get .64. So that's how the tent map works.
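The walkthrough above can be checked in a couple of lines. This is just the tent map as defined in the lecture, iterated from the starting point .21:

```python
def tent(x):
    """Tent map on (0, 1): doubles x below 1/2, reflects it above 1/2."""
    return 2 * x if x < 0.5 else 2 - 2 * x

# The lecture's walkthrough: .21 -> .42 -> .84 -> .32 -> .64
x = 0.21
for _ in range(4):
    x = tent(x)
    print(round(x, 2))
```

Running this prints .42, .84, .32, .64, matching the arithmetic step by step.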
I just recursively go through the function. Here's an example of the tent map where I start with two points that are very similar to each other: one is .4321, the other is .4322. Again, first I double each, then I apply 2 - 2X, then I double again because the value is less than a half, then I apply 2 - 2X, and so on. And notice that after just a few periods, these two points are a long way away from each other. So the tent map ends up being chaotic, because there's extreme sensitivity to initial conditions: just by being a teeny bit different in the fourth decimal place, you end up a long way away after only eleven iterations of the function. Now you can see this graphically as follows. Originally you can't even see the blue line, because it's hidden behind the red line; these are the same two starting points I just put in. And over time, the two paths end up being very different. This is extreme sensitivity to initial conditions. Notice, though, that this is not path dependence. Why is this not path dependence? Well, let's go back. The tent map is just a fixed recursive function. Once I choose my initial point, once I choose my .4321 or my .4322, I know exactly what's going to happen. So this is extremely sensitive to initial conditions, but it's not path dependent, because all that matters is the initial point. Now, if you define the path as being the initial point, then yes, in that sense it's path dependent, but nothing that happens along the way has any effect on what's going to happen in the long run, because once we choose the initial point we've just got a fixed function, and we know what's going to happen. So chaos, in its standard form, means extreme sensitivity to initial conditions. The initial point matters, and if I apply the function over and over, tiny differences in the initial point lead to paths that vary by a lot later on. Path dependence means that what happens along the way influences the outcome.
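The divergence of those two starting points can be sketched directly. This is self-contained, with the same tent map from the lecture and the same pair of initial points:

```python
def tent(x):
    """Tent map on (0, 1), as defined in the lecture."""
    return 2 * x if x < 0.5 else 2 - 2 * x

# Two starting points that differ only in the fourth decimal place:
a, b = 0.4321, 0.4322
for _ in range(11):
    a, b = tent(a), tent(b)

# After eleven iterations the gap has grown from 0.0001 to roughly 0.09,
# nearly a thousandfold increase: extreme sensitivity to initial conditions.
print(a, b, abs(a - b))
```

No randomness is involved anywhere; each trajectory is completely determined by its initial point, which is exactly why this is chaos rather than path dependence.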
So it's typically not a deterministic process, because what happens along the way has an impact on the outcome. So let's step way back for a second. We think of a process as being independent if outcomes don't in any way depend on the past history of outcomes. We can think of a process as depending on initial conditions if the outcome or state in a later period depends only on the initial state; that's completely deterministic. So independence is a probabilistic concept: there's, say, a 50 percent chance of getting a red or a blue ball each period, no matter what came before. With chaos, extreme sensitivity to initial conditions, we're saying it's deterministic: we know what's going to happen once we pick the initial point, and all that matters is that initial point. Path dependence means that the outcome probabilities, what happens in the long run, depend on what happens along the way. And finally we have phat dependence, which means that the outcome probabilities don't depend on the order in which things happened; they depend only on the set of things that happened. So in our urn model, if we've got 24 red balls and six blue balls sitting in the urn, it doesn't matter what order they appeared in; all that matters is how many there are. So that's the difference between path dependent and phat dependent. Now, when historians or institutional scholars think about path dependence, they often think in terms of the sequence of events mattering, not just the set of things mattering. They also think that things aren't independent, and they think that although initial conditions matter, they're not the only thing that matters. So they don't think that once we write the Constitution, what then plays out is completely deterministic. So they tend to side with things being path dependent. Why? Why do they think path dependence and nothing else?
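The phat-dependence point about the urn can be made precise with a tiny calculation. A sketch, again assuming the standard Polya setup of one red and one blue ball to start: the probability of drawing red next depends only on the counts in the urn, so any reordering of the same past draws gives the same answer.

```python
from fractions import Fraction

def prob_red_next(history):
    """Chance of drawing red next in a Polya urn (start: 1 red, 1 blue;
    add a matching ball after each draw), given a history of past draws."""
    red, blue = 1, 1
    for draw in history:
        if draw == 'R':
            red += 1
        else:
            blue += 1
    return Fraction(red, red + blue)

# Same SET of past draws, three different ORDERS -- identical probability:
print(prob_red_next('RRB'))  # urn now holds 3 red, 2 blue -> 3/5
print(prob_red_next('BRR'))  # 3/5 again
print(prob_red_next('RBR'))  # 3/5 again
```

That is phat dependence: the set of past outcomes matters, but the sequence does not.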
Well, independence means history has no structure at all, so that doesn't make much sense for social processes. Extreme sensitivity to initial conditions, a deterministic process, doesn't make much sense either, because that would mean fate is completely predetermined by a few initial choices. So it really comes down to path versus phat. Path says the sequence matters; phat says the set matters. Why do they think it's the path and not just the set? That is, I think, a deep question, and these nice urn models have helped us think about it. Well, one reason they believe this is that early events have larger importance. Let's think about some events. Suppose that this is what American history looked like: in 1814 we gave women the right to vote, and then in 1823 we had a civil war to get rid of slavery. In 1880 we finished the transcontinental railroad across the United States. In 1894 we found gold in California. In 1923 we decided to buy the Midwest from France; so we'd previously put the transcontinental railroad through, and we'd had to negotiate with France to put it through what later became the Louisiana Purchase. And then in 1967 we had a brutal war with England for independence. It's hard to imagine, if this had been the sequence of events, that I'd be sitting here giving you this course right now. American society would probably look extremely different than it does now. In particular, I might be speaking French, because I'd be in what was formerly the Louisiana Purchase, and here in Michigan we'd probably look a lot more like Quebec than we look now. Okay, so: when we think about these ideas, path dependence, phat dependence, independence, sensitivity to initial conditions or chaos, what we see is that these simple models help us organize our thinking about what the world might look like. And we understand why a lot of historians focus so much on path dependence: because it seems the most reasonable.
We also see why people who study gambling in casinos consider independence, and we see why a lot of physicists are interested in things like chaos, because there are actual physical recursive processes that produce this extreme sensitivity to initial conditions. So these simple models help us make sense of a lot of concepts that are fairly closely related logically, and the urn model in particular helps us draw bright lines between path dependence, phat dependence, and independence. Okay. Thank you very much.