So, now we have described the merge sort algorithm, and we illustrated it on this list. We took the list and kept dividing it until we got to the leaves. For each leaf there was nothing to do; a single item is already sorted. Then we go back up and do the merges. When I began describing the algorithm, I said we were going to get an O(n log n) algorithm. So what is the efficiency? Why is the time complexity of this algorithm O(n log n)? This is what we need to analyse now. Notice that this algorithm is different from the algorithms we have seen so far: it is a recursive algorithm. I say, I want to sort this list. I don't know how to do that directly, so I divide it into two sub-lists and I sort them. So, to sort the big list, divide it into two halves, then sort the two halves. But I am using the same term again: to sort the big list, sort the two halves. And to sort each half, again, sort its two halves. This is what we call recursion: I am calling the same function multiple times, but every time I call it, I call it on a smaller instance of the problem. So merge sort on this list does merge sort on each half; merge sort on each half does merge sort on each quarter; and so on. The function merge sort is going to be called once on the whole list, once on each half, once on each quarter, and again on each of those. It is going to be called many times.
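The recursion described above can be sketched in a few lines of Python. This is my own minimal sketch, not code from the lecture; I use the standard-library `heapq.merge` as a stand-in for the merge step, which the lecture describes separately.

```python
import heapq

def merge_sort(L):
    """Sort a list by recursively sorting its two halves and merging them."""
    # Base case: a list of one item (or none) is already sorted.
    if len(L) <= 1:
        return L
    mid = len(L) // 2
    left = merge_sort(L[:mid])    # same function, on the first half
    right = merge_sort(L[mid:])   # same function, on the second half
    # Merge the two sorted halves into one sorted list.
    return list(heapq.merge(left, right))

print(merge_sort([5, 2, 8, 1, 9, 3]))  # → [1, 2, 3, 5, 8, 9]
```

Notice that `merge_sort` calls itself, but always on a list half the size, until it reaches the single-item base case.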
At the end, when merge sort is called on a list of one item, it looks at the list, sees it has one item, and is done; it is already sorted. And then the merge comes into the picture, and it starts doing the merging. So no more sorting now, it is just merging. Right? So what is the running time? Since this is a recursive algorithm, the technique for analysing its running time is a bit different now. Say we are doing merge sort on a list L of n items, from position 0 to n - 1. Merge sort is going to do three things. It is going to do merge sort on the first half of L. Then it is going to do merge sort on the second half of L, up to position n - 1. And then it is going to merge these two. Once I call merge sort on the first half, I get it sorted; once I call merge sort on the second half, I get it sorted; then I do the merge. So to do merge sort on a list of size n, I do merge sort on a list of size n/2, merge sort on another list of size n/2, and then I merge the two. How long does the merge take? Suppose I have two sorted lists, each of size n/2. Remember, these were a and b, and we copied them into c, which is of size n. The merge looks at the two front items first, compares them, copies the smaller one, then compares the next pair, copies, and so on. The list c is going to end up with n items: when we began it was empty, and when the merge ended there were n items in it. So n items are copied, and we know the merge does at least a linear amount of work. In fact, it does O(n) amount of work in total, because for each copied item I do just one comparison.
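The merge step described above, copying from two sorted lists a and b into c, can be sketched as follows. This is an illustrative sketch with my own variable names; the point is that each item copied into c costs exactly one comparison, which is why the merge is O(n).

```python
def merge(a, b):
    """Merge two sorted lists a and b into one sorted list c."""
    c = []
    i = j = 0
    # Compare the front items of a and b, copy the smaller into c.
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:       # one comparison per item copied
            c.append(a[i]); i += 1
        else:
            c.append(b[j]); j += 1
    # One list ran out: copy the remainder of the other (no comparisons).
    c.extend(a[i:])
    c.extend(b[j:])
    return c

print(merge([1, 4, 7], [2, 3, 9]))  # → [1, 2, 3, 4, 7, 9]
```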
So, for each element here, I am doing a constant amount of work. Merging these two lists into this one is going to take O(n). So I know that this merge takes O(n) amount of work. The question is, how long does the entire function take? Let me call the running time of my merge sort T(n): this is the number of steps merge sort takes on a list of size n. So what is T(n)? I know T(n) equals O(n) for the merge, plus something, because besides the merge we are doing two other things: merge sort on the first half and merge sort on the second half. So it is O(n) for merging the two halves, but how long does it take to do each merge sort? Probably your first impression was: let's repeat the same exercise, ask how long this half takes, and this half, and then each quarter, and keep adding up these terms. But for a recursive algorithm there is a much simpler and more elegant technique for analysing this. I know that the running time of my sort is T(n). I don't know what T(n) is yet, but I am calling it T(n). Now, if it takes T(n) to sort a list of size n, look at the recursive call: it is exactly the same function, not modified at all, but applied to a list that is half the original size. It is the same function applied to a different instance of the problem: the first was applied to the big list, this one is applied to a half. So if the whole sort takes T(n), this call takes T(n/2). And it is the same story for the other half: T(n/2). So what we get is T(n) = 2 T(n/2) + O(n). This is the running time of merge sort. Now, when I write something like this...
This is probably the first time you have seen something like this. It doesn't look like anything we have seen so far. So far we have been talking about running times like O(n^2), O(2^n), or O(n!). But what is this all of a sudden? How does this function grow? Is this good? Is this bad? And I said in the beginning that it is going to be O(n log n), but you don't see n log n here, right? What we have here, this equation T(n) = 2 T(n/2) + O(n), is called a recurrence. Recurrences arise when we analyse the running time of recursive algorithms. Because when we describe the running time T(n) of a recursive algorithm, the algorithm is calling the same function, itself, so we will see the same function T, but on different parameters. So this is a recurrence, and when we deal with recursive algorithms, we are going to see these recurrences arising to describe the running time. Now, a recurrence does not directly give us the sort of function we are used to seeing, one that gives us an idea of how good or bad the algorithm is. But there is a technique for taking a recurrence and rewriting it as an explicit formula. I can take this one and show that it in fact evaluates to O(n log n). However, before we are able to get the explicit formula for T(n), we also have to give a rule for the base case. Because look at this: T(n) is 2 T(n/2) + O(n). Fine, but what is T(n/2)? It is 2 T(n/4) + O(n/2). And I can keep repeating this exercise and generating all these copies. But where do I stop? I stop exactly where the algorithm stops. How did we define merge sort?
We defined merge sort to stop when it gets to a list of size one. When it gets a list of size one, all it does is return, because there is no more work to do. So this is why we write a base case here, and I have to define it. In our case it is for a list of size one: on a list of size one, this algorithm does a constant amount of work, which is O(1). All right. So when we write our recurrence, we have to write the recurrence formula, and we have to write the formula for the base case. The base case should not itself be a recurrence; it should be an explicit formula for how long the algorithm takes when it reaches the simplest instance of the problem. So for merge sort this is really what we get: T(n) = 2 T(n/2) + O(n), with T(1) = O(1), is the running time of merge sort in terms of a recurrence. The question remains how this function grows and why it is O(n log n), and this is what we are going to talk about next.
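The recurrence with its base case can be checked numerically. In this sketch of mine I replace the O(n) term with exactly n and the O(1) base case with exactly 1, and unroll the recurrence for powers of two; the values track n·log₂(n) closely, which previews the O(n log n) result discussed next.

```python
import math

def T(n):
    """Unroll the merge sort recurrence T(n) = 2*T(n/2) + n, T(1) = 1."""
    if n <= 1:
        return 1                 # base case: constant work on one item
    return 2 * T(n // 2) + n     # two half-size sorts plus linear merge

# For n a power of two this works out to exactly n*log2(n) + n.
for n in [2, 16, 1024]:
    print(n, T(n), n * math.log2(n) + n)
```

Trying a few values: T(2) = 4, T(16) = 80, T(1024) = 11264, each equal to n·log₂(n) + n, so the growth is indeed proportional to n log n.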