What we have focused on so far, in both our introductory course and this course, is solving computational problems with MATLAB. As long as we got the desired results, we were happy. We haven't really paid much attention to the performance of our solutions. There were a few exceptions: for example, the recursive plotting of the Sierpinski triangle did get very slow after a certain depth, and we tweaked our program to make it run faster. Also, in the introductory course, we looked at preallocation to speed up certain functions that worked with large matrices. If you don't remember, don't worry, we'll refresh your memory later in this lesson.

In both of these cases, we wanted to make our program run faster. But these examples involved minor changes to our MATLAB program, as opposed to fundamental changes in our approach to the problem. We still used the same underlying algorithm; we just changed the implementation somewhat to make the program run faster. Don't get me wrong, these are important concepts that you need to know about, and we'll learn other such techniques later in this lesson. But in this first lecture of the lesson, we'll focus on something more fundamental called algorithmic complexity. It's actually an entire branch of computer science that deals with analyzing the performance of algorithms. It tries to answer questions such as: how much time does this algorithm take? How much memory does it need?

As you might have noticed, these questions are a bit vague. How much time does it take? Well, that depends on the computer, and it depends on the inputs. It depends on a lot of things, but what matters most is the algorithm itself. The same thing holds for the memory it needs. Don't worry, we'll make these questions more precise in short order, but first, let's start with a simple example to illustrate why the algorithm chosen to solve a problem really matters. Let's revisit our favorite mathematical series from high school, the Fibonacci series.
Yeah, not your favorite? I'm sorry about that, but this isn't a call-in show. We don't take requests. Like I said, the Fibonacci series starts out with two ones, and from then on, every element is the sum of the two previous elements. 1 plus 1 gives 2, so now we've got one, one, and two. 1 plus 2 gives 3, now we've got one, one, two, three. 2 plus 3 gives 5, 3 plus 5 gives 8, 5 plus 8 is 13, and on and on we go. The first element is one, the second element is one, the third element is two, then three, five, eight, and 13 is the seventh element. The discovery of this series is shrouded in mystery, but we do know that the name of the person who first discovered it was not Fibonacci. In fact, even Fibonacci was not Fibonacci, but I digress; you should look it up. Let's say we're asked to write a function that computes the nth element of the series. If we ask for element seven, our function should respond with 13. We don't have to worry about returning the entire series, just one element. With our newfound expertise in recursive functions, we might feel that we can write an elegant solution to this problem in a recursive fashion. Here is what we might come up with. We're not going to worry about making sure that n is a positive integer. We know how to do it, but it's not the focus here. This is indeed simple and elegant. If n is one or two, we return one. Otherwise, we recursively calculate the previous two elements one by one and add them up. Let's test it on some inputs. I'll start with one through six. Let's try, I don't know, 10, 20, 30. That's working just fine, so you may be wondering why we're even talking about it. Well, let's try a few more numbers, shall we? Our function seems to be slowing down here. I wonder how long 50 will take. Okay, that's why we're talking about it. We're talking about it because it's slow. I'm going to stop it with Ctrl-C. Whoa. This stack trace shows what function calls were active when we stopped the execution.
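The code itself is shown on screen rather than in the transcript, but from the description, that "simple and elegant" recursive function might look like this (a minimal reconstruction; the name fibo matches the calls quoted later, though the exact on-screen layout may differ):

```matlab
function f = fibo(n)
% FIBO  Return the nth element of the Fibonacci series.
% Naive recursive version; assumes n is a positive integer.
if n == 1 || n == 2
    f = 1;                        % the series starts with two ones
else
    f = fibo(n-2) + fibo(n-1);    % sum of the two previous elements
end
end
```

Called as fibo(7), this returns 13. It gives the right answers, but as we're about to see, its running time explodes as n grows.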
It seems to indicate that the program was busy doing the recursive calls. It shouldn't be taking that long, so what are we missing? Let's think about that. When we call the function with n equals 50 and get to line five, we call fibo with 48, and once that's done, we call fibo with 49, and we add the two results together. But think about that: fibo 49 will itself call fibo 47 and fibo 48. So we're already calling fibo 48 twice. That's not good, but it doesn't seem terrible. However, each call to fibo 48 will call fibo 46 and fibo 47, and since we call fibo 48 twice, and fibo 49 also calls fibo 47, that means fibo 47 is being called three times. That seems to indicate a trend. Let's instrument our function with a persistent variable to count how many recursive calls we make in total. Here it is: fibo_count. A quick refresher on persistent variables: they retain their values across function calls even though they're local. When the function is first called, their value is the empty array. That's why we have isempty here, and an if statement to initialize the count to one. A persistent variable can't be an output argument; that's just a rule. So we add an extra variable called cnt down here to serve as the second output of the function. Let's try that. No surprise here. You need to remember to reset the persistent variable between calls with clear, like this; that sets it back to the empty array. Now let's call it again, but this time we'll call it with three. Let's clear again and call it with five. As you can see, our suspicion about making too many recursive calls seems to be true. I mean, there's no reason to make nine calls to fibo to figure out the fifth element. But things get even worse in a hurry. Let's jump to 25. Oh my goodness, the answer is about 75,000. That's a big number, but we called the function about 150,000 times. How about 35? Things are slowing down. I'm expecting my fan to start speeding up. Eighteen million, six hundred and... oops, I forgot to clear.
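The instrumented function isn't printed in the transcript either. Here is a sketch of what it plausibly looks like, under the assumption that it was named fibo_count (the transcript only says "fibo count"):

```matlab
function [f, cnt] = fibo_count(n)
% FIBO_COUNT  Fibonacci element plus the total number of calls made.
persistent count                  % local, but retains its value across calls
if isempty(count)                 % a persistent variable starts out empty
    count = 1;                    % first call: initialize the count to one
else
    count = count + 1;            % every subsequent call just increments it
end
if n == 1 || n == 2
    f = 1;
else
    f = fibo_count(n-2) + fibo_count(n-1);
end
cnt = count;                      % a persistent variable can't be an output
end                               % argument, so copy it into a regular one
```

Running clear fibo_count between experiments resets the persistent variable; after that, [f, cnt] = fibo_count(5) reports the nine calls mentioned above.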
We'd have to subtract the 150,049 to get the exact number, but that's peanuts. To compute the 35th element, our simple and elegant seven-line function made over 18 million function calls, instead of just doing 34 additions. Wow. That is truly inefficient and wasteful. What's wrong here? Is it the MATLAB program? That is, did we make a programming mistake? No. It is our algorithm; it's actually our approach to the problem. We didn't realize that doing it this particular way, that is, calling the function itself for n minus 2 and for n minus 1, ends up doing a lot of unnecessary work. So our algorithm is bad. Let's qualify that statement. The function seemed to work fine for small n's, right? Up to, say, n equals 25, it finished with only a little delay. But once n got bigger, say n equals 35, it slowed way down. The speed of the function depends on the input. If all we ever need this function to do is give us one of the first 25 elements of the Fibonacci series, then this algorithm works just fine. But if we need larger elements, or if this function will end up being called many times by other programs, then we should care about the speed and efficiency of the function. What's the correct solution to the problem? First of all, you can write a non-recursive function that uses a single loop. It's easy; I'll let you do that on your own. The recursive approach can be salvaged too. We can modify the requirements and return the entire series, instead of just the nth element, like this. We make only a single recursive call to figure out the Fibonacci series up to n minus 1, save it in a vector, then simply add the two last elements to get the nth element of the series. If you try this, you'll see that it's lightning fast. A little bit better. If returning the entire series is not acceptable, we can just hide this version as a sub-function.
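The series-returning version and its wrapper are also only shown on screen. A sketch, using the hypothetical names fibo_fast and fibo_series, might look like this:

```matlab
function f = fibo_fast(n)
% FIBO_FAST  nth Fibonacci element; the series version is hidden below.
v = fibo_series(n);
f = v(end);                       % only the last element leaves the function
end

function v = fibo_series(n)
% FIBO_SERIES  Entire series up to element n, with ONE recursive call.
if n <= 2
    v = ones(1, n);               % base cases: [1] or [1 1]
else
    v = fibo_series(n-1);         % single recursive call for the first n-1
    v(n) = v(n-2) + v(n-1);       % append the sum of the two last elements
end
end
```

Because each element is now computed exactly once, fibo_fast(50) needs only about 50 recursive calls instead of the billions the naive version would make.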
This is definitely not as elegant as our original solution, but it has the advantage that it actually works for large inputs as well. Let's run this one with 50. That's what I call fast. Let's time it. Elapsed time: seven ten-thousandths of a second. Yeah, that's fast. In fact, it's so fast that there's just no reason to keep working on it. This wraps up our first lecture on the topic of algorithms and efficiency. We've achieved our goal, which was to demonstrate that the algorithm chosen to solve a given problem can have a huge impact on the speed of the program we end up with. That's it for this lecture, but that's not it for algorithmic complexity. There's a lot more coming, because complexity, well, it's a complex topic. Get it? We'll take a deeper dive into our topic in the next lecture, with the intriguing title "Algorithmic Complexity: Part 2".