Hi, welcome to our lecture on sequences. This lecture is going to feel a little different than prior ones. You might think to yourself, "Well, how is this all related?" But through the beauty of math, you'll see we're able to study these things called sequences, and other objects built from them, using our tools of calculus. Let's start with the definition of a sequence. It's a little boring of a definition, to be honest, but good enough for our purposes: a sequence is a list of numbers in some defined order. What do I mean? You've probably seen this before: you list out some numbers, like a_1, a_2, a_3, and then you can stop. A finite list is a finite sequence; if you've studied some set theory, you can think of it that way too. Of course, what we're interested in here, and where calculus comes in, is what it means for things to be infinite, so we'll focus on the case where the list has an infinite number of terms. We abbreviate this as (a_i) for i from 1 to infinity, and you can imagine taking limits. These are the objects of study in this chapter. As an example, a sequence can be whatever you want. You can have a nice boring sequence; these will all be infinite, but say we have one, one, one, forever and ever. That's a perfectly good example of a sequence we're interested in. What if I had one, two, three, four, and off it goes forever? Instead of listing the terms out, maybe I'll give you a formula. How about the sequence defined by a_i = i / (i + 1)? I'm handing you a formula, but you need to realize, "Okay, so when i is 1, that's 1 over 2; when i is 2, that's two-thirds." I can hand it to you as a formula instead of writing it out in list form. This one is infinite too, and I'll tell you where to start; in this case i starts at 1.
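As a quick illustration of the formula view, here is a minimal sketch that lists the first few terms of the sequence a_i = i / (i + 1), starting at i = 1 as in the lecture (the function name `a` is just a convenient label, not anything standard):

```python
# Sketch: the sequence a_i = i / (i + 1), starting at i = 1.
def a(i):
    return i / (i + 1)

# Listing the first five terms: 1/2, 2/3, 3/4, 4/5, 5/6.
terms = [a(i) for i in range(1, 6)]
print(terms)
```

The list form and the formula form describe the same object; the formula just tells you how to produce any term you ask for.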
Some sequences have a pattern. Maybe you've seen the Fibonacci sequence before, like 1, 1, 2, 3, 5, and all the cool stuff that comes with that one; famous sequences like Fibonacci's have names. You can also write this one as a recursive formula: I can say that a_1 is 1 and a_2 is 1, and then tell you that a_{n+1}, the next term, is found by adding the two prior terms. That's called a recursive formula. As with anything infinite, we're going to ask ourselves, "Does this tend to go anywhere as we go off towards infinity?" or, better said, "Is there a limit?" We can ask what the limit of the sequence is as n goes to infinity. Now, this is important because it's the limit of a sequence. If you were to graph these things, you'd just get a scatter plot: dots where the values are, not a continuous function. It's similar to a function but different; these are discrete points, and we've lost all continuity. We're going to ask, "What is the limit of the sequence? Does it converge? Does it diverge?" Let's look at the examples I've put on the screen. The constant sequence 1, 1, 1, 1, 1: as I look out to the tail of the sequence, does it go anywhere? Well, it's constant, so of course it just goes to one. We'd say the sequence converges to the limit one. What about 1, 2, 3, 4, all the counting numbers? Does that tend to go anywhere? We'd say this diverges, or you can say it goes to infinity; this is an example of a diverging sequence. What about i over i plus 1? That's a half, two-thirds, three-fourths. You can use your knowledge of rational functions here to say, "Well, this is going to be the ratio of leading terms, one over one." The terms get to 99 over 100, then 999 over 1000, and in the limit this goes to one.
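The recursive formula above translates directly into code. This is a minimal sketch (the helper name `fibonacci` is my own choice) that builds the sequence from a_1 = 1, a_2 = 1, a_{n+1} = a_n + a_{n-1}:

```python
# Sketch of the recursive formula: a_1 = 1, a_2 = 1, a_{n+1} = a_n + a_{n-1}.
def fibonacci(n):
    """Return the first n terms of the Fibonacci sequence."""
    terms = [1, 1]
    while len(terms) < n:
        terms.append(terms[-1] + terms[-2])
    return terms[:n]

print(fibonacci(6))  # [1, 1, 2, 3, 5, 8]
```

Notice that a recursive formula needs starting values (here a_1 and a_2) before the rule for the next term can kick in.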
Notice there's no term that actually ever equals 1, but the limit is still 1. Last but not least, the Fibonacci sequence: if you just keep adding the prior two terms, the numbers get larger and larger and larger, so it's another diverging sequence. The tools from Calc 1 for finding limits at infinity are back on the table: things like the limit laws, the squeeze theorem, and L'Hôpital's rule. If you have a formula for the sequence, you can apply those rules. The only difference between the function variable x and the sequence index n is that n is an integer, and we'll see an example where that matters. Let's start off with that warning first. I'm going to put this with lots of stars: warning. Find the limit as n goes to infinity of this important sequence, sine of pi times n. Most students get this wrong when they see it for the first time, because they're so used to working with functions; they haven't switched gears yet to working with sequences. Let me draw a quick graph of what this thing is doing. n is an integer; this is extremely important. As a function of a real variable, the graph still has amplitude 1: it oscillates between 1 and negative 1, passing through 0. So a calc student who knows the graph of sine is going to look at this and say, "Wait a minute, I have this oscillating function bouncing back and forth between minus 1 and 1, so this limit must not exist," and they'll write DNE, doesn't exist, maybe with the word "oscillating" next to it. That is wrong, super wrong, and it's the mistake to avoid. The reason is that n is an integer: for a sequence, n is 1, or 2, or 3; these are discrete points. If you think about it, what are the terms? Sine of pi, sine of 2 pi, sine of 3 pi, sine of 4 pi; you only hit the integer multiples of pi.
What ends up happening is that, because the sequence is not continuous, it jumps along and lands exactly on the values where this function is 0, so really this sequence is the constant sequence 0. So it does go to 0 as n goes to infinity. That's the biggest difference between limits of sequences and limits of functions: as a function, where x can be any real number, this would diverge, but as a sequence it converges. Let's see a couple more examples. I'll put these on the screen; you can pause the video if you want and try to work them out. Just remember, in all of these examples it will be clear from context that n is an integer. Does that change the behavior of the limit? Sometimes it will, sometimes it won't. First one: the limit as n goes to infinity of n squared. Think of this as a sequence: one squared, two squared, three squared. These terms get larger and larger, so, like the function, it diverges; we say it diverges, or goes to infinity. Next, one half to the n. This is a half, a fourth, an eighth. This thing gets small. How small? It's like 1 over infinity small, so it goes to 0. Last but not least, we have minus 1 to the n divided by n. What's going on with this one? Here you can use the squeeze theorem to see that it actually goes to 0: it's bounded above by 1 over n and bounded below by minus 1 over n, and it gets squeezed between them. As n goes to infinity, 1 over n goes to 0, and minus 1 over n goes to 0 as well; the sequence's dots follow along with those functions, and if the function converges, the sequence converges with it. So I have a sequence trapped between something going to 0 from above and something going to 0 from below, and its limit has to be 0.
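To see numerically why the warning example works out, here is a small sketch evaluating these sequences at integer n only. At the integers, sin(pi * n) always lands on (numerically) zero, even though the function sin(pi * x) oscillates for real x:

```python
import math

# At integer n, sin(pi * n) lands on the zeros of the sine function,
# so each value printed is ~0 (up to floating-point error).
for n in [1, 2, 3, 100]:
    print(n, math.sin(math.pi * n))

# The tail of (-1)^n / n, squeezed between -1/n and 1/n, shrinks toward 0:
tail = [(-1) ** n / n for n in [10, 100, 1000]]
print(tail)
```

The dots of the sequence only sample the function at integers, which is exactly why the sequence can converge while the function diverges.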
So that's two sequences that converge and one that doesn't. Let me give you one more: the limit as n goes to infinity of 3n squared plus 1 over 2n squared plus 3. This is a rational function, and if you view it in terms of x, it converges: the degrees are the same, so we take the ratio of leading terms, 3 over 2. If the function converges, the sequence goes with it; the sequence has to follow. It's when the function diverges, like our limit of sine of pi times n, that the sequence can still converge. Be careful; there's a distinction there. In this case we get a sequence converging to 3 over 2. Now, there are a couple of theorems we can talk about, and these mirror the ones for functions. Here's the first one: if the limit as n goes to infinity of the absolute value of a_n is 0, then the limit of the original sequence as n goes to infinity is 0 as well. We saw this with the last example, the sequence minus 1 to the n over n. Ask yourself: does the sequence get easier in absolute value? When you have a minus 1 to the n, this thing that alternates plus, minus, plus, minus, taking the absolute value turns it into just 1 over n. That 1 over n sequence goes to 0, so the sequence in absolute value converges to 0, which means, by this absolute value theorem, the original sequence converges to 0 as well. That's a nice little tool; its proof uses the squeeze theorem, just like we did in the last example, but it packages it all together. Last but not least, we had this theorem for functions too, and it turns out to be true again: the continuity theorem. It says that if you have a sequence that converges to some limit, and some function f that is continuous (cts is my abbreviation for continuous) at the limit point, then you can move the limit inside.
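Here is a small numeric sketch of the absolute value theorem on a_n = (-1)^n / n: the absolute values |a_n| = 1/n shrink toward 0, which is what forces the original alternating sequence to 0 as well (the function name `a` is just a label for this example):

```python
# Absolute value theorem sketch: if |a_n| -> 0, then a_n -> 0.
def a(n):
    return (-1) ** n / n

# |a_n| = 1/n, which visibly shrinks: 0.1, 0.01, 0.0001.
for n in [10, 100, 10_000]:
    print(n, abs(a(n)))
```

Dropping the alternating sign via the absolute value is exactly what makes the limit easy to compute.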
In terms of notation, the limit as n goes to infinity of f of a_n is f of the limit of a_n. This should look familiar from when you studied limits of continuous functions. The idea, of course, is that if you have a continuous function on the outside, you can bring the limit in. You normally see this with functions, taking the limit as x goes to infinity; here we'll use n, since we're studying sequences. Take, say, sine of one over n. You might say this is a really tough limit, and the squeeze theorem doesn't obviously help, but sine is continuous, so the rule applies: you can move the limit inside and work with the much easier limit of one over n. Where does that go? One over infinity: it goes to zero. You're left with sine of zero, which is zero, and this is legitimate because the sine function is continuous. We use lots of nice continuous functions, so this works out nicely. One more example; you've probably seen something like it when you studied limits. Same idea. Say you have a natural log on the outside of 2n over n plus one. Well, that's a tough limit. What is the log of this thing doing? You can set the log aside at first: the natural logarithm is a continuous function, so you can move the limit inside and instead study the easier rational function 2n over n plus one. That goes to the ratio of leading terms, two over one, which is two, and you're left in the limit with the natural log of two. These rules work for functions, and they also work for sequences. Now, a couple of definitions that mimic their counterparts for functions. We say a sequence is increasing if each term is larger than the one prior, and that's for all n greater than or equal to one. We say a sequence is decreasing if the terms get smaller: each next term is less than the prior one.
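The two continuity theorem examples above can be checked numerically. This sketch watches sin(1/n) creep toward sin(0) = 0 and ln(2n/(n+1)) creep toward ln(2):

```python
import math

# Continuity theorem sketch: move the limit inside a continuous function.
# sin(1/n) -> sin(0) = 0, and ln(2n/(n+1)) -> ln(2).
for n in [10, 1000, 100_000]:
    print(n, math.sin(1 / n), math.log(2 * n / (n + 1)))

# The limiting value of the second sequence:
print(math.log(2))
```

The point is not the numerics themselves but that the hard limit reduces to an easy inner limit once you know the outer function is continuous.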
You've probably guessed the meaning of those two, but here's probably a new one for you: we say a sequence is monotonic if it is either increasing or decreasing, one of the two. It's the umbrella term for the pair, and again, this is for all n. Increasing means I'm always going up the chain; decreasing means every term is less than the prior one; monotonic means I'm one of those two. Usually when you get new definitions in math, besides memorizing them, you want to give yourself an example of something that is and something that is not. It's pretty easy to come up with increasing sequences, like one, two, three, four, or decreasing ones, maybe zero, negative one, negative two, working your way down. But what's an example of something that's not monotonic? Well, how about something that neither increases nor decreases; it has to go up and down. How about one, minus one, one, minus one? You see this one a lot. It alternates plus, minus, plus, minus, and you abbreviate it as minus one to the n. You'll see this sequence a bunch. Another one could be sine: maybe sine of n. That will oscillate as well; it follows the sine curve, but only at specific points. Some other definitions; again, you can probably guess what they mean from the words. We say a sequence is bounded above if there is some M (s.t. is my abbreviation for "such that") such that each term in the sequence is less than or equal to M. What's a nice sequence that's bounded above? How about minus one to the n? That's just one, negative one, one, negative one, and there are lots of numbers that bound it from above. For all the same reasons, we say a sequence is bounded below if there is some m (usually I use a little m here) such that little m is less than or equal to every term. Is there a lower bound? Is there a floor?
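The monotonicity definitions can be sketched as small checks over a finite window of terms (a genuine proof needs all n, of course; the helper names here are my own):

```python
# Sketch: checking the monotonicity definitions on finitely many terms.
def is_increasing(terms):
    return all(x < y for x, y in zip(terms, terms[1:]))

def is_decreasing(terms):
    return all(x > y for x, y in zip(terms, terms[1:]))

counting = [1, 2, 3, 4, 5]
alternating = [(-1) ** n for n in range(1, 6)]  # -1, 1, -1, 1, -1

print(is_increasing(counting))  # True: monotonic
# Neither increasing nor decreasing, so not monotonic:
print(is_increasing(alternating) or is_decreasing(alternating))  # False
```

As the lecture says, having one example that satisfies a definition and one that fails it is the quickest way to internalize it.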
We don't say a min or a max, because we have an infinite set; max and min are usually reserved for finite sets. The question is whether this thing goes off to infinity. These definitions describe behavior: can it get infinitely big or infinitely small, or does it stay within some range? Now here's the definition you also want: what does it mean if you're both bounded above and bounded below? We say the sequence is bounded. If I say I have a bounded sequence, then a_n is bounded above and below; you get both. That's important. If I want just one condition, I say bounded above or bounded below; if I want both, I say bounded. As an example, let's bring back our friend a_n equals minus one to the n over n. If you picture this, what is it doing? It's like one over n, but it alternates: negative, positive, negative, positive. You have negative one, then positive a half, then a little smaller again, and it keeps alternating. It gets infinitely close to zero, since we saw it goes to zero, but it bounces back and forth on the way. Is this thing bounded above? Yes, you can check that: the largest it ever gets is one half. An upper bound could be one, or two, or three, or four; it doesn't matter which upper bound. The point is, is there some number that's larger than every term? So this is bounded above. It's also bounded below: the smallest it ever gets is negative one. Put them together, and it's fair to say this sequence is bounded; it's a nice bounded sequence. The reason we care about bounded sequences is that I have one more theorem for you. This is a nice one with good words in it: every bounded monotonic sequence converges. Lots of our new vocabulary is going on here. Bounded, remember, means bounded above and below; monotonic means increasing or decreasing. My minus one to the n over n is bounded, but it is not monotonic.
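Here is a quick numeric sketch of the boundedness claim for a_n = (-1)^n / n over many terms; every term stays in [-1, 1], consistent with the sequence being bounded:

```python
# Boundedness sketch for a_n = (-1)^n / n.
terms = [(-1) ** n / n for n in range(1, 1001)]

# Largest term is 1/2 (at n = 2), smallest is -1 (at n = 1).
print(max(terms), min(terms))
print(all(-1 <= t <= 1 for t in terms))  # True: bounded above and below
```

Of course, checking finitely many terms only illustrates the claim; the actual bound |a_n| = 1/n <= 1 holds for every n.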
Now you might say, well, it still converges. That's okay: it does still converge; we just can't use this theorem to conclude that, because the theorem needs the sequence to be either increasing or decreasing. Now picture this for a second; it should make sense from a picture standpoint. Suppose I had a sequence that was monotonically increasing, up, up, up, but it wasn't allowed to go to infinity: there was some roof on it. You always have to draw the next dot a little bit higher, but you're not allowed to pass the roof; you can think of it as a ceiling or something. That's going to force the sequence to converge. Try it: you can't draw a sequence that grows forever and stays under a roof. If you try, you'll find the thing has to converge. Every bounded monotonic sequence converges; it's a nice idea. We don't usually use this theorem to show things converge, but you can. If you're really stuck on a complicated sequence, you can show that it's bounded and monotonic, and then, by this theorem, it converges. You don't know to what; you have to do additional work to find the limit, but you get convergence straight from the statement. It's a nice theorem to know, and the picture should make sense too. Next, a couple of important limits to know that are going to come up. Let's write some of these down, and think of them as limits of sequences. They come from their counterparts with functions. I'll give them to you, and we'll talk through a couple of them; hopefully some look familiar, or you have an idea how to solve them from single-variable calculus. One is the limit as n goes to infinity of one plus x over n, all raised to the n. Here's another; why did I bring this one up? The natural log of n over n. What's going on with this one? Try to plug in, or look at each piece of the limit.
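The "roof" picture can be sketched numerically with a_n = n / (n + 1): it is increasing and bounded above by 1, so the monotone convergence theorem guarantees it converges (and here we can also see the limit is 1):

```python
# Monotone convergence sketch: a_n = n / (n + 1) is increasing,
# bounded above by 1, and creeps up toward the limit 1.
terms = [n / (n + 1) for n in range(1, 10_001)]

print(all(x < y for x, y in zip(terms, terms[1:])))  # True: increasing
print(all(t < 1 for t in terms))                     # True: roof at 1
print(terms[-1])  # ~0.9999, closing in on 1
```

This matches the picture in the lecture: each dot is a little higher than the last, but none ever passes the roof, so the dots have to pile up against some limiting value.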
This is of the form infinity over infinity, so L'Hôpital's rule should kick in in your head. Remember, if the function converges, the sequence has to follow in line, so can we find what the function is doing? Let's work it out by L'Hôpital's rule, a good little review: I take the limit as n goes to infinity of the derivative of the numerator divided by the derivative of the denominator. Clean that up and you get the limit as n goes to infinity of one over n. That's like one over infinity; the bottom is getting large, so the whole thing goes to 0. A little review of L'Hôpital's rule there. The common mistake students make is to use the quotient rule, or something bad like that. Don't do that; use L'Hôpital's rule. There are lots of L'Hôpital examples you could do; this is just one idea, but you'll see this particular limit come up a bunch. What about this one: n raised to the one over n? What's going on with that one? It's not in L'Hôpital form, at least not yet; you have to remember the trick for this one. You set y equal to n to the one over n and then take the natural log of both sides. Hopefully this is ringing a bell: the reason you do that is so the exponent falls down in front. Then you can write ln of y as ln of n over n, which should look familiar; we just did that one. Now take the limit of both sides as n goes to infinity. There's no n on the left side, so you just get ln of y equals the limit as n goes to infinity of ln of n over n. Well, hey, we just did that: it's 0. So the natural log of y is 0, but we don't want the natural log of y, we want y. Exponentiate both sides: y is e to the 0, which is one. I use this example because it builds on the last one, but the point is you can have a variable raised to a variable in a sequence, and to find limits like that, you may have to introduce logarithms to bring the exponent down.
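Both limits worked out above can be watched numerically. This sketch prints ln(n)/n, which L'Hôpital's rule says goes to 0, and n^(1/n), which the logarithm trick says goes to 1:

```python
import math

# Watching the two limits: ln(n)/n -> 0 and n^(1/n) -> 1.
for n in [10, 1000, 1_000_000]:
    print(n, math.log(n) / n, n ** (1 / n))
```

The convergence of n^(1/n) is slow, which is part of why the algebraic argument with logarithms is worth knowing rather than just plugging in values.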
The last one, one plus x over n raised to the n, is just a definition: it's the limit definition of e to the x. You can also work this out using logarithms, but usually e is defined to be this limit (with x equal to one). e is so important that you should recognize it when you're staring at it; sometimes students will see this limit and just say, "I don't know what this is," but you should know what e is in limit form. It comes up a lot. We're quickly going to leave the world of sequences and go off to series, but this is the idea: remind yourself of some Calculus 1 theorems and rules, and then I'll see you in the next video. Alright, see you next time.
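As a final numeric sketch, here is the limit definition of e^x in action at x = 1, so the limit is e itself:

```python
import math

# Limit definition of e^x: (1 + x/n)^n -> e^x. At x = 1, the limit is e.
x = 1.0
for n in [10, 1000, 1_000_000]:
    print(n, (1 + x / n) ** n)

print(math.e)  # ~2.71828, the limiting value
```

Recognizing this expression on sight, as the lecture urges, saves you from rederiving it with logarithms every time it appears.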