0:08

In this lesson, we'll begin a study of asymptotics.

What happens when a function gets very, very small, or very, very large? How fast does a function go to zero? How quickly to infinity? Quantifying these rates of change will give us a new language: that of big O, or asymptotic notation.

>> Let's begin this lesson with a few examples of functions that grow to infinity, but at different rates.

Consider, for example, a monomial x to the n, for n a positive integer, and the exponential function e to the x. Both become infinite as x approaches infinity, but at what rate?

Well, if we use L'Hospital's Rule on the ratio of x to the n over e to the x, then taking the derivative of the numerator gives n times x to the n minus 1. The denominator, of course, remains fixed at e to the x. Now, if we continue applying L'Hospital's Rule, then eventually, since n is a positive integer, we will get something times x to the 0. That something is, as you can check, n factorial. Whatever it is, it's a constant, yet the denominator remains e to the x.

What is this limit? It's 0. What does that mean? It means that the exponential function grows faster than any monomial, and indeed, than any polynomial function.
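To make this concrete, here is a quick numerical sanity check (my own illustration, not part of the lecture) that the ratio of x to the n over e to the x collapses to zero:

```python
import math

# Sanity check: x^n / e^x -> 0 as x -> infinity, so e^x
# eventually dominates any fixed monomial x^n (here n = 10).
def ratio(n, x):
    return x**n / math.exp(x)

for x in (10, 50, 100):
    print(x, ratio(10, x))
```

The printed ratios shrink rapidly toward zero, even though the monomial briefly wins for small x.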

Likewise, consider, say, the natural logarithm of x raised to the nth power, and compare it with linear growth, that is, with the monomial x. Then what happens? Well, if we apply L'Hospital's Rule, we get in the numerator n times the log of x to the n minus first power, times the derivative of log of x, which is 1 over x. In the denominator, the derivative of x is simply 1. So we are left, again, with an indeterminate limit. However, iterating, what are we eventually going to get? We'll get a constant in the numerator, that constant being, once again, n factorial, while the denominator remains x.

Â 2:56

This, too, goes to zero, which means that linear growth is faster than logarithmic growth, even logarithmic growth raised to a power. Now, put these together: exponential growth beats polynomial growth, which beats logarithmic, or even polylogarithmic, growth.

What else is there? Well, there's an entire world, a hierarchy, of functional growth as x goes to infinity: beginning with the constant functions, which don't grow at all, moving up to logarithmic growth, to linear growth, quadratic growth, cubic growth, all the way to exponential growth. One question remains: is there anything beyond that? What can grow faster than exponential growth? There are indeed worlds beyond that, one simple example being factorial growth.
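As a sketch of that hierarchy (my own illustration, not from the lecture, with x = 50 chosen arbitrarily), we can evaluate one representative from each rung at a single large input:

```python
import math

# One representative from each rung of the growth hierarchy,
# evaluated at x = 50: log(x)^3 << x^2 << e^x << x!
x = 50
rungs = [
    ("log(x)^3", math.log(x) ** 3),
    ("x^2", float(x**2)),
    ("e^x", math.exp(x)),
    ("x!", float(math.factorial(x))),
]
for name, value in rungs:
    print(f"{name:>8}: {value:.3e}")
```

Each rung dwarfs the one below it, and the gaps only widen as x grows.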

For now, what we want to do is consider the monomials, x to the n, for n bigger than 0, and look at their growth, both as x goes to infinity and as x goes to 0. Well, it's not exactly growing there; it's shrinking. But this is extremely interesting, because if you look at the hierarchy of functions as x goes to zero, we see that linear is bigger than quadratic, is bigger than cubic, is bigger than quartic, et cetera. That is, x to the n plus 1 is less than x to the n as x goes to 0. That is in contradistinction to what happens as x goes to infinity, where we see that linear growth is much slower than quadratic growth, which is much slower than cubic, et cetera. That is, x to the n plus 1 is bigger than x to the n as x gets larger and larger. All right.
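A tiny numerical check of this flip (my own illustration):

```python
# The ordering of the monomials reverses between the two limits:
# near 0, higher powers are smaller; for large x, higher powers are bigger.
small, large = 0.1, 10.0

assert small**3 < small**2 < small   # x -> 0:        x^(n+1) < x^n
assert large**3 > large**2 > large   # x -> infinity: x^(n+1) > x^n
print("ordering flips between the two limits")
```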

What are we going to do with that? Well, let's take a look at an example: the limit of 1 minus 2x plus 3x squared, over 4 minus 5x plus 6x squared. Let's first consider the limit as x goes to zero. What does that become? Well, this one's really simple. As x goes to 0, all of the higher-order terms in x are small, negligible, which means that we simply evaluate this limit and get one fourth: the ratio of the lowest-order terms in x.

On the other hand, if we look at this same limit as x goes to infinity, we can't do the same thing. What we have to do is switch things around a little bit: factor out the highest-order power of x, in this case x squared, and cancel it. What we see left over is 3 minus something small, over 6 minus something small, giving a value of one half in the limit.
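We can probe both limits numerically (my own check, not part of the lecture):

```python
# The example ratio: lowest-order terms dominate as x -> 0 (limit 1/4),
# highest-order terms dominate as x -> infinity (limit 1/2).
def f(x):
    return (1 - 2*x + 3*x**2) / (4 - 5*x + 6*x**2)

print(f(1e-8))  # close to 0.25
print(f(1e8))   # close to 0.5
```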

Now, if we compare these two, we see that as x gets small, it's the lowest-order terms that dominate, whereas if x goes to infinity, it's the highest-order terms that dominate. This duality is extremely important and can cause confusion, so always pay attention to which way the limit is going.

What we really need is a good language for controlling and describing this sort of growth. This is going to be called big O, and it has two forms: one as x goes to infinity, the other as x goes to 0.

This is an extremely important and subtle definition; you're going to want to pay attention to it. We say that a function f is in big O of x to the n if, in absolute value, f of x is less than some constant times x to the n. Now, we're not saying exactly what that constant is; it's just some constant. We're only getting an upper bound. We don't actually care about the value of C; we just want to control the growth.

Now, there are two versions of this. In one, we can say that this inequality has to hold as x goes to zero, in which case big O of x to the n consists of all functions that approach zero at least as fast as x to the n. On the other hand, if we look at growth as x goes to infinity, then big O of x to the n consists of those functions that approach infinity no faster than x to the n. In both cases, this is an upper bound.
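In symbols, with the quantifiers made explicit (standard notation; the lecture leaves them implicit):

```latex
% as x -> infinity: the bound need only hold for all sufficiently large x
f \in O(x^n) \ (x \to \infty)
\iff
\exists\, C > 0,\ \exists\, x_0 : \quad |f(x)| \le C\,|x|^n \quad \text{for all } x > x_0.
% as x -> 0: the same inequality, for all x sufficiently close to 0
```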

This language replaces the "higher-order terms" that we have used in the past. For example, look at the expansion of arctan of x. What is that? The first term in the Taylor series is x. All of the other terms are of higher order in x; specifically, we could say, big O of x squared. Or, more specifically, we could say big O of x cubed. That is a stronger statement; it allows us to be more precise. Or, if we want to expand further, we could say that arctan of x is x, minus x cubed over 3, plus big O of x to the 5th.
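We can test the strongest of these claims numerically (my own check): the remainder, arctan of x minus (x minus x cubed over 3), should shrink like x to the 5th.

```python
import math

# The remainder of the two-term arctan expansion, divided by x^5,
# stays bounded as x -> 0 (the Taylor series says it tends to 1/5).
for x in (0.1, 0.01, 0.001):
    remainder = math.atan(x) - (x - x**3 / 3)
    print(x, remainder / x**5)
```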

Likewise, as x goes to infinity, we could look at a function like x times the square root of 5 plus 3x plus x squared. This is a complicated-looking function, but really it's just x squared plus some other stuff in the limit as x goes to infinity. That other stuff is itself going to infinity, but only at a linear rate: it is in big O of x. If we wanted to be more specific, we could use the binomial series to say that this function is x squared, plus 3 halves x, plus something in big O of 1. That is, something bounded: something in big O of x to the zero.
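A numerical check of this claim (my own illustration): subtracting x squared plus 3 halves x leaves something bounded.

```python
import math

# f(x) = x*sqrt(5 + 3x + x^2) minus its quadratic-plus-linear part:
# the leftover is in O(1); by the binomial series it tends to 11/8.
def leftover(x):
    return x * math.sqrt(5 + 3*x + x**2) - (x**2 + 1.5*x)

for x in (1e2, 1e4, 1e6):
    print(x, leftover(x))
```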

More generally, instead of talking about big O of x to the n, we could talk about big O of some other function, g of x, if this same inequality holds for some constant C.

Let's look at a few examples. In the limit as x goes to zero, the function 5x plus 3x squared is in big O of x, but it's not in big O of x squared. Sine of x is in big O of x, but not in big O of x squared. Log of 1 plus x, minus x, as you can see from Taylor expanding, is in big O of x squared, but not in big O of x cubed.

The square of 1 minus cosine of x is in big O of x to the fourth, but not in big O of x to the fifth. The square root of x is not in big O of x to the n for any n bigger than 1; it does not go to zero very quickly at all. On the other hand, e to the minus 1 over x squared goes to 0 very quickly: it's in big O of x to the n for all positive n.

As x goes to infinity, we have a number of interesting examples as well. Arctan of x is in big O of 1: it's bounded. That means it's also in big O of x to the n for any positive n. x times the square root of 1 plus x squared grows at most quadratically, so it is in big O of x squared, but it is not in big O of x to the three halves. The log of the hyperbolic sine of x is really in big O of x: it's got linear growth at most. It's not in big O of log of x; it grows too fast for that. The hyperbolic cosine of x is in big O of e to the x: it has exponential growth, but not polynomial growth. Log of x to the 5th can be simplified, and we see that it's really in big O of log x, as well as big O of x to the n, for any positive n. Lastly, x to the x is not in big O of a constant to the x for any constant, and it's not even in big O of x factorial. This is super-factorial growth.

Now, there is a huge subject of asymptotic analysis associated with big O. It has certain rules. The ones that you need to know for this course are the following.

If you take something in big O of x to the m and multiply it by something in big O of x to the n, you're going to get something in big O of x to the m plus n, just like multiplying monomials together. This is true whether you're going to zero or to infinity. On the other hand, if you add two functions together, something that has x to the m growth and something that has x to the n growth, then there's a difference. As x goes to 0, your resulting function is in big O of x to the minimum of m and n.

Â 14:11

As x goes to infinity, you're looking at big O of x to the maximum of m and n.
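In symbols, the product and sum rules read:

```latex
O(x^m)\cdot O(x^n) = O(x^{m+n}) \quad \text{(as } x \to 0 \text{ or } x \to \infty\text{)}
% for sums, the surviving exponent depends on the direction of the limit:
O(x^m) + O(x^n) = O\bigl(x^{\min(m,n)}\bigr) \quad (x \to 0)
O(x^m) + O(x^n) = O\bigl(x^{\max(m,n)}\bigr) \quad (x \to \infty)
```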

Let's look at this in the context of an example, as x goes to zero. Consider e to the x over 1 minus x. Well, I'm going to write that as the product of two functions. e to the x is 1 plus x plus big O of x squared. Using the geometric series, we see that 1 over 1 minus x is also 1 plus x plus big O of x squared.

Now, how do we simplify this? We pretend that everything in sight is just polynomials and multiply them together, even the big O terms. The first term is the quantity 1 plus x, squared. The next term is 1 plus x times big O of x squared, and of course, we have two copies of that, so put a 2 out in front. And our last term is big O of x squared, squared.

All right, now, how do we deal with multiplying by big O? Well, expand the first term, and notice that its x squared is in big O of x squared, so we can forget about it. Next, we have 2 times 1 plus x times big O of x squared. Multiplying that out to 2 times big O of x squared, plus 2x times big O of x squared, gives us what? Well, we don't care about constants with big O, so forget about those, and x times big O of x squared is big O of x cubed. Lastly, big O of x squared, squared, is big O of x to the 4th. Now, as x goes to 0, those higher-order terms all fall under big O of x squared, and we get a simplification: 1 plus 2x plus big O of x squared.
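Written out in full, the computation just performed is:

```latex
\frac{e^x}{1-x}
= \bigl(1 + x + O(x^2)\bigr)\bigl(1 + x + O(x^2)\bigr)
= (1+x)^2 + 2(1+x)\,O(x^2) + O(x^2)^2
= 1 + 2x + O(x^2) \qquad (x \to 0).
```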

If we want to make it more interesting, say, by raising everything to the power alpha, then we could use the binomial series to say that this is 1, plus 2 alpha x, plus, well, some other terms, but they're all in big O of x squared.

This lesson on asymptotics dealt with the growth rates of a function. And though we're at the end of Chapter 1, this provides a perfect segue into Chapter 2, where we'll codify what we know about rates of change into the notion of a derivative.
