And it's actually very straightforward.

So what we're going to do is just pick a value k0 and

split the sum into two parts: values of k less than k0 and values bigger than k0.

Now, remember, small k is where the terms are significant; for

large k, they're going to be negligible.

And going from the approximations that we developed before,

for example, if you take k0 to be N to the two-thirds,

then the tail is going to be exponentially small.

I won't go through the details of that.

So the tail is exponentially small, and not only that,

we have a good approximation of the summand for k less than k0.

For all of those values, it's not far from e to the -k squared over 2N.

That's the approximation that we just developed.

It's very close to that, actually.

But that one, too, is going to be exponentially small for large k.

So we might as well just extend the range back to be an infinite sum,

because the tail's also exponentially small for that one.

And now that sum, we can just approximate with an integral.
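The chain of steps just described can be sketched in symbols (with k0 = N to the two-thirds, and with the exponentially small tails and the relative-error terms from the earlier approximation suppressed):

```latex
Q(N) \;=\; \sum_{1 \le k < k_0} e^{-k^2/(2N)} \;+\; \text{(exp.\ small)}
     \;=\; \sum_{k \ge 1} e^{-k^2/(2N)} \;+\; \text{(exp.\ small)}
     \;\approx\; \int_0^{\infty} e^{-x^2/(2N)}\,dx \;=\; \sqrt{\pi N/2}.
```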

And if you do that, you get square root of Pi N over 2.

And again, that's the true value of doing asymptotic calculations.

This function Q(N), while it's a precise description of

some mathematical quantity, is going to be really difficult to compute directly.

But if you know it's close to square root of pi N over 2,

then that's something that you can work with.
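To see how well the asymptotic estimate does, here's a small sketch that computes Q(N) exactly, term by term, and compares it against square root of pi N over 2. It assumes the standard definition of the Ramanujan Q-function from earlier in the lecture, Q(N) = sum over k >= 1 of N!/((N-k)! N^k):

```python
from math import pi, sqrt

def q_exact(n):
    """Ramanujan Q-function: Q(n) = sum_{k>=1} n!/((n-k)! n^k).

    Each term is the previous one times (n-k)/n, so the whole sum
    costs O(n) multiplications with no big factorials.
    """
    total, term = 0.0, 1.0      # term for k = 1 is n/n = 1
    for k in range(1, n + 1):
        total += term
        term *= (n - k) / n     # advance to the term for k + 1
    return total

def q_approx(n):
    """Leading-order asymptotic estimate from the lecture."""
    return sqrt(pi * n / 2)

for n in (10, 100, 1000):
    print(n, q_exact(n), q_approx(n))
```

Even at N = 1000 the two values agree to within about one percent, which is the point: a sum that looks hard to reason about is replaced by a formula you can work with.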

And we'll be coming back to applications that use this function later on.

That's an introduction to bivariate asymptotics.

Now I want to finish up by giving a few exercises that you might do before

the next lecture to test your understanding of asymptotics.

So the first one has to do with where I started:

how small is exponentially small?

And this is just trying, for a couple of different values of alpha and beta,

comparing the values of alpha to the N and beta to the N,

and really showing that alpha to the N,

the larger one, is going to be a really good approximation.
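A quick numerical sketch of the idea behind that exercise (the values alpha = 1.5 and beta = 1.25 are hypothetical picks, not the ones from the exercise): when alpha is bigger than beta, the relative error of using alpha to the N alone in place of alpha to the N plus beta to the N shrinks exponentially fast.

```python
# Hypothetical sample values; any alpha > beta > 1 shows the same effect.
alpha, beta = 1.5, 1.25

for n in (10, 100, 1000):
    exact = alpha**n + beta**n
    approx = alpha**n                    # keep only the larger term
    rel_err = (exact - approx) / exact   # equals beta^n / (alpha^n + beta^n)
    print(n, rel_err)
```

The relative error is (beta/alpha) to the N, up to a factor near 1, so by N = 100 the dropped term is already invisible at double precision.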

So here's another one.

An opportunity to go through those calculations that I

did in the last section.

There's another Ramanujan function; actually, there are three famous ones.

But there's another one called the P function.