
>> In this chapter, and indeed in this course, we've seen several instances of

turning functions into sequences through some sampling process.

We now turn to the problem of reversing that machine,

of converting sequences into functions.

One mechanism for doing so is the power series.

A power series in x is simply a series that has

the variable x inside of it as a monomial term.

Something of the form: the sum, as n goes from 0 to infinity,

of a sub n times x to the n, where now a sub n is considered a coefficient.

We're going to think of the notion of a power series as a machine or

an operator that converts a sequence,

a sub n, into a function, f of x.

Let's begin with some simple examples.

If we have a sequence, all of whose terms are 0,

except for a 1 in the nth slot,

then the power series is simply the monomial x to the n.
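As a numerical sketch of this sequence-to-function machine (my own illustration, not part of the lecture; the helper name power_series is hypothetical), one can truncate the infinite sum:

```python
def power_series(a, terms=50):
    """Given a coefficient sequence a(n), return a function that
    approximates f(x) = sum of a(n) * x**n by a truncated sum."""
    def f(x):
        return sum(a(n) * x**n for n in range(terms))
    return f

# The sequence that is 0 everywhere except a 1 in slot 3
# should give back the monomial x**3.
cubed = power_series(lambda n: 1 if n == 3 else 0)
print(cubed(2.0))  # 8.0
```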

Now if we have a finite sequence,

in the sense that only finitely many of the terms are non 0,

then this will have a power series that is a sum of monomials with

coefficients in front of them that eventually terminates.

We call that a polynomial.

Now, if we have something that does not eventually end in 0s,

we might have to be a bit more careful, but this isn't so mysterious.

If we have the sequence of all 1s, we recognize that the power series for

that sequence is simply the geometric series.

And this gives us the function 1 over 1-x,

at least when x is less than 1 in absolute value.
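A quick numerical sanity check of this (a sketch of my own, not from the lecture):

```python
# Partial sums of the geometric series, sum of x**n,
# approach 1 / (1 - x) when |x| < 1.
x = 0.3
partial = sum(x**k for k in range(100))
print(abs(partial - 1 / (1 - x)) < 1e-12)  # True
```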

Other sequences can be easily seen to correspond to interesting functions,

though one may have to do a bit of algebra to see it.

If we look at certain sequences, we'll recognize some old friends.

The log of 1 + x, or e to the x.

These correspond to certain power series.

Now, what if we have a sequence that does not go to 0?

Say, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, etc.

Will that converge to anything?

Well, of course, this isn't so bad.

This one corresponds to the function 1 over quantity 1-x, squared,

as you can see from squaring the geometric series.
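To make that concrete, here is a small numerical check (my own sketch, not from the lecture) that the coefficients 1, 2, 3, 4, ... really do sum to 1 over quantity 1-x, squared:

```python
# The sequence 1, 2, 3, 4, ... as coefficients: sum of (n + 1) * x**n
# should match 1 / (1 - x)**2 inside the radius of convergence.
x = 0.4
partial = sum((n + 1) * x**n for n in range(200))
print(abs(partial - 1 / (1 - x)**2) < 1e-9)  # True
```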

In fact, we can have sequences that grow rather large but

which have power series that converge to something very simple.

Now, what if we just take any old interesting sequence of coefficients?

What will that converge to?

Sometimes it's a bit of a mystery.

Let's look at the Fibonacci sequence, and

let us denote by script F the power series whose coefficients are F sub n,

the terms in the Fibonacci sequence.

Now, if we were to multiply this power series by x, what would happen?

It would give us a new power series associated to a shift,

a right shift, of the Fibonacci Sequence.

If we were to do it again, then we would shift over to the right yet once more.

And the reason for doing this is to line up the coefficients to

match the recursion relation that the Fibonacci Sequence satisfies.

F sub n+2 = F sub n+1 + F sub n.


Now if we take a look at the coefficients in front of the monomial term,

x to the n, for these three power series that we have written down,

then we see from the recursion relation the relationship between them.

The first is the sum of the latter two.

And so if we were to subtract x times script F and

x squared times script F from script F, what would we obtain?

Most of the coefficients of the monomial terms are zero,

because of the recursion relation.

When we add up these power series, the only thing that

is non-zero is the constant term, where we get F naught, and

the first order term, where we get F sub 1 x minus F naught x.

Now, because we know the exact values of F naught and

F sub 1, namely 0 and 1, we see that on the left we have F minus x F minus x squared F.

On the right, we have simply x.

Solving for script F gives us the power series

formula: script F, the power series for

the Fibonacci sequence, is equal to x over 1 - x - x squared.
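As a numerical check of this formula (a sketch of my own, not from the lecture), the partial sums of the Fibonacci power series should agree with the closed form inside the radius of convergence:

```python
# Check that x / (1 - x - x**2) reproduces the power series with
# Fibonacci coefficients F_0 = 0, F_1 = 1, F_{n+2} = F_{n+1} + F_n.
def fib(k):
    a, b = 0, 1
    for _ in range(k):
        a, b = b, a + b
    return a

x = 0.2  # well inside the radius of convergence
partial = sum(fib(n) * x**n for n in range(60))
closed = x / (1 - x - x**2)
print(abs(partial - closed) < 1e-9)  # True
```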

Now, that's what it is, but what does it mean?

What do we do with this?

These functions that generate the power series have within them

all sorts of interesting questions.

They tie together recursive properties, enumerative properties,

approximation properties and convergence.

We're going to focus on issues of convergence.

Consider a power series F of x, sum of a sub N, x to the n.

The following theorem holds.

For some capital R between 0 and infinity,

perhaps achieving either of those values, f of x has

the following convergence behavior.

It converges absolutely if x is less than R in absolute value;

it diverges if x is bigger than R in absolute value.

Now this R is a very special number associated to the power series.

It's called the radius of convergence.

Let's see, how would we prove this theorem?

What tool would we turn to, to determine convergence?

Well, let's start with our friend, the ratio test.

Here, what we need to do is compute rho.

That is, the limit as n goes to infinity of the ratio of

the n plus first term to the nth term.

Now, what are the terms?

Well, here they are a sub n times x to the n.

So taking the ratio of successive terms will tell us what we need.

Now, we have to be careful to take the absolute values

since the ratio test requires a positive sequence.

Okay, so doing that what do we see?

We get some cancellation with the xs and we can rewrite this as

the limit as n goes to infinity of the absolute value of a sub n + 1 over

the absolute value of a sub n times the absolute value of x.

Now, if that is less than 1, we have absolute convergence.

If that is bigger than 1, we have divergence.

So where is this capital R coming from?

Well, it is precisely the reciprocal of this limiting coefficient ratio.

Since if we divide by this on both sides,

we get the appropriate answer.

When rho is less than 1, that's the same thing as

saying that x is strictly less than R in absolute value.

That means we have absolute convergence.

When rho is bigger than 1 that is the same thing as saying x is

greater than R in absolute value, and that means divergence.

So what matters is this radius of convergence capital R.

The limit as n goes to infinity of the absolute value of a sub n over a sub n+1.

Be sure to remember that, and be sure to remember that it's the nth

term over the n plus first term, not the other way.
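Here is a small numerical illustration (my own sketch; the sequence a sub n = n over 3 to the n is a hypothetical example, not from the lecture) of estimating R from that ratio:

```python
# Estimate R = lim |a_n / a_{n+1}| for the sample sequence
# a_n = n / 3**n; the exact ratio is 3n / (n + 1), which tends to 3.
def a(n):
    return n / 3**n

n = 200  # moderate n, to stay within floating-point range
estimate = a(n) / a(n + 1)
print(abs(estimate - 3) < 0.02)  # True
```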


Now, this radius of convergence is called a radius of convergence because

it's telling you the distance away from 0 past

which your power series is going to diverge.

But within that, you're fine.

Now, you do have to be careful, we've said nothing about the end points of

the domain, because of course, the ratio test tells us nothing in that case.

So you're going to have to be careful and check explicitly at those end points.

Let's see a simple example: consider f of x,

the sum over n of -1 to the n, times square root of n, times quantity 2x, to the n.

Then our coefficient a sub n is -1 to the n,

square root of n, 2 to the n.

To compute the radius of convergence, we take the limit of

the a sub n divided by a sub n + 1 all in absolute value.

That gives us the limit as n goes to infinity of square root of n,

2 to the n, over square root of n + 1, 2 to the n + 1.

That is, as you can easily see, one-half.

Therefore, within one-half of 0, we have absolute convergence.

Outside of that, we have divergence.

What happens at the end points?

Well, we need to check what happens at x = -one-half. If we plug

in a value of -one-half in for x, what do we get?

We get the sum over n of square root of n,

since the -1 to the n's cancel.

That is a divergent series.

So at the left hand end point we have divergence.

At the right hand end point, at x = one-half,

you can easily see that we get the same sort of series that,

too, is going to diverge by the nth term test.
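The ratio computation from this example can be checked numerically (a sketch of my own, not from the lecture):

```python
import math

# The ratio |a_n / a_{n+1}| for a_n = (-1)**n * sqrt(n) * 2**n,
# which should approach the radius of convergence, one-half.
def ratio(n):
    return (math.sqrt(n) * 2**n) / (math.sqrt(n + 1) * 2**(n + 1))

print(abs(ratio(500) - 0.5) < 0.001)  # True
```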


Now other series that we've seen throughout this course have similar

behavior.

If we look at the geometric series,

this is the power series where all the coefficients are equal to one.

The radius of convergence in this case equals 1, and

this diverges at both endpoints at plus and minus 1.

If on the other hand we consider the power series with coefficient 1 over n,

where n goes from 1 to infinity.

Then what do we see?

This, too, has a radius of convergence equal to 1, and

at the right hand end point we obtain the harmonic series, which certainly diverges.

But at the left-hand end point, at x = -1,

we get the alternating harmonic series.

That converges, but it does so conditionally,

not absolutely.

If we consider the power series x to the n over n squared,

where the coefficient is 1 over n squared,

this also has radius of convergence equal to 1.

But it converges at both end points and

does so, therefore, absolutely.
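The conditional convergence at x = -1 in the middle example can be seen numerically (my own sketch, not from the lecture), since the series with coefficients 1 over n sums to minus the natural log of 1 minus x:

```python
import math

# At x = -1 the series sum of x**n / n becomes the alternating
# harmonic series; its value is -ln(2), approached only slowly.
partial = sum((-1)**n / n for n in range(1, 100001))
print(abs(partial - (-math.log(2))) < 1e-4)  # True
```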


Some power series come to us in a slightly different form,

that of a shifted power series, something of the form sum over n

of a sub n times quantity x - c, to the n.

This of course is just a power series shifted over by c units.

In this case, if we compute the exact same radius of convergence,

then what we will find is that this shifted power series converges

when x-c is less than R in absolute value.

That is when x is within R of c.

R is thus a distance to the center, c, of the shifted power series.

For an example of a shifted power series,

consider the sum of quantity 3x - 2, to the n, over n times 4 to the n.

Now this is not in the form of x-c to the n, so

we're going to have to do a little bit of algebra to get it into that form.

We claim that by factoring a 3 to the n out of the numerator and

combining it with the 4 to the n in the denominator,

we get the equivalent sum of 1 over n, times

three-quarters to the n, times quantity x minus two-thirds, to the n.

That means that this shifted power series is centered at x = two-thirds.

Now to determine the radius of convergence,

we isolate the coefficient a sub n as 1 over n times three-quarters to the n.

We compute the radius R as the limit, as n goes to infinity,

of a sub n over a sub n + 1.

That yields, after some algebra, the limit of n + 1 over n times

four-thirds, which is precisely four-thirds.

Therefore, the domain of absolute convergence is all x

within four-thirds of two-thirds the center.

Now, what happens at the boundary points?

Well, on the left, we have x = -two-thirds.

That is, two-thirds minus four-thirds.

Substituting in x = -two-thirds, we obtain,

after a little bit of cancellation, the alternating harmonic series.

That, of course, converges.

When we substitute in the right hand end point, that is,

x = two-thirds + four-thirds, or 2,

we obtain the harmonic series.

This, of course, diverges, so

that we have conditional convergence on the left, divergence on the right.
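The algebraic rewriting behind this example can also be verified term by term (a numerical sketch of my own, not from the lecture):

```python
# Check the factoring used for the shifted series: each term
# (3x - 2)**n / (n * 4**n) equals (1/n) * (3/4)**n * (x - 2/3)**n.
x, n = 1.7, 9  # arbitrary sample point and index
lhs = (3*x - 2)**n / (n * 4**n)
rhs = (1/n) * (3/4)**n * (x - 2/3)**n
print(abs(lhs - rhs) < 1e-9)  # True
```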