0:33

This means, just recall, that there is only one equivalence class, and all elements of this class are recurrent and aperiodic.

Then the following limits exist: the limit of p_ij(n), the probability to reach state j from state i in n steps, as n tends to infinity. These limits will be denoted by p_j^*.

An important point is that this limit does not depend on i. So it is not important at all from which state you start.

Moreover, the theorem states that these limits are strictly positive, and also that the sum of p_j^* over j from 1 to M is equal to one. So the vector p^* = (p_1^*, ..., p_M^*) is a probability distribution with M elements.

So, the main part of the theorem is the existence of this limit, and it turns out that the limits involved are very special numbers for an ergodic Markov chain.
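To make the statement concrete, here is a small numerical sketch; the 3-state transition matrix below is an invented example, not one from the lecture. The rows of P^n all approach one and the same positive vector (p_1^*, ..., p_M^*).

```python
import numpy as np

# Hypothetical 3-state ergodic transition matrix (rows sum to 1),
# chosen only to illustrate the theorem.
P = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4],
              [0.5, 0.3, 0.2]])

# P(n) = P^n for a large n.
Pn = np.linalg.matrix_power(P, 50)
print(Pn)
# Every row of P^50 is (approximately) the same vector (p_1*, p_2*, p_3*):
# the limit does not depend on the starting state i, the entries are
# strictly positive, and each row sums to 1.
```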

The meaning of these numbers is clear from the following corollary. The corollary has two parts. The first part states that if the numbers p_1^*, ..., p_M^* are obtained from this formula, which is boxed now, then the distribution p^* is a stationary distribution.

Â 2:47

That is, if you multiply p^* by the matrix P, then you get exactly p^*.

And the second part of this corollary is that if you take the probability that the Markov chain X_n is equal to j and take the limit as n tends to infinity, then you will get exactly p_j^*, and this limiting value does not depend on the initial distribution π^(0).

So, you see that p^* is a very special distribution for an ergodic Markov chain.

And let me first show why these corollaries hold. Basically, it is not important at all in these corollaries that the chain is ergodic; what is important is that the distribution p^* is obtained via this formula. In fact, there are examples where such formulas hold but the chains are not ergodic; this is also possible. So, once more, both corollaries hold only because this limit exists. Let me now prove them.

I will start with the first item. I should show that the vectors on the left-hand side and the right-hand side coincide. Let me take some i from 1 to M and show that element number i of the left-hand side and of the right-hand side coincide. What is written on the left-hand side is element number i of p^* P. Here we have a product of a vector and a matrix; therefore, it is the sum, over j from 1 to M, of p_j^* multiplied by p_ji.

Â 5:43

Now, I can substitute p_j^* = lim p_kj(n) and put the limit outside this sum, and we get the limit, as n tends to infinity, of the sum over j from 1 to M of p_kj(n) multiplied by p_ji. Here we have a product of two matrices: the first matrix is P(n) and the second matrix is P. You know that P(n), with brackets, is the same as P^n; therefore this product is the same as P^(n+1), and P^(n+1) is the same as P(n+1), with brackets. This is according to the theorem which we have already proven. And due to this representation, we get that here we have element number (k, i) of the matrix P(n+1). Here we can use the boxed formula once more, and we get that this is equal to p_i^*.

Therefore, since we started with element number i of the vector p^* P and finally got element number i of the vector p^*, these two vectors coincide, and so the first item is proven.
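As a quick numerical check of this first item (reusing an invented 3-state ergodic matrix as an assumption), one can take a row of P^n for large n as p^* and verify the stationarity condition p^* P = p^*:

```python
import numpy as np

# Invented ergodic transition matrix, used only for this sketch.
P = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4],
              [0.5, 0.3, 0.2]])

# Take p* as any row of P^n for large n, i.e. the limits from the theorem
# (by the theorem the rows all coincide in the limit).
p_star = np.linalg.matrix_power(P, 50)[0]

# First item of the corollary: p* is stationary, p* P = p*.
print(np.allclose(p_star @ P, p_star))  # True
```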

Now, let me prove the second part of this corollary.

According to our definitions, we shall show that the limit of π_j^(n), as n tends to infinity, is equal to p_j^*. And here the initial distribution π^(0) can be arbitrary: we can start with any distribution and will get the distribution p^* as the limiting distribution.

The probability π_j^(n) is equal to the sum, over k from 1 to M, of π_k^(0) multiplied by p_kj(n), so its limit as n tends to infinity is the limit of this sum. Here we use the fact, which was shown before, that the vector π^(n) is equal to the vector π^(0) multiplied by the matrix P(n), with brackets.

Once more, this fact was shown before. What to do now: we will exchange the limit and the sum. And since π_k^(0) does not depend on n, it can pass through the limit before the probability p_kj(n). This gives the following: the sum, over k from 1 to M, of π_k^(0) multiplied by the limit, as n tends to infinity, of p_kj(n).

Now let us return to the formula which is boxed, and we get that this limit is equal to p_j^*. You see that this value p_j^* does not depend on k at all. Therefore, we can put p_j^* outside the sum, and finally we get that the sum is equal to p_j^* multiplied by the sum, over k from 1 to M, of π_k^(0).

And since π_1^(0), ..., π_M^(0) form a probability distribution, the sum of all these elements is equal to one, and we get that this expression is equal to p_j^*.

So finally, we conclude that the limit of π_j^(n), under any choice of initial distribution, is equal to p_j^*. And this observation completes the proof.
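A numerical sketch of this second item (again with an invented 3-state ergodic matrix): whatever initial distribution π^(0) we pick, the distribution π^(0) P(n) approaches the same vector p^*.

```python
import numpy as np

# Invented ergodic transition matrix, an assumption for this sketch.
P = np.array([[0.1, 0.6, 0.3],
              [0.4, 0.2, 0.4],
              [0.5, 0.3, 0.2]])

# The limits p_j* from the theorem (any row of P^n for large n).
p_star = np.linalg.matrix_power(P, 50)[0]

# Second item: pi^(n) = pi^(0) P(n) tends to p* for ANY initial distribution.
for pi0 in (np.array([1.0, 0.0, 0.0]),
            np.array([0.0, 0.0, 1.0]),
            np.array([1/3, 1/3, 1/3])):
    pi_n = pi0 @ np.linalg.matrix_power(P, 50)
    print(np.allclose(pi_n, p_star))  # True for each starting distribution
```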

So once more, what is important in this corollary is that the values p_j^* are obtained as the limits of p_ij(n). All other things are not important at all: the chain X_t can be ergodic or not, and it also is not important that these values are positive; it can be any probability distribution. What is really important here is that this distribution is obtained as a limit of such elements.

Now I would like to provide an example of how one can use these corollaries in particular cases. Consider the following example: assume that our Markov chain has two states, the probability to go from 1 to 2 is equal to 0.8, to go back is 0.6, from 1 to 1 is 0.2, and from 2 to 2 is 0.4.

So this Markov chain has the following transition matrix: the first row is 0.2, 0.8 and the second row is 0.6, 0.4.

Okay. Now we conclude that this Markov chain is ergodic, because it consists of only one class, and all elements are recurrent and aperiodic. So the ergodic theorem implies that the limits of the corresponding n-step transition probabilities exist and all of them are positive, and also that both corollaries are fulfilled.

Okay, let me find these limits from the first corollary. Namely, if I denote the elements of the vector p^* by (a, b), I can find (a, b) from the system which appears if I write the stationarity condition p^* P = p^* in the following form: (a, b) multiplied by the matrix with rows 0.2, 0.8 and 0.6, 0.4 is equal to (a, b).

This is basically a system of two equations with two unknown variables a and b: 0.2 multiplied by a, plus 0.6 multiplied by b, is equal to a; and 0.8 multiplied by a, plus 0.4 multiplied by b, is equal to b.

If we solve this system, together with the normalization a + b = 1, we get that a is equal to 3/7 and b is equal to 4/7.
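For completeness, here is how this small system can be solved numerically; the code restates the lecture's example, adding only the normalization a + b = 1 as the extra equation.

```python
import numpy as np

# Transition matrix from the example in the lecture.
P = np.array([[0.2, 0.8],
              [0.6, 0.4]])

# Stationarity (a, b) P = (a, b) rewritten as (P^T - I) x = 0,
# stacked with the normalization row a + b = 1.
A = np.vstack([P.T - np.eye(2), np.ones(2)])
rhs = np.array([0.0, 0.0, 1.0])
a, b = np.linalg.lstsq(A, rhs, rcond=None)[0]
print(a, b)  # approximately 3/7 ≈ 0.4286 and 4/7 ≈ 0.5714
```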
