0:40

We will say that a process X_t is stationary, or strictly stationary, if all of its finite-dimensional distributions are invariant under shifts in time. That is, (X_{t_1 + h}, ..., X_{t_n + h}) is equal in distribution to (X_{t_1}, ..., X_{t_n}), and this equality should hold for all time moments t_1, ..., t_n and for any h larger than zero.

Another kind of stationarity, the so-called weak stationarity, is the following. We will say that X_t is weakly stationary if, first of all, the mathematical expectation m(t) does not depend on t; it is a constant. And, secondly, the covariance function K(t, s), recall that this is the covariance between X_t and X_s, depends only on the difference between t and s. We can express this mathematical condition in two ways. First, we can say that K(t, s) = K(t + h, s + h) for all t and s and for any h larger than zero. Or, in other words, we can say that there exists some function gamma, called the autocovariance function, a function from R to R, such that K(t, s) = gamma(t - s).

Weak stationarity has a lot of different names.

For instance, the same kind of stationarity is sometimes called second-order stationarity or wide-sense stationarity. There is also the notion of covariance stationarity. All these notions are the same.

So we either speak of strict stationarity and discuss the properties of the complete finite-dimensional distributions, or we speak only about the mathematical expectation and the covariance function. I will discuss the relation between these two kinds of stationarity a bit later. But now, let me give a couple of properties of this function gamma, because it will play an essential role in this lecture. So which properties does this autocovariance function have? Let me list some of them.

Â 4:08

First of all, let me mention that gamma at zero is non-negative. In fact, gamma(0) is equal to the covariance between X_t and X_t for any time moment t, and this is exactly the variance of X_t. The variance is non-negative, and therefore gamma(0) is also non-negative.

Secondly, let me mention that the absolute value of gamma(t) is less than or equal to gamma(0). That is, the function gamma can take negative values, but it cannot take values that are too large in absolute value: |gamma(t)| is in any case smaller than or equal to gamma(0).

This is also a very simple fact, because the covariance between X_t and X_0 (let me write zero here) is in any case smaller than or equal to the square root of the variance of X_t multiplied by the square root of the variance of X_0. This is just the Cauchy–Schwarz inequality. And since the variance of X_t is equal to gamma(0), this bound is the same as the square root of gamma(0) multiplied by the square root of gamma(0), and therefore it equals gamma(0). So this property is also proven.

And the third property which I would like to mention now is that the function gamma is even.

Â 6:11

This is nothing more than a property of the covariance function: gamma(t) is equal to the covariance between X_t and X_0, which is the same as the covariance between X_0 and X_t, and this is gamma at the time moment minus t. Therefore, the function gamma is even.

So once more, the main properties of the autocovariance function are the following: gamma(0) is non-negative; secondly, its absolute value at any t is smaller than or equal to gamma(0); and thirdly, this function is even.
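These three properties can be checked numerically. Below is a small sketch of mine (not from the lecture, assuming Python with NumPy): I simulate an MA(1) process, a standard weakly stationary example, and estimate its autocovariance with a hypothetical helper `sample_autocov`; the reference values gamma(0) = (1 + theta^2) sigma^2 and gamma(1) = theta sigma^2 follow from a direct covariance computation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: an MA(1) process X_t = Z_t + theta * Z_{t-1},
# which is weakly stationary with
#   gamma(0) = (1 + theta^2) * sigma^2,  gamma(±1) = theta * sigma^2,
#   gamma(h) = 0 for |h| >= 2.
theta, sigma2 = 0.6, 1.0
z = rng.normal(0.0, np.sqrt(sigma2), size=100_001)
x = z[1:] + theta * z[:-1]

def sample_autocov(x, h):
    """Estimate gamma(h) = Cov(X_t, X_{t+h}); works for negative h too."""
    h = abs(h)                      # evenness is built into the estimator
    xc = x - x.mean()
    return np.dot(xc[:len(x) - h], xc[h:]) / len(x)

g0, g1, g5 = sample_autocov(x, 0), sample_autocov(x, 1), sample_autocov(x, 5)

assert g0 >= 0                      # property 1: gamma(0) is a variance
assert abs(g1) <= g0 and abs(g5) <= g0   # property 2: |gamma(h)| <= gamma(0)
assert sample_autocov(x, -1) == sample_autocov(x, 1)   # property 3: evenness

print(round(g0, 2), round(g1, 2))   # close to 1.36 and 0.6
```

The estimator divides by len(x) rather than len(x) - h, a common convention that keeps the estimated autocovariance sequence non-negative definite.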

Let me now provide some examples of stationarity and weak stationarity and discuss the relation between these two kinds of stationarity. First, let me assume that the process X_t has a finite second moment. You know that this assumption is very common in the context of stochastic processes, because whenever you write a covariance function, and you need to write it in many situations, you implicitly assume that this assumption is fulfilled.

And in this case, it turns out that if the process X_t is strictly stationary, then it is also weakly stationary.

Â 7:51

I guess this statement is intuitively very clear, because if X_t has finite-dimensional distributions which are invariant under shifts in time, then, of course, these weaker conditions should also be fulfilled. I will not prove it, because intuitively it should be so.

And the second statement: for Gaussian processes, the notions of strict and weak stationarity are the same. So a Gaussian process X_t is strictly stationary if and only if X_t is weakly stationary. I think this statement is also intuitively clear if you recall that, in the case of Gaussian processes, the mathematical expectation and the covariance function completely determine the distribution. So, please keep these statements in mind.

And now, I will proceed with examples. The first example is the so-called white noise process.

Â 9:15

This process is defined for integer t; sometimes one assumes only non-negative values, so 0, 1, 2, 3, and so on, but sometimes one can also consider negative values: plus/minus 1, plus/minus 2, plus/minus 3. So, X_t is drawn from a fixed distribution such that X_t and X_s are uncorrelated if t is not equal to s. That is, the mathematical expectation of X_t is equal to a constant, and it is normally assumed that this constant is equal to zero. And the variance of X_t is also a constant, sigma squared. In this case, it is very common to denote the white noise process as WN(0, sigma squared).

Okay. The covariance between X_t and X_s is equal to zero if t is not equal to s. This means that the covariance function can be represented as sigma squared multiplied by the indicator that t is equal to s, and therefore this covariance function can be represented as an autocovariance function evaluated at the point t minus s. The autocovariance function in this case is gamma(u) = sigma squared multiplied by the indicator that u is equal to zero. So, we conclude that this process is weakly stationary. In fact, the mathematical expectation is a constant, which is equal to zero, and the covariance has exactly this autocovariance form.

In the general case, this process is not strictly stationary, but there are some particular cases where it is. One particular case is when X_1, X_2, and so on are in fact independent identically distributed random variables. In this case, the process X_t is known as i.i.d. noise. No doubt, i.i.d. noise is also strictly stationary. Another particular case is when the process X_t is a Gaussian process.
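As a quick sanity check (my own sketch, not part of the lecture), one can simulate i.i.d. noise, the simplest instance of WN(0, sigma squared), and verify that the sample mean is near zero while the sample autocovariance is near sigma squared times the indicator of zero lag:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2 = 2.0

# i.i.d. Gaussian noise is a particular case of white noise WN(0, sigma^2):
# mean zero, variance sigma^2, zero covariance at distinct times.
x = rng.normal(0.0, np.sqrt(sigma2), size=200_000)

def sample_autocov(x, h):
    xc = x - x.mean()
    return np.dot(xc[:len(x) - h], xc[h:]) / len(x)

mean_hat = x.mean()              # should be close to 0
g0_hat = sample_autocov(x, 0)    # should be close to sigma^2 = 2
g1_hat = sample_autocov(x, 1)    # should be close to 0
g7_hat = sample_autocov(x, 7)    # should be close to 0

print(round(mean_hat, 2), round(g0_hat, 2), round(g1_hat, 2), round(g7_hat, 2))
```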

Â 12:24

Well, let me now provide some other examples. The second example is a random walk. We discussed this example when we started the theory of Markov chains, and it was one of our first examples of a Markov chain. Just recall that a process S_n is a random walk if it is equal to S_{n-1} + psi_n, where psi_1, psi_2, and so on are a sequence of independent identically distributed random variables taking the values 1 and minus 1: 1 with probability p, and minus 1 with probability 1 minus p. It is also assumed that S_0 is equal to zero. No doubt, we can also rewrite this definition in the following form: S_n = psi_1 + ... + psi_n. The mathematical expectation of S_n is equal to n multiplied by the mathematical expectation of psi_1, and it is equal to n multiplied by (2p minus 1).

Therefore, if p is not equal to one half, then the mathematical expectation of S_n depends on n, and in this case we get that the process S_n is not stationary in the weak sense. But you know that if a process is strictly stationary, then it is weakly stationary. And from here, by simple logic, one concludes that if it is not weakly stationary, then it is also not strictly stationary. Therefore, if p is not equal to one half, the process is neither strictly nor weakly stationary.
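A short simulation (a sketch of mine, assuming NumPy) illustrates the drift: for p not equal to 1/2, the sample mean of S_n tracks n(2p - 1) and clearly depends on n:

```python
import numpy as np

rng = np.random.default_rng(2)
p = 0.7                              # any p != 1/2 would do
n_paths = 50_000

# Steps psi_i are +1 with probability p and -1 with probability 1 - p.
steps = np.where(rng.random((n_paths, 40)) < p, 1, -1)
S = steps.cumsum(axis=1)             # S_n = psi_1 + ... + psi_n, S_0 = 0

# E[S_n] = n * (2p - 1) depends on n, so for p != 1/2 the walk is not
# weakly stationary (and therefore not strictly stationary either).
mean_10 = S[:, 9].mean()             # theory: 10 * 0.4 = 4
mean_40 = S[:, 39].mean()            # theory: 40 * 0.4 = 16
print(round(mean_10, 1), round(mean_40, 1))
```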

So, to continue our consideration, we should concentrate on the case p equal to one half. In this case, the mathematical expectation of S_n is equal to zero. As for the covariance function, let me take two integer values n and m, and let me assume that n is larger than m. Then we have the covariance between S_m + psi_{m+1} + ... + psi_n and S_m. Since the covariance is linear, we get the covariance between S_m and S_m, plus the covariance between the sum psi_{m+1} + ... + psi_n and S_m. You know that S_m is the sum of psi_1 through psi_m, and here we have a sum of psis with indices larger than m, starting from m + 1. Since psi_1, psi_2, and so on are independent and identically distributed, this second covariance is equal to zero. As for the first term, it is equal to the variance of S_m, and the variance of S_m equals m multiplied by the variance of psi_1.

Â 16:05

So, you see that this covariance depends on m, that is, on the minimum of n and m. Therefore, what we have here is that the covariance function cannot be decomposed as some function of the argument n minus m. If you are not sure that this is so, I advise you to consider the time moments n + h and m + h, and then you will immediately realize that the minimum cannot be represented as a function of the difference between the arguments. So finally, we conclude that a random walk is neither weakly nor strictly stationary, both in the case when p is equal to one half and in the case when p is any other number between zero and 1.
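This minimum structure is easy to see in a simulation (again a sketch of mine, not from the lecture): with p = 1/2, the estimated Cov(S_n, S_m) matches min(n, m), and two pairs of time moments with the same lag n - m give different covariances:

```python
import numpy as np

rng = np.random.default_rng(3)
n_paths, n_steps = 200_000, 30

# Symmetric walk: p = 1/2, so E[S_n] = 0, Var(psi) = 1,
# and Cov(S_n, S_m) = min(n, m).
steps = np.where(rng.random((n_paths, n_steps)) < 0.5, 1, -1)
S = steps.cumsum(axis=1)

def cov(n, m):
    return np.mean(S[:, n - 1] * S[:, m - 1])   # means are zero here

# Same lag n - m = 5, but different covariances:
# min(10, 5) = 5 versus min(25, 20) = 20, so Cov is not a function of n - m.
print(round(cov(10, 5), 1), round(cov(25, 20), 1))   # close to 5 and 20
```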

Let me continue. Our third example will be Brownian motion.

Â 17:22

It is not stationary, simply because if you consider the variance of Brownian motion, you get that the variance is equal to t. This is because, by definition, B_t minus B_s has a normal distribution with mean zero and variance equal to t minus s. If you now substitute s equal to zero, you get B_0, which is zero, and t minus s is equal to t. Therefore the variance of B_t is equal to t. But if the process were weakly stationary, then the variance of B_t would be equal to the value of the function gamma at the point zero, and therefore it would not depend on t. You know that this is not possible, and therefore we conclude that Brownian motion is not weakly stationary.

We can also show the same fact in another way. For instance, we can consider the covariance function: it is equal to the minimum of the arguments. And as we discussed previously, the minimum cannot be represented as a function of the difference between the arguments. Let me now make this more precise. If you take K(t + h, s + h) and assume that t is larger than s, then what we have here is actually s + h. And if the process were weakly stationary, this would have to equal the minimum of t and s, which is s. Of course, the equality s + h = s is not possible for any positive h. Well, finally, we conclude that Brownian motion is not weakly stationary, and therefore it is also not strictly stationary. In this case, we can also employ the fact that for Gaussian processes strict stationarity and weak stationarity are exactly the same.
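To illustrate (a simulated sketch, not part of the lecture), one can build Brownian paths from independent normal increments and check that Var(B_t) grows like t, which rules out weak stationarity:

```python
import numpy as np

rng = np.random.default_rng(4)
n_paths, n_steps, dt = 100_000, 100, 0.01   # paths on the interval [0, 1]

# Standard Brownian motion: B_0 = 0 and independent N(0, dt) increments,
# so B_t - B_s ~ N(0, t - s).
dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = dB.cumsum(axis=1)

# Var(B_t) = t depends on t, but weak stationarity would force
# Var(B_t) = gamma(0), the same constant for every t.
var_quarter = B[:, 24].var()    # t = 0.25, theory: 0.25
var_one = B[:, 99].var()        # t = 1.00, theory: 1.00
print(round(var_quarter, 2), round(var_one, 2))
```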

Â