Hello, today we'll speak about Gaussian processes. In probability theory, the term Gaussian distribution is just another name for the normal distribution. You know that this is a very important distribution, applicable in many fields of stochastics. A Gaussian process is just an extension of these ideas to the case of stochastic processes. Before we speak about Gaussian processes, let me first introduce the notion of a Gaussian vector.

Let me briefly recall that in probability theory we say that a random variable xi has a normal distribution with parameters mu and sigma squared if it has an absolutely continuous distribution with density

    f(x) = 1 / (sqrt(2 pi) sigma) * exp( -(x - mu)^2 / (2 sigma^2) ).

Here sigma is some positive parameter and mu can be any real number. You may also know that this distribution has the characteristic function

    phi(u) = exp( i u mu - (1/2) u^2 sigma^2 ).

If you have ever thought about normal distributions in probability theory, I would like to ask you one rather intricate question. Assume that you have two random variables, X1 and X2, which both have a normal distribution, for instance the standard normal distribution with parameters zero and one, and you know that the correlation between them is equal to zero. So these two random variables are uncorrelated. The question I would like to ask is whether it is true that in this case these random variables are independent. The correct answer is no. In fact, one can characterize exactly the situations in which uncorrelatedness does imply independence. If you already know everything about this topic, you can just skip to the next section; if not, then please listen to this lecture.
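The classical counterexample behind this question can be checked numerically. The sketch below (a minimal illustration using numpy; the sample size and seed are arbitrary choices) builds two standard normal random variables that are uncorrelated but obviously dependent: Y = S * X, where S is an independent random sign.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# X is standard normal.
x = rng.standard_normal(n)

# S is an independent random sign: +1 or -1, each with probability 1/2.
s = rng.choice([-1.0, 1.0], size=n)

# Y = S * X is also standard normal (flipping the sign does not change
# the distribution), and Cov(X, Y) = E[S] * E[X^2] = 0 * 1 = 0,
# so X and Y are uncorrelated.
y = s * x

corr = np.corrcoef(x, y)[0, 1]
print(f"sample correlation: {corr:.4f}")  # close to 0

# Yet X and Y are clearly dependent: |Y| = |X| always.
print(np.allclose(np.abs(x), np.abs(y)))  # True
```

The point of the example is that X and Y are each normal and uncorrelated, but the pair (X, Y) is not a Gaussian vector, which is exactly why independence fails.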
In what follows, I would also like to slightly extend the definition of the normal distribution; this will be convenient both for this lecture and for some applications. We will say that a random variable which is almost surely equal to a constant, that is, equal to a constant with probability one, is also normal, but in this case we say that the variance is equal to zero. So once more: during this lecture we will say that a random variable has a normal distribution with parameters mu and sigma squared if either it has the density above (this corresponds to the case sigma > 0), or it is almost surely equal to a constant (in this case we say that the variance is equal to zero).

Let me now give the definition of a Gaussian vector. The definition is the following: we will say that a random vector X with components X1, ..., Xn is Gaussian if and only if any linear combination of its components has a normal distribution. That is, for all (lambda_1, ..., lambda_n) from R^n, the sum

    sum_{k=1}^{n} lambda_k X_k

has a normal distribution. Here it is very important to note that, according to our extended definition, constants also have a normal distribution. Indeed, you can take all lambdas equal to zero, and in this case the sum is equal to zero. If constants did not count as normally distributed, then no vector could be Gaussian at all. So this extension is quite important for the theory.
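The definition can be illustrated empirically: if X is a Gaussian vector with mean mu and covariance Sigma, then any linear combination lambda^T X is normal with mean lambda^T mu and variance lambda^T Sigma lambda. The sketch below (the particular mu, Sigma, and lambda are illustrative choices, not from the lecture) samples such a vector with numpy and compares the sample statistics of a linear combination against these theoretical values.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative mean vector and covariance matrix for a 3-dimensional
# Gaussian vector X = (X1, X2, X3).
mu = np.array([1.0, -2.0, 0.5])
sigma = np.array([[2.0, 0.6, 0.3],
                  [0.6, 1.0, 0.2],
                  [0.3, 0.2, 1.5]])

n = 200_000
x = rng.multivariate_normal(mu, sigma, size=n)  # shape (n, 3)

# Pick an arbitrary coefficient vector lambda and form lambda^T X.
lam = np.array([0.5, -1.0, 2.0])
z = x @ lam

# For a Gaussian vector, z is normal with mean lam @ mu = 3.5 and
# variance lam @ sigma @ lam = 6.7.
print(f"sample mean:     {z.mean():.3f}  (theory: {lam @ mu:.3f})")
print(f"sample variance: {z.var():.3f}  (theory: {lam @ sigma @ lam:.3f})")
```

Note that the converse check also matters: as the sign-flip example earlier shows, having normal marginals is not enough — every linear combination must be normal for the vector to be Gaussian.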