[MUSIC] To measure how far a random variable deviates from its expected value, we will use the variance. Let us return to the example about Bob, who plays a lottery: he bets $1 and can win $5. The lottery was defined as follows: with probability 0.1, Bob wins $5, and with the remaining probability 0.9, Bob loses $1. I will denote Bob's payout by XB. Recall that the expected value of XB equals negative 0.4.

Now, the question is: how far is the value of XB from its expected value? So we are interested in the difference between XB and the expected value of XB. As in the previous example, we can find the distribution of this random variable. To do so, we add a row to the table for XB minus the expected value of XB. For example, when XB takes the value 5, this difference takes the value 5 minus negative 0.4, which is 5.4. What should I write here? If XB takes the value negative 1, then the difference takes the value negative 0.6, because it is negative 1 minus negative 0.4. So we see that this difference is again a random variable.

We are interested in how far XB differs from its expected value on average, so it is natural to take the expected value of this difference. You can use the table to find this expected value; what is the answer? Indeed, the answer is 0. The difference can be either positive or negative, and when we multiply these values by their probabilities and sum them, we get 0, because the positive deviations exactly cancel out the negative deviations. So we want to make the positive and negative deviations have the same sign, to get rid of this cancellation. We can do this in different ways: for example, we could take the absolute value of the difference. But mathematically it is more convenient to use not the absolute value, but the square. So we are interested in the following quantity: the expected value of the square of the difference between XB and its expected value. This quantity is called the variance of XB, by definition.
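The calculation above can be sketched in a few lines of Python (the lecture itself gives no code; the variable names here are my own):

```python
# Bob's lottery: payout values and their probabilities (from the lecture).
values = [5, -1]     # possible payouts in dollars
probs = [0.1, 0.9]   # corresponding probabilities

# Expected value E[XB] = 5 * 0.1 + (-1) * 0.9 = -0.4
e_xb = sum(v * p for v, p in zip(values, probs))

# Deviations XB - E[XB]: 5.4 and -0.6
deviations = [v - e_xb for v in values]

# Expected deviation: the positive and negative parts cancel to 0
e_dev = sum(d * p for d, p in zip(deviations, probs))

print(e_xb)   # about -0.4 (up to floating-point rounding)
print(e_dev)  # about 0
```

This reproduces the point of the table: the raw deviations average out to zero, which is why we square them before taking the expectation.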
Let us find the variance of XB in this case. To do so, we have to find the squares of these deviations: we have 29.16 here and 0.36 here. Then we multiply these squares by the corresponding probabilities and sum. In this case, the answer is 3.24. You will probably have to use a calculator to check these calculations, but I believe they are correct. This number gives us what we want: it shows the average squared deviation of the value of XB from its expected value.

This is actually the same thing we discussed before, when we discussed the squared error. If we use the expected value of XB as a prediction of XB, then the variance is just the expected value of the squared error. It is, in a sense, the best possible squared error we can get.

Let us give a general definition. If we have some random variable X, its variance, usually denoted by Var X, is the expected value of the square of the difference between X and the expected value of X. We will use the variance to distinguish between random variables whose values are close to their expected values, so their range is narrow, and variables whose range is large in a probabilistic sense, meaning that on average they take values far from the expected value. [MUSIC]
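The general definition can be turned into a small helper for any discrete distribution; this is a sketch of my own, not code from the course:

```python
def variance(values, probs):
    """Var X = E[(X - E[X])^2] for a discrete random variable
    given by its values and their probabilities."""
    mean = sum(v * p for v, p in zip(values, probs))
    return sum((v - mean) ** 2 * p for v, p in zip(values, probs))

# Bob's lottery: squared deviations 29.16 and 0.36,
# weighted by 0.1 and 0.9 and summed.
var_xb = variance([5, -1], [0.1, 0.9])
print(var_xb)  # about 3.24
```

Running it confirms the hand calculation from the lecture: 0.1 * 29.16 + 0.9 * 0.36 = 3.24.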