Previously, we discussed that the expected value of a sum of two random variables is equal to the sum of their expected values. For a product, however, the analogous statement is not true in general. But if the two random variables are independent, then a very natural relation holds between the expected value of their product and the product of their expected values.

Let us consider two random variables, X and Y, and assume that X and Y are independent. Then the expected value of their product is equal to the product of their expected values: E(XY) = E(X) E(Y). Let us prove this.

Assume that X takes the values x_1, ..., x_m and Y takes the values y_1, ..., y_n. We also have to introduce the distributions of X and Y. So let us write P(X = x_i) = p_i and P(Y = y_j) = q_j. We have to use different letters here to distinguish the two distributions.

Now, if X and Y are independent of each other, then the probability that X = x_i and, at the same time, Y = y_j is equal to the product of the probabilities: P(X = x_i, Y = y_j) = P(X = x_i) P(Y = y_j) = p_i q_j. This is the joint probability distribution, and it is obtained in this way due to the independence of X and Y.

To find the expected value of the product, we have to take into account all possible values that X and Y can take, together with their probabilities. So the expected value of X times Y can be found in the following way: it is the sum over i from 1 to m and over j from 1 to n of the probability that X = x_i and, at the same time, Y = y_j, times the values of the corresponding random variables:

E(XY) = sum_{i=1}^{m} sum_{j=1}^{n} P(X = x_i, Y = y_j) x_i y_j.

We can rewrite this using the relation above, replacing the joint probability with p_i q_j:

E(XY) = sum_{i=1}^{m} sum_{j=1}^{n} p_i q_j x_i y_j.

We have to use a double sum here because we want to take into account all possible combinations of the x_i's and y_j's. Each such combination gives us a possible value of the product.
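As a quick numerical sketch of the double-sum formula, we can compute E(XY) directly from the joint distribution p_i q_j. The particular values and probabilities below are made-up examples, not taken from the lecture:

```python
# Values and probabilities of X and Y (made-up example distributions).
xs, ps = [1, 2, 3], [0.2, 0.5, 0.3]
ys, qs = [0, 10], [0.4, 0.6]

# E(XY) = sum_i sum_j p_i * q_j * x_i * y_j,
# using independence to write the joint probability as p_i * q_j.
e_xy = sum(p * q * x * y for x, p in zip(xs, ps) for y, q in zip(ys, qs))
print(e_xy)  # approximately 12.6
```

Note that the double loop visits every combination (x_i, y_j) exactly once, mirroring the double sum in the proof.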
So we have to take into account their probabilities and their values. But now we can separate this double sum into parts. Indeed, in this sum the factors p_i and x_i do not depend on the inner summation variable j; they are constants from the point of view of the inner sum. So we can move them out of the inner summation sign:

E(XY) = sum_{i=1}^{m} p_i x_i (sum_{j=1}^{n} q_j y_j).

Now we see that the inner sum does not depend on i, which means we can move it out of the outer summation as well. So we get a product of two sums:

E(XY) = (sum_{i=1}^{m} p_i x_i) (sum_{j=1}^{n} q_j y_j).

If we look at the first sum, we see that it is the expected value of X: indeed, we have the values of X multiplied by their probabilities, summed over all possible values of X. In the same way, the second sum is the expected value of Y. So we see that E(XY) is the product of the expected value of X and the expected value of Y. This finishes the proof.

We see that the crucial point in this proof is the fact that we can express the joint probability as a product of two probabilities, p_i q_j. This is what allows us to move the terms out of the inner sum and replace the double sum with a product of two sums. If this condition is violated, the proof does not work. We will use this fact in the future when we discuss the properties of the variance of a sum of two random variables.
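The factorization above, and the fact that it can fail without independence, can be sketched numerically. The distributions below are made-up examples; for the dependent case we take Y = X, so that E(XY) = E(X^2), which generally differs from E(X)^2:

```python
# Made-up example distributions for X and Y.
xs, ps = [1, 2, 3], [0.2, 0.5, 0.3]
ys, qs = [0, 10], [0.4, 0.6]

e_x = sum(p * x for x, p in zip(xs, ps))  # E(X)
e_y = sum(q * y for y, q in zip(ys, qs))  # E(Y)

# Double sum over the joint distribution p_i * q_j (independent case).
e_xy = sum(p * q * x * y for x, p in zip(xs, ps) for y, q in zip(ys, qs))
assert abs(e_xy - e_x * e_y) < 1e-9  # E(XY) = E(X) * E(Y)

# Dependent case: Y = X, so E(XY) = E(X^2).
e_x2 = sum(p * x * x for x, p in zip(xs, ps))
print(e_x2, e_x ** 2)  # these differ unless X is constant
```

Here E(X^2) is about 4.9 while E(X)^2 is about 4.41; the gap between them is exactly the variance of X, which connects to the upcoming discussion of the variance of a sum.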