[MUSIC] The joint distribution of two variables gives us information about the distribution of both of them. But what if we are interested in only one variable? For example, let us return to the example with two tosses of a fair coin, and let us think about this joint distribution. Assume that we are interested only in X: what is the probability that X = 0? To find it, we have to decompose this event into two events. Indeed, X can be equal to 0 while Z is equal to 0 as well; the probability of this event is one-fourth. But X can also be equal to 0 while Z is equal to 1, and the probability of this event is also one-fourth. So to find the probability of the event that X = 0, we have to find the sum of these values, and we get one-half, which is quite expected.

We can do the same for any joint distribution and get the so-called marginal distribution: the distribution of one variable on its own. Let us assume that we have a pair of random variables: X, which takes values x_1, ..., x_m, and Y, which takes values y_1, ..., y_n. We have a joint probability distribution, meaning that we know the probabilities of events of the form (X = x_i and, at the same time, Y = y_j), where i ranges from 1 to m and j ranges from 1 to n. Now, what if we want to find the probability that X = x_i? We can prove that this probability is equal to the sum of probabilities of this kind. Let me denote these probabilities by p_ij. Then the probability that X = x_i is equal to the sum p_i1 + ... + p_in, or, in different notation, the sum of p_ij over j from 1 to n.

Let us prove this statement. To do so, we decompose the event into a union of events of the following kind. If X = x_i, then Y can take any of the values y_1, ..., y_n. So the probability that X = x_i is equal to the probability of the union of the events (X = x_i and Y = y_1), and so on, up to the last event, (X = x_i and Y = y_n). Y has to be equal to one of these values, so at least one event of this kind will take place. We can also see that these events do not intersect each other; they are mutually exclusive, because Y cannot take two different values at the same time. So we can apply the rule that the probability of a union of mutually exclusive events is equal to the sum of their probabilities. Each of these probabilities is one of the probabilities that we denoted by p_ij, so we have the sum we expected. This finishes the proof.

Now, let us apply this statement to the distributions that we discussed before, and find the marginal distributions of the variables X, Y, and Z using these joint distributions. For example, for this distribution, to find the marginal distribution of X we have to find the sums of these values; it is equal to one-half here and one-half here. Likewise, to find the marginal distribution of Y, we have to sum these values, and we get one-half here and one-half here. This agrees with the previously discussed individual distributions of these variables. Now let us look at this table. We have different numbers here, but if we find the marginal distributions, we see that, for example, for the variable X we have the same marginal distribution: one-half here, which is equal to the sum of these two one-quarters, and one-half here in the same way. We can also find the distribution of Z; to do so, we have to find the sums of these values, so we have one-half here and one-half here. Now we see that while the marginal distributions are the same here and here, the joint distributions are different.
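To make this concrete, here is a minimal sketch in Python. The two tables below are illustrative stand-ins for the lecture's on-screen examples (two binary variables, each with one-half marginals); the marginal of X is obtained by summing each row of the joint table, and the marginal of Y by summing each column.

```python
def marginals(joint):
    """Given joint[i][j] = P(X = x_i, Y = y_j), return the marginal
    distributions of X (row sums) and Y (column sums)."""
    p_x = [sum(row) for row in joint]          # P(X = x_i) = sum over j of p_ij
    p_y = [sum(col) for col in zip(*joint)]    # P(Y = y_j) = sum over i of p_ij
    return p_x, p_y

# Two independent fair coins: every cell of the joint table is 1/4.
independent = [[0.25, 0.25],
               [0.25, 0.25]]

# A different joint distribution with the SAME marginals:
# here the two variables always agree.
correlated = [[0.5, 0.0],
              [0.0, 0.5]]

print(marginals(independent))  # ([0.5, 0.5], [0.5, 0.5])
print(marginals(correlated))   # ([0.5, 0.5], [0.5, 0.5])
```

Both tables produce identical one-half marginals even though the joint distributions differ, which is exactly the point made above.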
So it means that, in general, if you know the marginal distributions, you cannot recover the joint distribution of two variables, because you do not know how they interact with each other. However, if you know that the two variables are independent of each other, meaning that learning the value of one variable tells you nothing new about the value of the other, then you can recover the joint distribution just by looking at these marginal distributions. We will discuss this later. [MUSIC]
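The lecture defers the details, but the standard construction under independence is the product rule: each joint entry is the product of the corresponding marginals, P(X = x_i, Y = y_j) = P(X = x_i) * P(Y = y_j). A minimal sketch, reusing the illustrative tables from above:

```python
def joint_from_marginals(p_x, p_y):
    """Build the joint table of two INDEPENDENT random variables
    from their marginal distributions."""
    return [[px * py for py in p_y] for px in p_x]

print(joint_from_marginals([0.5, 0.5], [0.5, 0.5]))
# [[0.25, 0.25], [0.25, 0.25]] -- this recovers the independent-coins
# table, but not the correlated one, even though both joints share the
# same marginals: independence is the extra information that pins the
# joint distribution down.
```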