[MUSIC] So far we have considered only random variables with a finite number of possible values. However, we may deal with phenomena in which the corresponding variable can take an infinite number of values. This is also possible to consider. Again, we begin with a simple example.

Let us return to our coin tossing experiment, but now assume that we toss the coin not a fixed number of times, but until we see the first head. So our experiment: toss a fair (for simplicity) coin until the first head. I toss the coin; if it is a head, the experiment is over; if it is a tail, I toss the coin again; if it is a head, it is over; if it is a tail, I continue, and so on until I get a head. Sooner or later I will get a head: an infinite sequence of tails occurs with probability zero, at least if my coin is fair. So I can introduce a random variable X, the number of tosses until the first head, including the toss that gives the head.

Let us find the distribution of this variable. To find the distribution, we first have to say which values this random variable can take. Clearly, X can take any positive integer value. Now let us find the corresponding probabilities. What is the probability that X equals 1? It means that we get a head on the first toss, and the probability of this event is 1/2. What is the probability that X equals 2? We can consider the following diagram. On the first toss we get either a head with probability 1/2 or a tail with probability 1/2. If it is a head, the experiment is over; if it is a tail, we make a second toss, and again we get a head or a tail, each with probability 1/2. Getting a head here means that X equals 2, because we have two tosses up to and including the first head. So the probability of this event is 1/2 × 1/2 = 1/4. We can continue this tree in the same way, and at every step the probability of the corresponding event equals one-half raised to the corresponding power.
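As a quick sanity check (this simulation is my own sketch, not part of the lecture; the function name tosses_until_first_head is invented for illustration), we can run the experiment many times and compare the empirical frequencies with the probabilities 1/2, 1/4, 1/8, ...:

```python
import random

def tosses_until_first_head(rng=random):
    """Toss a fair coin until the first head; return the number of tosses."""
    count = 0
    while True:
        count += 1
        if rng.random() < 0.5:  # head with probability 1/2
            return count

random.seed(0)
trials = 100_000
counts = {}
for _ in range(trials):
    k = tosses_until_first_head()
    counts[k] = counts.get(k, 0) + 1

# Empirical frequency vs. theoretical probability (1/2)^k
for k in range(1, 6):
    print(k, counts.get(k, 0) / trials, 0.5 ** k)
```

With 100,000 trials the empirical frequencies land close to 1/2, 1/4, 1/8, and so on, matching the tree diagram above.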
For example, for X = 3 the corresponding probability is 1/8, which is (1/2) to the power 3, and so on; for the value k we have probability (1/2) to the power k.

Now we have to check that these probabilities sum to one. This is rather easy, because they form a geometric progression: 1/2 + 1/4 + 1/8 + ... + (1/2)^k + ... = 1, which we can illustrate by the following picture. Take a square: one half of the square corresponds to 1/2, a quarter of it to 1/4, an eighth to 1/8, and so on. If at every step we divide the remaining part in two, we get the next term of the sum, and it follows immediately from this picture that the overall sum equals one. So we see that these numbers define a probability distribution, even though there are infinitely many of them.

Now let us give some definitions. If X is a discrete random variable, two possibilities can take place. Either X takes a finite number of distinct values — this is the case that we considered mostly during this lecture — and then the corresponding probability distribution is a finite set of values together with their probabilities. Or X takes not a finite but an infinite number of values; for discrete random variables, infinite means countable, so the possible values form an infinite sequence. In this case the distribution is given by a pair of infinite sequences: the values and their probabilities. In the finite case we need the sum p1 + p2 + ... + pn to equal 1; in the infinite case we need the same from the infinite sum of the probabilities, just like in the example with the geometric progression above. In both cases, pi should be greater than or equal to zero for every i.

Note that we can easily extend the definition of expected value to this case. Recall that in the finite case the expected value of a variable X is a finite sum of terms of the form value times probability.
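The square picture can also be checked numerically (a small sketch of my own, not from the lecture): after n terms, the partial sum of the geometric progression equals exactly 1 − (1/2)^n, so it approaches 1.

```python
from fractions import Fraction

# Partial sums of 1/2 + 1/4 + ... + (1/2)^n, computed exactly.
partial = Fraction(0)
for n in range(1, 11):
    partial += Fraction(1, 2 ** n)
    # After n terms, exactly (1/2)^n of the square remains uncovered.
    assert partial == 1 - Fraction(1, 2 ** n)

print(partial)  # 1023/1024
```

Exact fractions make the pattern visible: each new term halves the remaining gap to 1, which is precisely what the square picture shows.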
And here the formula is the same; we just replace the finite sum with an infinite sum. Note that, in contrast with finite sums, an infinite sum can fail to exist. So for random variables with an infinite number of values, it is possible that the expected value does not exist. Let us discuss an interesting example of a random variable that has no expected value. [MUSIC]
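For the coin-tossing variable above, the infinite sum does exist: E(X) is the sum of k · (1/2)^k over all positive integers k, which equals 2. A small numeric sketch of my own (not from the lecture) shows the truncated sums approaching this value:

```python
# Truncated sums of E(X) = sum over k of k * (1/2)^k for the
# "tosses until first head" variable; the full sum equals 2.
total = 0.0
for k in range(1, 60):
    total += k * 0.5 ** k

print(total)  # very close to 2
```

So on average it takes two tosses of a fair coin to see the first head. The next example shows that such an infinite sum is not always so well behaved.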