We are now ready to define the notion of a Hidden Markov Model. To do this, we will turn the dealer into a machine with k hidden states. This machine proceeds in a sequence of steps; in the case of coin flipping, the hidden states correspond to the fair and biased coins. In each step, the machine emits one of the symbols from the alphabet; in the case of coin flipping, the emitted symbols are of course heads or tails. While in a certain state, the machine makes two decisions. "Which symbol will I emit?" This is the first question it has to answer. "Which hidden state will I move to next?" This is the second question. And it uses probabilistic arguments to make both decisions. Now, why the word "hidden" in Hidden Markov Model? An observer can see the emitted symbols of an HMM but does not know which state the machine was in when it emitted each symbol. The goal is to infer the most likely sequence of hidden states of an HMM based on the sequence of emitted symbols. Formally, a Hidden Markov Model is defined by the following four objects. First, an alphabet of emitted symbols. Second, a set of hidden states. In the case of the crooked casino, the alphabet of emitted symbols is {heads, tails}, and the set of hidden states is {fair, biased}. Third, a matrix of transition probabilities, which describes the probability of moving from state l to state k. For our crooked casino example, this is the matrix shown on the slide: for instance, the probability of moving from state F to state F is 0.9, while the probability of moving from state F to state B is 0.1, since the probability of switching coins is 0.1. And finally, a Hidden Markov Model is described by a matrix of emission probabilities, which gives the probability of emitting symbol b when the HMM is in state k. For the crooked casino, this matrix is also shown on the slide.
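As a concrete illustration, the four objects defining the crooked casino HMM can be sketched as plain Python dictionaries. This is my own sketch, not code from the lecture; the F-row transition probabilities (0.9, 0.1) and the emission probabilities (0.5, 0.75) are the values stated above, while the B row is assumed to mirror the F row, as in the standard crooked casino example.

```python
# The four objects defining the crooked casino HMM (illustrative sketch).
alphabet = ["H", "T"]   # emitted symbols: heads or tails
states = ["F", "B"]     # hidden states: fair or biased coin

# transition[l][k] = probability of moving from state l to state k
# (B row assumed symmetric to the F row given in the lecture)
transition = {
    "F": {"F": 0.9, "B": 0.1},
    "B": {"F": 0.1, "B": 0.9},
}

# emission[k][b] = probability of emitting symbol b while in state k
emission = {
    "F": {"H": 0.5,  "T": 0.5},   # fair coin
    "B": {"H": 0.75, "T": 0.25},  # biased coin
}

# Sanity check: every row of both matrices sums to 1.
for row in list(transition.values()) + list(emission.values()):
    assert abs(sum(row.values()) - 1.0) < 1e-9
```

Representing the matrices as nested dictionaries keeps the lookup `transition[l][k]` close to the notation used in the lecture.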
The probability of emitting heads from the fair state is 0.5, while the probability of emitting heads from the biased state is 0.75. We will represent HMMs by HMM diagrams. In the case of the crooked casino, the HMM has two states, F and B. There are transitions between all pairs of these states, described by the transition probability matrix. This HMM emits two symbols, H and T, and the probabilities of emitting these symbols are described by the emission probability matrix. To study Hidden Markov Models, we will define the notion of a hidden path. A hidden path is the sequence of states that the HMM passes through, and we will denote it as pi = pi_1 pi_2 ... pi_n. We will also define Pr(x, pi), the probability that the HMM follows the hidden path pi and emits the string x. On the slide, you see an example of a string x and a hidden path pi; our goal is to compute Pr(x, pi). Note that if we sum Pr(x, pi) over all possible strings x and all possible paths pi, the result is of course 1. How many terms are in this sum? There are 2^n choices of x multiplied by 2^n choices of pi. To compute Pr(x, pi), we will need to introduce one more notion, the conditional probability Pr(x|pi) that the HMM emits the string x after following the hidden path pi. Please note that in this case, the sum of Pr(x|pi) over all possible emitted strings x is 1. Notice the difference between these two sums: the sum on the top has 2^n times 2^n terms, while the sum on the bottom has only 2^n terms, corresponding to all possible emitted strings x. We are now ready to compute the probability that the HMM follows the hidden path pi and emits the string x.
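Both normalization sums can be checked by brute force for a small n. The sketch below is my own code, using the crooked casino's matrices (with the B transition row assumed symmetric to the F row) and a uniform 0.5 initial distribution: it enumerates all 2^n strings x and all 2^n paths pi, confirms that Pr(x, pi) sums to 1, and confirms that for one fixed path pi, Pr(x|pi) also sums to 1.

```python
from itertools import product

transition = {"F": {"F": 0.9, "B": 0.1}, "B": {"F": 0.1, "B": 0.9}}
emission = {"F": {"H": 0.5, "T": 0.5}, "B": {"H": 0.75, "T": 0.25}}

def pr_path(pi):
    """Pr(pi): 0.5 for the initial state, then transition probabilities."""
    p = 0.5
    for a, b in zip(pi, pi[1:]):
        p *= transition[a][b]
    return p

def pr_x_given_pi(x, pi):
    """Pr(x | pi): product of emission probabilities along the path."""
    p = 1.0
    for sym, st in zip(x, pi):
        p *= emission[st][sym]
    return p

n = 3
all_paths = ["".join(p) for p in product("FB", repeat=n)]
all_strings = ["".join(s) for s in product("HT", repeat=n)]

# Top sum: Pr(x, pi) over all 2^n strings and all 2^n paths equals 1.
total = sum(pr_path(pi) * pr_x_given_pi(x, pi)
            for pi in all_paths for x in all_strings)
print(round(total, 10))  # 1.0

# Bottom sum: for a fixed path pi, Pr(x | pi) over all 2^n strings equals 1.
fixed = sum(pr_x_given_pi(x, "FFB") for x in all_strings)
print(round(fixed, 10))  # 1.0
```

With n = 3, the top sum runs over 2^3 * 2^3 = 64 terms and the bottom sum over only 2^3 = 8 terms, mirroring the difference between the two sums described above.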
To compute it, we need to define Pr(x_i | pi_i), the probability that symbol x_i was emitted from the state pi_i. This probability is of course equal to the entry of the emission probability matrix corresponding to emitting symbol x_i from the state pi_i. Let's compute these probabilities. For the first column, we have the fair state emitting tails, which is of course 1/2. Then the fair state emitting heads, which is again 1/2. Then the fair state emitting tails, 1/2. And now the biased state emitting heads, which is 3/4. Continuing like this, we will find all of these probabilities. The other thing we need in order to find Pr(x, pi) is the probability of transitioning from the state pi_i to the state pi_{i+1}, which is simply the corresponding entry of the transition probability matrix. Let's start computing those probabilities. At the first step, we transition from the fair state to the fair state, which has probability 0.9. At the next step, again from the fair state to the fair state, again 0.9. But at the next step, we transition from the fair state to the biased state, which has probability 0.1. Continuing in this way, we will compute all of these values. We forgot about just one value: the probability that our HMM was in the fair or biased state at the very first moment. Let's assume that at the initial moment the HMM was equally likely to be in the fair state or the biased state; that is why we have the term 0.5. After we fill in this row, we know that Pr(pi), the probability that the HMM follows the path pi, is simply the product of the corresponding transition probabilities (including the initial 0.5 term). And after we have computed Pr(pi) and Pr(x|pi), we can compute Pr(x, pi), which is simply the product of all of these small probabilities, equal to Pr(x|pi) multiplied by Pr(pi).
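The computation above can be sketched in a few lines of Python (my own code, not from the lecture). It multiplies the initial 0.5 term, the transition probabilities, and the emission probabilities, and applies them to the first four steps worked out above: the string THTH emitted along the path FFFB (the B transition row is again assumed symmetric to the F row).

```python
transition = {"F": {"F": 0.9, "B": 0.1}, "B": {"F": 0.1, "B": 0.9}}
emission = {"F": {"H": 0.5, "T": 0.5}, "B": {"H": 0.75, "T": 0.25}}

def pr_x_and_pi(x, pi, initial=0.5):
    """Pr(x, pi) = Pr(pi) * Pr(x | pi)."""
    # Pr(pi): the initial 0.5 term times the transition probabilities.
    pr_pi = initial
    for a, b in zip(pi, pi[1:]):
        pr_pi *= transition[a][b]
    # Pr(x | pi): product of the emission probabilities Pr(x_i | pi_i).
    pr_x = 1.0
    for sym, st in zip(x, pi):
        pr_x *= emission[st][sym]
    return pr_pi * pr_x

# First four steps from the lecture: x = THTH emitted along pi = FFFB.
# Pr(pi) = 0.5 * 0.9 * 0.9 * 0.1 and Pr(x|pi) = (1/2)^3 * (3/4).
print(pr_x_and_pi("THTH", "FFFB"))
```

Multiplying these out by hand gives 0.0405 * 0.09375 = 0.003796875, which the function reproduces (up to floating-point rounding).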
So the problem of computing Pr(x, pi) has been reduced to two problems. The first problem is computing Pr(pi), the probability of a hidden path. The second problem is computing the probability of an emitted string given a hidden path, which is the problem of computing Pr(x|pi). And we now know how to solve both of these problems.