We are now ready to find the most likely hidden path in an HMM, which corresponds to the Decoding Problem, asking us to find an optimal hidden path. The input is an emitted string x, along with an HMM defined by four objects: the alphabet, the set of states, the matrix of transition probabilities, and the matrix of emission probabilities. The output is a hidden path π that maximizes the probability Pr(x, π). As we saw, Pr(x, π) is equal to the product of Pr(x|π) and Pr(π), and it is a product of n terms, each term being the product of an emission probability and a transition probability, as described in the slide.

To solve the Decoding Problem, we will build a Manhattan graph in such a way that every path in this graph corresponds to a hidden path in the HMM, and vice versa: every hidden path in the HMM corresponds to a path in the built Manhattan. For the coin-flipping HMM with this HMM diagram, we will construct a Manhattan consisting of two rows and n columns, where n is the number of emitted symbols. We will also add source and sink nodes at the beginning and the end of the resulting directed acyclic graph. For the hidden path FBBBFF through this HMM, we will construct the following path through the Manhattan, shown right here. And you see that for every sequence of hidden states, we can construct the corresponding path in this graph.

For a slightly more complex HMM consisting of three states, our Manhattan will be built of three rows and n columns. Generally, if we have an HMM with k states, there will be k rows in this Manhattan. As before, we will add source and sink nodes. Note that, in contrast with the alignment Manhattan, where there were just three edges entering every node, in the decoding Manhattan there are many more: every edge that goes to the east, from a node in one column to any node in the next column, appears in this newly built Manhattan.
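To make the product of n terms concrete, here is a minimal sketch that computes Pr(x, π) term by term for a crooked-casino-style HMM with states F (fair) and B (biased). The biased-coin numbers (0.75 for heads, 0.9 to stay in B) appear later in the lecture; the fair-coin emissions, the remaining transitions, and the initial probability of 0.5 are assumptions for illustration, not values from the slides.

```python
# Assumed toy parameters for a crooked-casino HMM; only the B-state
# values (0.75 heads, 0.9 stay) are stated in the lecture.
transition = {
    ('F', 'F'): 0.9, ('F', 'B'): 0.1,
    ('B', 'F'): 0.1, ('B', 'B'): 0.9,
}
emission = {
    ('F', 'H'): 0.5,  ('F', 'T'): 0.5,
    ('B', 'H'): 0.75, ('B', 'T'): 0.25,
}

def joint_probability(x, pi, initial=0.5):
    """Pr(x, pi) = Pr(pi) * Pr(x | pi), computed as a product of n terms,
    each term an emission probability times a transition probability."""
    # First term: initial transition (source -> pi[0]) times first emission.
    p = initial * emission[(pi[0], x[0])]
    for i in range(1, len(x)):
        p *= transition[(pi[i - 1], pi[i])] * emission[(pi[i], x[i])]
    return p

print(joint_probability("HHTHHH", "FBBBFF"))
```

Each loop iteration contributes exactly one emission-times-transition factor, matching the n-term product on the slide.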
We now need to define the weights of edges in this newly built Manhattan graph for an HMM. Consider an edge that starts at the node corresponding to state l in column i−1 and connects it to the node corresponding to state k in column i. We will refer to this edge as (l, k, i−1). This edge corresponds to a transition from state l to state k, with transition probability transition(l, k), and it also corresponds to emitting the symbol x_i from state k. Therefore, we define the weight of this edge as weight(l, k, i−1), equal to the product of the emission probability of emitting symbol x_i from state k and the transition probability of moving from state l to state k.

Let's see how we would assign edge weights for the Crooked Casino. In this case, as before, the weight of the edge (l, k, i−1) equals the product of the corresponding emission and transition probabilities. What about this edge? The weight of this bold edge from state B to state B is equal to the emission probability of emitting symbol H from state B multiplied by the transition probability of moving from B to B, which is 0.75 × 0.9. We continue further, assigning probabilities to all edges in a similar fashion. As a result, we have constructed a graph in which the product weight of a hidden path, defined simply as the product of the weights of its individual edges, is shown on this slide. For example, the product weight of this path is simply equal to Pr(x, π). Therefore, we have constructed a graph in which the product weight of every path equals Pr(x, π), exactly what we are interested in. And therefore, computing Pr(x, π) and finding the hidden path π maximizing Pr(x, π) amounts simply to finding the path with maximum product weight in this graph.
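The edge-weight definition and the product-weight view of decoding can be sketched as follows. Only the bold-edge numbers (0.75 for heads from B, 0.9 for B→B) come from the lecture; the rest of the parameterization, the initial probability of 0.5, and the brute-force search are illustrative assumptions (the course later replaces the exhaustive search with the Viterbi algorithm).

```python
from itertools import product

# Assumed toy parameters; only the B-state values are stated in the lecture.
STATES = ('F', 'B')
transition = {
    ('F', 'F'): 0.9, ('F', 'B'): 0.1,
    ('B', 'F'): 0.1, ('B', 'B'): 0.9,
}
emission = {
    ('F', 'H'): 0.5,  ('F', 'T'): 0.5,
    ('B', 'H'): 0.75, ('B', 'T'): 0.25,
}

def edge_weight(l, k, xi):
    """Weight of the edge from state l in column i-1 to state k in column i:
    emission probability of x_i from k times transition probability l -> k."""
    return emission[(k, xi)] * transition[(l, k)]

def product_weight(x, pi, initial=0.5):
    """Product of edge weights along the path pi; equals Pr(x, pi)."""
    w = initial * emission[(pi[0], x[0])]  # edge out of the source node
    for i in range(1, len(x)):
        w *= edge_weight(pi[i - 1], pi[i], x[i])
    return w

def decode_brute_force(x):
    """The Decoding Problem as path search: return the hidden path with
    maximum product weight (exhaustive over all k^n paths, for tiny n)."""
    paths = (''.join(p) for p in product(STATES, repeat=len(x)))
    return max(paths, key=lambda pi: product_weight(x, pi))

print(decode_brute_force("HHHHTH"))
```

The bold edge from B to B on an emitted H has weight `edge_weight('B', 'B', 'H')`, i.e. 0.75 × 0.9 = 0.675, matching the slide.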