Definition 2.6 is the definition of a Markov chain. For random variables X_1, X_2, ..., X_n, where n is greater than or equal to 3, we say X_1, X_2, ..., X_n forms a Markov chain in this order if p(x_1, x_2, ..., x_n) = p(x_1, x_2) p(x_3|x_2) ... p(x_n|x_{n-1}) whenever p(x_2) p(x_3) ... p(x_{n-1}) > 0. Note that if this product is not positive, then the conditional probabilities above are not defined; in that case we require instead that p(x_1, x_2, ..., x_n) = 0. Here's a remark: X_1, X_2, X_3 forms a Markov chain in this order if and only if X_1 is independent of X_3 given X_2.

Proposition 2.7 says that X_1, X_2, ..., X_n forms a Markov chain in this order if and only if X_n, X_{n-1}, ..., X_1 forms a Markov chain in this order. Basically, it says that if you have a Markov chain in a certain order and you reverse the order, then it is still a Markov chain. The proof of this proposition is left as an exercise.

Proposition 2.8 says the following: X_1, X_2, ..., X_n forms a Markov chain if and only if the following sequence of Markov chains holds. First, we have the Markov chain X_1, X_2, X_3. Then the Markov chain (X_1, X_2), X_3, X_4, where X_1 and X_2 are treated as a pair. And so on, all the way to the last Markov chain (X_1, X_2, ..., X_{n-2}), X_{n-1}, X_n, where X_1 through X_{n-2} are treated as a group of random variables. The proof of this proposition is left as an exercise.

Proposition 2.9 says that X_1, X_2, ..., X_n forms a Markov chain if and only if p(x_1, x_2, ..., x_n) can be factorized as f_1(x_1, x_2) f_2(x_2, x_3) ... f_{n-1}(x_{n-1}, x_n) for all x_1, x_2, ..., x_n such that p(x_2) p(x_3) ... p(x_{n-1}) > 0. Notice that Proposition 2.9 is a generalization of Proposition 2.5 that we have seen before.

Proposition 2.10 is about Markov subchains. This proposition looks a little complicated, but in fact the idea is very simple.
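Definition 2.6 and the remark can be checked numerically. The following is a minimal sketch, not from the lecture: it builds the joint pmf of a small binary chain X_1 -> X_2 -> X_3 from a made-up initial distribution and made-up transition matrices, then verifies both the factorization in Definition 2.6 and the conditional independence of X_1 and X_3 given X_2. All names here (random_dist, marg, T12, T23) are illustrative choices, not standard notation.

```python
import itertools
import random

random.seed(0)

def random_dist(k):
    """A random probability vector of length k (illustrative helper)."""
    w = [random.random() for _ in range(k)]
    s = sum(w)
    return [x / s for x in w]

p1 = random_dist(2)                       # distribution of X_1
T12 = [random_dist(2) for _ in range(2)]  # T12[a][b] = p(x_2=b | x_1=a)
T23 = [random_dist(2) for _ in range(2)]  # T23[b][c] = p(x_3=c | x_2=b)

# Joint pmf p(x1, x2, x3) = p(x1) p(x2|x1) p(x3|x2), so the chain
# X_1 -> X_2 -> X_3 is Markov by construction.
p = {(a, b, c): p1[a] * T12[a][b] * T23[b][c]
     for a, b, c in itertools.product(range(2), repeat=3)}

def marg(p, keep):
    """Marginalize the joint pmf onto the coordinates in `keep`."""
    out = {}
    for x, v in p.items():
        key = tuple(x[i] for i in keep)
        out[key] = out.get(key, 0.0) + v
    return out

p12 = marg(p, (0, 1))
p23 = marg(p, (1, 2))
p2 = marg(p, (1,))

for (a, b, c), v in p.items():
    # Definition 2.6: p(x1,x2,x3) = p(x1,x2) p(x3|x2) wherever p(x2) > 0
    assert abs(v - p12[(a, b)] * p23[(b, c)] / p2[(b,)]) < 1e-12
    # The remark: p(x1 | x2, x3) = p(x1 | x2), i.e. X_1 indep. of X_3 given X_2
    assert abs(v / p23[(b, c)] - p12[(a, b)] / p2[(b,)]) < 1e-12

print("Definition 2.6 and the conditional-independence remark verified")
```

The same loop with the roles of X_1 and X_3 swapped would verify Proposition 2.7, since both factorizations reduce to p(x_1,x_2) p(x_2,x_3) / p(x_2).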
Let N_n be the index set containing the integers 1, 2, ..., n, and let X_1, X_2, ..., X_n form a Markov chain. For any subset alpha of the index set, denote the collection of random variables (X_i : i in alpha) by X_alpha. Then for any disjoint subsets alpha_1, alpha_2, ..., alpha_m of the index set such that k_1 < k_2 < ... < k_m for all k_j in alpha_j, j = 1, 2, ..., m, we have the Markov chain X_{alpha_1}, X_{alpha_2}, ..., X_{alpha_m}. That is, a subchain of X_1, X_2, ..., X_n is also a Markov chain. We'll leave the proof of this proposition as an exercise. The condition that these disjoint subsets alpha_1, alpha_2, ..., alpha_m need to satisfy is illustrated in this figure: whenever it holds, X_{alpha_1}, X_{alpha_2}, ..., X_{alpha_m} forms a Markov chain.
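Proposition 2.10 can also be checked numerically on a small case. The sketch below, with made-up distributions and illustrative names, builds a binary Markov chain X_1 -> X_2 -> X_3 -> X_4 and verifies that the subchain given by alpha_1 = {1}, alpha_2 = {2}, alpha_3 = {4} (singletons, which satisfy the ordering condition k_1 < k_2 < k_3), namely X_1, X_2, X_4, is itself a Markov chain under Definition 2.6.

```python
import itertools
import random

random.seed(1)

def random_dist(k):
    """A random probability vector of length k (illustrative helper)."""
    w = [random.random() for _ in range(k)]
    s = sum(w)
    return [x / s for x in w]

p1 = random_dist(2)
# T[j][a][b] = p(x_{j+2} = b | x_{j+1} = a), three made-up transition matrices
T = [[random_dist(2) for _ in range(2)] for _ in range(3)]

# Joint pmf p(x1,x2,x3,x4) = p(x1) p(x2|x1) p(x3|x2) p(x4|x3)
joint = {x: p1[x[0]] * T[0][x[0]][x[1]] * T[1][x[1]][x[2]] * T[2][x[2]][x[3]]
         for x in itertools.product(range(2), repeat=4)}

def marg(p, keep):
    """Marginalize the joint pmf onto the coordinates in `keep`."""
    out = {}
    for x, v in p.items():
        key = tuple(x[i] for i in keep)
        out[key] = out.get(key, 0.0) + v
    return out

q = marg(joint, (0, 1, 3))   # joint pmf of the subchain (X_1, X_2, X_4)
q12 = marg(joint, (0, 1))
q24 = marg(joint, (1, 3))
q2 = marg(joint, (1,))

# Definition 2.6 applied to the subchain:
# p(x1,x2,x4) = p(x1,x2) p(x4|x2) wherever p(x2) > 0
for (a, b, d), v in q.items():
    assert abs(v - q12[(a, b)] * q24[(b, d)] / q2[(b,)]) < 1e-12

print("subchain (X_1, X_2, X_4) is a Markov chain")
```

Intuitively this works because conditioning on X_2 already separates X_1 from everything downstream, so summing out X_3 cannot break the chain structure.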