We are only one step away from being able to construct the I-measure μ*. Before we do so, we need two lemmas, each of which is an identity: one in set theory, and the other in information theory.

First, Lemma 3.7, the set identity, says that

μ(A ∩ B − C) = μ(A ∪ C) + μ(B ∪ C) − μ(A ∪ B ∪ C) − μ(C).

Then Lemma 3.8, the information-theoretic identity, says that

I(X;Y|Z) = H(X,Z) + H(Y,Z) − H(X,Y,Z) − H(Z).

Note that these two lemmas are related to each other through the substitution of symbols: A corresponds to X, B corresponds to Y, C corresponds to Z, and so on.

First we prove Lemma 3.7. We write

μ(A ∩ B − C) = μ(A − C) + μ(B − C) − μ(A ∪ B − C).

To see this, cover up "− C" throughout: the identity reduces to inclusion-exclusion, μ(A ∩ B) = μ(A) + μ(B) − μ(A ∪ B). Now μ(A − C) can be written as μ(A ∪ C) − μ(C), μ(B − C) can be written as μ(B ∪ C) − μ(C), and μ(A ∪ B − C) can be written as μ(A ∪ B ∪ C) − μ(C). Substituting, the +μ(C) contributed by the last term cancels one of the two −μ(C) terms, and we are left with μ(A ∪ C) + μ(B ∪ C) − μ(A ∪ B ∪ C) − μ(C), as claimed.

Now we prove Lemma 3.8. By definition, I(X;Y|Z) = H(X|Z) − H(X|Y,Z), where H(X|Z) = H(X,Z) − H(Z) and H(X|Y,Z) = H(X,Y,Z) − H(Y,Z). Substituting and rearranging the terms, we obtain H(X,Z) + H(Y,Z) − H(X,Y,Z) − H(Z).

We are now ready to construct the I-measure μ* on F_n. First, recall from Theorem 3.6 that a signed measure μ on F_n is completely specified by the values of μ on the unions of the set variables, and that this set of values can be any set of real numbers. We define μ* by setting μ*(X̃_G) = H(X_G) for all nonempty subsets G of the index set N.
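As a quick numerical sanity check (not part of the lecture itself), both lemmas can be verified in a few lines of Python. The counting measure and the random joint pmf below are illustrative choices, not anything fixed by the lecture.

```python
import itertools
import math
import random

random.seed(0)

# Lemma 3.7, checked for the counting measure mu(S) = |S| on random finite sets:
# mu(A ∩ B - C) = mu(A ∪ C) + mu(B ∪ C) - mu(A ∪ B ∪ C) - mu(C).
for _ in range(100):
    A = {x for x in range(10) if random.random() < 0.5}
    B = {x for x in range(10) if random.random() < 0.5}
    C = {x for x in range(10) if random.random() < 0.5}
    assert len((A & B) - C) == len(A | C) + len(B | C) - len(A | B | C) - len(C)

# Lemma 3.8, checked for a random joint pmf p(x, y, z) on three binary variables:
# I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z).
p = {xyz: random.random() for xyz in itertools.product(range(2), repeat=3)}
total = sum(p.values())
p = {k: v / total for k, v in p.items()}

def marg(keep):
    """Marginal pmf on the coordinate positions listed in `keep`."""
    m = {}
    for xyz, pr in p.items():
        key = tuple(xyz[i] for i in keep)
        m[key] = m.get(key, 0.0) + pr
    return m

def H(keep):
    """Joint entropy (in bits) of the marginal on the coordinates in `keep`."""
    return -sum(pr * math.log2(pr) for pr in marg(keep).values() if pr > 0)

# I(X;Y|Z) computed term by term from the pmf ...
pz, pxz, pyz = marg([2]), marg([0, 2]), marg([1, 2])
i_direct = sum(pr * math.log2(pz[(z,)] * pr / (pxz[(x, z)] * pyz[(y, z)]))
               for (x, y, z), pr in p.items())
# ... agrees with the right-hand side of Lemma 3.8:
assert abs(i_direct - (H([0, 2]) + H([1, 2]) - H([0, 1, 2]) - H([2]))) < 1e-9
```

The first loop exercises the set identity on an arbitrary additive measure; the second computes I(X;Y|Z) from its defining sum and confirms it matches the four-entropy expression.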
That is, the value of μ* on the union of a collection G of set variables is equal to the joint entropy of the corresponding collection of random variables. Now, μ* is meaningful only if it is consistent with all Shannon's information measures via the substitution of symbols. That is, the following must hold for all (not necessarily disjoint) subsets G, G', and G'' of the index set N, where G and G' are nonempty:

μ*(X̃_G ∩ X̃_G' − X̃_G'') = I(X_G ; X_G' | X_G'').

This condition says that μ* is consistent with the conditional mutual informations. When G'' is the empty set, it reduces to μ*(X̃_G ∩ X̃_G') = I(X_G ; X_G'); that is, μ* is consistent with the mutual informations. When G = G', we have X̃_G ∩ X̃_G' = X̃_G, so the condition becomes μ*(X̃_G − X̃_G'') = H(X_G | X_G''); that is, μ* is consistent with the conditional entropies. Finally, when G = G' and G'' is the empty set, we have μ*(X̃_G) = H(X_G); that is, μ* is consistent with the entropies.

Now we state Theorem 3.9, the most fundamental theorem in the theory of the I-measure, which says the following: μ* is the unique signed measure on the field F_n that is consistent with all Shannon's information measures. The implications of Theorem 3.9 are the following. Because of this theorem, we can formally regard Shannon's information measures for n random variables as the unique signed measure μ* defined on the field F_n. A consequence is that we can employ set-theoretic tools to manipulate expressions involving Shannon's information measures. We now prove Theorem 3.9.
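To make the construction concrete, here is a small illustrative sketch for n = 2, assuming binary random variables with a randomly chosen joint pmf (both assumptions are mine, purely for the example): μ* is specified on the unions by joint entropies, its values on other sets follow by additivity, and the results agree with H(X1|X2) and I(X1;X2) computed directly from the pmf.

```python
import itertools
import math
import random

random.seed(0)
# A generic joint pmf for two binary random variables (X1, X2).
p = {xy: random.random() for xy in itertools.product(range(2), repeat=2)}
total = sum(p.values())
p = {k: v / total for k, v in p.items()}

def H(keep):
    """Joint entropy (in bits) of the marginal on the coordinates in `keep`."""
    m = {}
    for xy, pr in p.items():
        key = tuple(xy[i] for i in keep)
        m[key] = m.get(key, 0.0) + pr
    return -sum(pr * math.log2(pr) for pr in m.values() if pr > 0)

# mu* is specified by its values on the unions of the set variables:
mu_u1, mu_u2, mu_u12 = H([0]), H([1]), H([0, 1])  # mu*(X~1), mu*(X~2), mu*(X~1 ∪ X~2)

# Values on other sets then follow by set additivity of the measure:
mu_1_minus_2 = mu_u12 - mu_u2        # mu*(X~1 - X~2)
mu_1_cap_2 = mu_u1 + mu_u2 - mu_u12  # mu*(X~1 ∩ X~2)

# Consistency: these match H(X1|X2) and I(X1;X2) computed directly from the pmf.
p1 = {a: sum(pr for (x, y), pr in p.items() if x == a) for a in range(2)}
p2 = {b: sum(pr for (x, y), pr in p.items() if y == b) for b in range(2)}
h_cond = -sum(pr * math.log2(pr / p2[y]) for (x, y), pr in p.items())
i_mut = sum(pr * math.log2(pr / (p1[x] * p2[y])) for (x, y), pr in p.items())
assert abs(mu_1_minus_2 - h_cond) < 1e-9
assert abs(mu_1_cap_2 - i_mut) < 1e-9
```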
First, recall that the I-measure μ* we have constructed is defined by μ*(X̃_G) = H(X_G) for all nonempty subsets G of the index set N. In order for any signed measure μ to be consistent with all Shannon's information measures, it must in particular be consistent with all the entropies; that is, μ(X̃_G) = H(X_G) for all nonempty subsets G of N. In other words, μ must be equal to μ*, and therefore μ* is the unique signed measure that can be consistent with all Shannon's information measures.

It remains to show that μ* is indeed consistent with all Shannon's information measures. This is shown by considering, for all subsets G, G', and G'' of N with G and G' nonempty, the quantity μ*(X̃_G ∩ X̃_G' − X̃_G''). By Lemma 3.7, this can be written as

μ*(X̃_{G ∪ G''}) + μ*(X̃_{G' ∪ G''}) − μ*(X̃_{G ∪ G' ∪ G''}) − μ*(X̃_G'').

By the definition of μ*, this equals

H(X_{G ∪ G''}) + H(X_{G' ∪ G''}) − H(X_{G ∪ G' ∪ G''}) − H(X_G'').

And then by Lemma 3.8, this is simply I(X_G ; X_G' | X_G''), the mutual information between X_G and X_G' conditioned on X_G''. This proves Theorem 3.9.
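The chain of equalities in the proof can also be checked exhaustively for a small case. The sketch below, for n = 3 binary random variables with an arbitrary strictly positive joint pmf (an illustrative assumption, not part of the proof), evaluates μ*(X̃_G ∩ X̃_G' − X̃_G'') via Lemma 3.7 and the definition of μ*, and compares it against I(X_G ; X_G' | X_G'') computed directly from the pmf, over all admissible triples (G, G', G'').

```python
import itertools
import math
import random

random.seed(1)
# A strictly positive joint pmf on three binary random variables (X1, X2, X3).
p = {w: random.random() + 0.01 for w in itertools.product(range(2), repeat=3)}
total = sum(p.values())
p = {k: v / total for k, v in p.items()}

def marg(G):
    """Marginal pmf on the coordinate positions in the (possibly empty) set G."""
    m = {}
    for w, pr in p.items():
        key = tuple(w[i] for i in sorted(G))
        m[key] = m.get(key, 0.0) + pr
    return m

def H(G):
    """Joint entropy H(X_G) in bits; by convention, H of the empty set is 0."""
    return -sum(pr * math.log2(pr) for pr in marg(G).values() if pr > 0)

def I(G, Gp, Gpp):
    """I(X_G; X_G' | X_G'') computed outcome by outcome from the pmf."""
    mA, mB = marg(G | Gpp), marg(Gp | Gpp)
    mZ, mJ = marg(Gpp), marg(G | Gp | Gpp)
    def proj(w, S):
        return tuple(w[i] for i in sorted(S))
    return sum(pr * math.log2(mZ[proj(w, Gpp)] * mJ[proj(w, G | Gp | Gpp)]
                              / (mA[proj(w, G | Gpp)] * mB[proj(w, Gp | Gpp)]))
               for w, pr in p.items())

N = (0, 1, 2)
nonempty = [set(c) for r in (1, 2, 3) for c in itertools.combinations(N, r)]
for G in nonempty:
    for Gp in nonempty:
        for Gpp in [set()] + nonempty:
            # mu*(X~_G ∩ X~_G' - X~_G'') via Lemma 3.7 and the definition of mu*:
            mu_star = H(G | Gpp) + H(Gp | Gpp) - H(G | Gp | Gpp) - H(Gpp)
            assert abs(mu_star - I(G, Gp, Gpp)) < 1e-9
```

The loop covers overlapping G and G' as well as empty G'', so it exercises the special cases (mutual information, conditional entropy, entropy) discussed above in one pass.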