In this lecture we will discover several examples of Hermitian operators. Our first example will be an operator of this kind: choose some unit vector φ in our state space, and in the bra-ket notation write ket φ followed by bra φ, that is |φ⟩⟨φ|. It looks like the scalar product, but with the bra and ket in reverse order, so it is not a number like a scalar product. It is an operator. Let's see how it works. Consider the action of this operator on some vector ψ from our state space. If we write the operator and then the vector ψ, as we usually do when we apply an operator to a vector, we can see that this expression is just ket φ multiplied by the scalar product of φ and ψ: |φ⟩⟨φ|ψ⟩ = ⟨φ|ψ⟩ |φ⟩. The first thing that immediately follows is that when we apply this operator to any vector, we always obtain a vector collinear to φ. Operators of this kind are called projection operators, since they project the whole space onto just one vector. The matrix representation of such a projection operator is a column multiplied by a row, which, as we remember, gives us the square matrix of the operator. Now, if we consider a set of mutually orthogonal unit vectors φᵢ, then for each vector φᵢ we can construct a projection Pᵢ = |φᵢ⟩⟨φᵢ|. The sum of these projections is also a projection: it projects the whole space onto the subspace generated by the set φᵢ. And if the set φᵢ forms an orthonormal basis in the considered space, then the sum of all projections Pᵢ becomes the projection of the space onto itself, and thus it is the identity operator. This relation, which associates the sum of projections over an orthonormal basis with the identity, is called the closure relation. It is very useful for proving theorems, since it allows us to substitute the identity by a sum of projections at any place in any formula involving operators. So what are the solutions of the eigenvalue equation for a projection?
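The outer-product construction and the closure relation described above can be checked numerically. Here is a minimal sketch in Python with NumPy; the particular vectors and the 3-dimensional space are illustrative choices, not taken from the slides:

```python
import numpy as np

# A unit vector phi in a 3-dimensional state space (illustrative choice)
phi = np.array([1, 1, 1], dtype=complex) / np.sqrt(3)

# The projection |phi><phi|: a column multiplied by a row gives a square matrix
P = np.outer(phi, phi.conj())

# Applying P to any vector psi yields <phi|psi> * phi, a vector collinear to phi
psi = np.array([1, 2, 3], dtype=complex)
assert np.allclose(P @ psi, np.vdot(phi, psi) * phi)

# Closure relation: summing the projections over an orthonormal basis
# (here the standard basis) reproduces the identity operator
basis = np.eye(3, dtype=complex)
closure = sum(np.outer(v, v.conj()) for v in basis.T)
assert np.allclose(closure, np.eye(3))
```

The `np.outer` call is exactly the "column times row" construction from the lecture.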
First we may notice that for the vector φ itself, the projection onto φ acts as the identity, so φ is an eigenvector with eigenvalue one. And for any vector orthogonal to φ, the projection gives zero, which means that the second eigenvalue of this operator is zero. Its eigensubspace has dimensionality N minus one, where N is the dimensionality of the whole space. You can analyze any sum of different projections in the same way: the more linearly independent projections you take, the higher the degree of degeneracy of the eigenvalue one, and the lower the degree of degeneracy of zero. The identity is also a sum of projections and has a single eigenvalue one, which is N-fold degenerate. Now let's consider the operator called X. You can see its matrix on the slide. This operator is very useful and important in quantum computing, since it is the quantum analogue of the classical NOT gate. If you apply X to the vector |0⟩, you obtain |1⟩, and vice versa. So what are the eigenvalues and eigenvectors of this operator? Well, since it flips |1⟩ and |0⟩, it is quite obvious that it preserves the states which contain both these vectors in the same proportion. The state |+⟩ is an eigenvector with eigenvalue 1, and the state |−⟩ is an eigenvector with eigenvalue −1. So we can see that X is not just an important operator; it is also a good observable, since it does not have degenerate eigenvalues. X is a 2-by-2 matrix, which means that it acts in a two-dimensional space. The two-dimensional space is the state space of one qubit, and the qubit, as we remember, can be represented as a point on the Bloch sphere. So let's see how X acts geometrically on the Bloch sphere. The vectors |+⟩ and |−⟩ are untouched by its action. The vector |0⟩ goes to |1⟩, and |1⟩ to |0⟩. And the circular states also flip: |0⟩ + i|1⟩ goes to |0⟩ − i|1⟩, and vice versa. So it looks like the operator X is just the rotation around the x axis by the angle π. And so it is indeed.
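These properties of X can be verified directly with NumPy; a small sketch (the variable names are mine, and the circular states swap only up to a global phase, which does not change the physical state):

```python
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)   # the quantum NOT gate
ket0 = np.array([1, 0], dtype=complex)
ket1 = np.array([0, 1], dtype=complex)

# X flips |0> and |1>
assert np.allclose(X @ ket0, ket1)
assert np.allclose(X @ ket1, ket0)

# |+> and |-> are eigenvectors with eigenvalues +1 and -1
plus  = (ket0 + ket1) / np.sqrt(2)
minus = (ket0 - ket1) / np.sqrt(2)
assert np.allclose(X @ plus, plus)
assert np.allclose(X @ minus, -minus)

# The circular states |0> + i|1> and |0> - i|1> swap under X,
# up to a global phase factor of i
right = (ket0 + 1j * ket1) / np.sqrt(2)
left  = (ket0 - 1j * ket1) / np.sqrt(2)
assert np.allclose(X @ right, 1j * left)
```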
The operator X is the rotation around the x axis. It looks like the name of the operator was not chosen at random. And we have two more axes, Y and Z. Maybe there are rotation operators for them too. Yes indeed: meet the operators Y and Z. The operator Y rotates the space around the y axis by the angle π, and Z rotates the space around the z axis, also by the angle π. You can guess their eigenvalues and eigenvectors yourselves; let it be a small exercise. These three matrices, X, Y, and Z, are called the Pauli matrices, after the Austrian theoretical physicist Wolfgang Pauli. They have many wonderful properties which we will not consider here. The important thing for us at this point is that each Pauli matrix is an observable with all of its eigenvalues nondegenerate. Good, let's continue. The operator you can see on the slide is called the Hadamard transform, after the French mathematician Jacques Hadamard. As you can guess, it transforms the basis |0⟩, |1⟩ into the basis |+⟩, |−⟩, and vice versa. I literally don't know any quantum algorithm which does not employ this operator. So what about its eigenvalues and eigenvectors? Well, to find this out, you can choose one of two ways. The first one is to solve the characteristic equation for the eigenvalues, and then a system of two linear equations with one unknown parameter. Sounds exciting, but don't hurry to do that, especially since I did not explain what a characteristic equation is. In this picture you can see the geometrical interpretation of the Hadamard transform's action. It maps |0⟩ to |+⟩, and |1⟩ to |−⟩. It is easy to see that both these actions are actually a reflection across the direction painted in red. So the vector collinear to this direction is not modified by the Hadamard transform. This vector makes the angle π/8 with the vector |0⟩, so we can easily compute its components: cos(π/8) and sin(π/8). And its eigenvalue is 1.
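The claim that each Pauli matrix is an observable with nondegenerate eigenvalues can also be checked numerically; a short sketch with NumPy:

```python
import numpy as np

# The three Pauli matrices
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

for M in (X, Y, Z):
    # Each Pauli matrix is Hermitian, hence an observable
    assert np.allclose(M, M.conj().T)
    # Each has the two distinct eigenvalues -1 and +1, so nothing is degenerate
    assert np.allclose(np.sort(np.linalg.eigvalsh(M)), [-1, 1])
```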
And we are now experienced enough to expect that the second eigenvector is orthogonal to the first one. Let's choose the one with components sin(π/8) and −cos(π/8). The Hadamard transform maps it to itself multiplied by −1, so we can conclude that the second eigenvalue is −1. No eigenvalues of the Hadamard transform are degenerate; thus, the Hadamard transform also represents a complete set of commuting observables.
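The two Hadamard eigenvectors found above can be verified in a few lines of NumPy:

```python
import numpy as np

# The Hadamard transform
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# The vector at angle pi/8 from |0> lies on the reflection axis,
# so H leaves it unchanged: eigenvalue +1
v1 = np.array([np.cos(np.pi / 8), np.sin(np.pi / 8)], dtype=complex)
assert np.allclose(H @ v1, v1)

# The orthogonal vector picks up a factor of -1: eigenvalue -1
v2 = np.array([np.sin(np.pi / 8), -np.cos(np.pi / 8)], dtype=complex)
assert np.allclose(H @ v2, -v2)
```

Note the minus sign in the second component of v2: without it, the vector would not be orthogonal to v1 and would not be an eigenvector.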