In the previous videos, we talked about linear functionals. A linear functional is a thing which maps vectors to scalars, and we discovered that any linear functional on the space H can be obtained from the vectors of that space by an operation called Hermitian conjugation. This means that for any linear functional, we can find a vector Phi whose Hermitian conjugate defines this functional. We agreed to represent this functional as a row vector with conjugated components.

Now imagine that we have a functional defined by some vector Phi, and I'm going to define another functional based on this bra Phi and some linear operator A. My new functional, which I'm going to denote bra Phi A, will act like this: for any vector Psi, we first apply the operator A to it, and then the bra Phi. This new thing is indeed a linear functional: it takes vectors as its input and returns scalars. You can easily check yourselves that its action is linear. But since it is a linear functional, there must be some vector Phi A whose Hermitian conjugate defines it, and this vector is determined by the initial vector Phi and the operator A. This operation, the construction of a row vector from another row vector with the help of some operator, is called the action of the operator on the left. So to any bra, which means any linear functional from the conjugate space, we can apply this operator A on the left to construct another bra.

In the matrix representation, this operation is defined much more simply. It is again the multiplication of two matrices: the row vector which represents our initial functional bra Phi, and the matrix of the operator A. When you multiply a row by a matrix, you obtain another row, and this is what happens here. The action of A on the left is a matrix multiplication where A stands on the right and its argument stands on the left of it. When this bra Phi A acts on some vector Psi, in the matrix representation we can write it like this.
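The "action on the left" just described can be sketched in a few lines of plain Python (no external libraries; the helper names `bra` and `row_times_matrix` are my own, not anything from the lecture): the bra is a row of conjugated components, and multiplying that row by the matrix of A gives another row, which is the new bra.

```python
def bra(phi):
    """Hermitian conjugate of a ket: a row with conjugated components."""
    return [z.conjugate() for z in phi]

def row_times_matrix(row, A):
    """Multiply a row vector by a matrix; the result is another row."""
    return [sum(row[k] * A[k][j] for k in range(len(row)))
            for j in range(len(A[0]))]

phi = [1 + 1j, 2j]          # a ket |phi>
A = [[0, 1],                # some operator A as a 2x2 matrix
     [1j, 0]]

# The new bra <phi|A: A stands on the right, its argument on the left.
bra_phi_A = row_times_matrix(bra(phi), A)
```

Note that A never acts on the column of components of Phi here; it acts on the already conjugated row, which is exactly what "on the left" means.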
Row Phi multiplied by matrix A and then multiplied by column Psi. Since it doesn't matter in which order we perform these operations, we are going to rewrite it in Dirac's notation as this. This thing here is called the matrix element of A corresponding to the vectors Phi and Psi. This name comes from a simple observation: if instead of Phi and Psi you take vectors of the orthonormal basis, columns with all zeros except a one in place i or j, for example, then this expression will give you the component of the matrix A in row i and column j. So, to compute this, we can go two ways. First, apply A on the left to Phi, and then compute the scalar product of the result with Psi. Or you can apply it on the right, to Psi, and compute the scalar product of Phi and A Psi. Both ways will give you the same answer, so we are not going to place any parentheses anywhere in this type of expression.

Now, we remember that each bra corresponds to some ket, so this bra Phi A also corresponds to some ket Phi A. Is there a way to obtain this ket Phi A from the ket Phi? Again, if we reflect on this question a bit, we'll notice that the ket Phi A is uniquely defined by the ket Phi and the operator A. In other words, we have just defined a way of constructing one vector from another: the vector Phi A from the vector Phi. But a way of constructing one vector from another is an operator. Remember, operators transform vectors to vectors, and this is exactly what we just did. So there must be some operator, A star, which transforms vectors just the way we did it with Phi and Phi A. I call it A star to stress the fact that this operator is somehow connected with the operator A. The operator A star is called the adjoint operator of the operator A.

What can we say about the operator A star? First, we can easily prove that it's linear. Second, it is defined by the operator A alone: there must be a way of constructing A star from A for any linear operator A. The most interesting part is this.
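The two claims about the matrix element can be checked numerically. Here is a minimal sketch in plain Python (all helper names are mine): applying A on the left or on the right gives the same number, and with basis vectors in the two slots the expression picks out a single component of the matrix.

```python
def bra(v):
    """Row with conjugated components."""
    return [z.conjugate() for z in v]

def mat_vec(A, x):
    """A acting on the right: A|psi> as a column."""
    return [sum(A[i][k] * x[k] for k in range(len(x))) for i in range(len(A))]

def vec_mat(row, A):
    """A bra acting on the left: <phi|A as a row."""
    return [sum(row[k] * A[k][j] for k in range(len(row)))
            for j in range(len(A[0]))]

def dot(row, col):
    """Pairing of a bra (row) with a ket (column)."""
    return sum(r * c for r, c in zip(row, col))

A = [[1, 2j], [3, 4]]
phi, psi = [1j, 2], [1, 1j]

left = dot(vec_mat(bra(phi), A), psi)    # (<phi|A) applied to |psi>
right = dot(bra(phi), mat_vec(A, psi))   # <phi| applied to (A|psi>)
assert left == right                     # both orders agree, no parentheses needed

e0, e1 = [1, 0], [0, 1]                  # orthonormal basis vectors
assert dot(vec_mat(bra(e0), A), e1) == A[0][1]   # picks out row 0, column 1
```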
Since the ket Phi A is the Hermitian conjugate of the bra Phi A, we can write it like this: the Hermitian conjugate of A acting on the left of bra Phi is A star acting on the right of Phi. This allows us to define the operation of Hermitian conjugation on operators. The operation of Hermitian conjugation: first, it transforms complex numbers to their conjugates. Second, it transforms kets to bras, which means vectors to linear functionals. Third, it transforms bras to kets, which means linear functionals to vectors. Fourth, it transforms operators to their adjoints. It is very convenient for us to notice that when we perform the operation of Hermitian conjugation on some expression, we have to do two simple things: first, substitute each element in the expression by its Hermitian conjugate, and second, rewrite all the elements in the reverse order. The second rule is not strict for the scalars; it is usual to write them at the beginning of the expression.

Now that we know how the adjoint operator is defined, it will be good for us to discover how to obtain its matrix. Imagine we have some operator A and we have its matrix. In the previous episode, when we considered the action of A on some vector x, I told you that you can imagine it as the separate action of linear functionals represented by the rows of the matrix A. Since we remember that Hermitian conjugation transforms row vectors to column vectors and conjugates the components, this is exactly what we should expect from the Hermitian conjugation of a matrix: its rows become columns with conjugated components. For any matrix A, its adjoint matrix A star is just the transposed matrix A with its components conjugated. It is as simple as that.

Now, when you know all that, you may notice that some operators, as well as some numbers, are neutral with respect to this Hermitian conjugation. The numbers are neutral to it if they are real, with zero imaginary part. The operators are neutral
if, after transposition and conjugation of all the components, they stay the same operators. Operators of this type, which are neutral with respect to Hermitian conjugation, are called self-adjoint, or Hermitian, operators. They are very important in quantum mechanics because they represent observables. We already talked about observables, and I told you that an observable is defined by an orthonormal basis in the state space. Now I tell you that an observable is a Hermitian operator. How's that? Let's see in the next episode.
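To make these matrix rules concrete, here is a small sketch in plain Python (no external libraries; all helper names are my own): it builds the adjoint as the conjugated transpose, checks the reverse-order rule for a product, and tests whether a matrix is Hermitian, i.e. neutral under conjugation.

```python
def adjoint(A):
    """Adjoint of a matrix: transpose it and conjugate every component."""
    return [[A[i][j].conjugate() for i in range(len(A))]
            for j in range(len(A[0]))]

def mat_mul(A, B):
    """Ordinary matrix multiplication."""
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def is_hermitian(A):
    """Neutral under Hermitian conjugation: A equals its own adjoint."""
    return A == adjoint(A)

A = [[1 + 1j, 2], [3j, 4]]
B = [[0, 1j], [1, 0]]

# Reverse-order rule: conjugating a product reverses the factors.
assert adjoint(mat_mul(A, B)) == mat_mul(adjoint(B), adjoint(A))

# A standard Hermitian example: real diagonal, conjugate-mirrored off-diagonal.
pauli_y = [[0, -1j], [1j, 0]]
assert is_hermitian(pauli_y)
assert not is_hermitian(A)   # A is not equal to its own adjoint
```

Checking `is_hermitian` by eye on `pauli_y`: transposing swaps the off-diagonal entries, and conjugating flips their signs back, so the matrix returns to itself.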