Hi everyone. This is Professor Yoong Jin-Yoon from KAIST. This is the first session of week 4 of Basic Math for AI Beginners, Part 1: Linear Algebra. In week 4, we are going to study linearly independent vectors and the inverse of a matrix, which are very important not only for engineering problems but also for artificial intelligence algorithms. Building on the concept of the linear combination of vectors from last week, let me introduce linearly independent vectors. Suppose we have vectors W_1, W_2, ..., W_p in a vector space R^n. We say that W_1, W_2, ..., W_{p-1}, W_p are linearly independent if none of these vectors can be written as a linear combination of the others. We just studied the meaning of a linear combination: if a certain vector, for example W_1, can be expressed in terms of the other vectors W_2, ..., W_p as W_1 = a_2 W_2 + ... + a_p W_p, then W_1 is a linear combination of those vectors. Then how do we check whether W_1, W_2, ..., W_p are linearly independent? First, we form the homogeneous system C_1 W_1 + C_2 W_2 + ... + C_{p-1} W_{p-1} + C_p W_p = 0. If C_1 = C_2 = ... = C_p = 0 is the only solution of this system, then the vectors are linearly independent. Why is that? There are two important concepts for understanding linearly independent vectors. The first is the linear combination. The second is the condition for a homogeneous linear algebraic system to have a unique solution, which is what we learned last week. Why does this work? It is a little bit complicated, so let me explain step by step.
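The check described above can be sketched in code. The following is my own minimal illustration using NumPy, not part of the lecture: the homogeneous system C_1 W_1 + ... + C_p W_p = 0 has only the trivial solution exactly when the matrix whose columns are W_1, ..., W_p has full column rank.

```python
import numpy as np

def are_linearly_independent(vectors):
    """Return True if the homogeneous system
    C_1 W_1 + ... + C_p W_p = 0 has only the trivial solution,
    i.e. the matrix with the W_i as columns has rank p."""
    A = np.column_stack(vectors)  # columns are W_1, ..., W_p
    return bool(np.linalg.matrix_rank(A) == len(vectors))
```

For p vectors in R^n with p > n, the rank can never reach p, so the function correctly reports that more than n vectors in R^n are always linearly dependent.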
The condition for W_1, ..., W_p to be linearly independent is this: form the homogeneous linear algebraic equation C_1 W_1 + C_2 W_2 + ... + C_p W_p = 0. If C_1 = C_2 = ... = C_p = 0 is the only solution we can find, then these vectors are linearly independent. Let me prove this. Suppose instead we can find a solution with, say, C_1 not equal to zero. Then, because C_1 is not zero, we can write W_1 = -(C_2/C_1) W_2 - (C_3/C_1) W_3 - ... - (C_p/C_1) W_p. This means W_1 is a linear combination of W_2, W_3, ..., W_p, so by the definition the vectors are not linearly independent. Let me explain this with an example. Are these three vectors linearly independent? The three vectors are (1, 1, -1), (2, 1, -1), and (1, 0, -1). Let's say W_1 = (1, 1, -1), W_2 = (2, 1, -1), and W_3 = (1, 0, -1). Then you can form C_1 W_1 + C_2 W_2 + C_3 W_3 = 0. This is the homogeneous system of linear algebraic equations. Here, we need to check whether the unique solution is C_1 = C_2 = C_3 = 0, because the unique solution of a homogeneous system is what? X = 0, that is, C_1 = C_2 = C_3 = 0. If that is the only solution, the vectors are linearly independent, which means no vector can be written as a linear combination of the other vectors. Let's write this homogeneous system in tabular form. The rows of the coefficient matrix are (1, 2, 1), (1, 1, 0), and (-1, -1, -1), and the right-hand side is (0, 0, 0). If C_1 = C_2 = C_3 = 0 is the only solution, then we can say those three vectors are linearly independent. Let's reduce this tabular form to an upper triangular matrix.
Then it becomes (1, 2, 1), (0, -1, -1), and (0, 0, -1), and the right-hand side is still (0, 0, 0). Here, we can see that the diagonal elements of the upper triangular matrix are 1, -1, -1, none of which is zero. It means we have a unique solution: by back substitution, C_3 = 0, then C_2 = 0, then C_1 = 0. In this case, the vectors are linearly independent, because the homogeneous linear algebraic equation has a unique solution, so we cannot express any vector as a linear combination of the other vectors. So we say the vectors are linearly independent. How about another case: (1, 1, 2), (2, 2, 4), (1, 2, 3)? Again, we can formulate the homogeneous equation C_1 W_1 + C_2 W_2 + C_3 W_3 = (0, 0, 0), where W_1 = (1, 1, 2), W_2 = (2, 2, 4), and W_3 = (1, 2, 3). Let's make the table from here. The rows of the coefficient matrix are (1, 2, 1), (1, 2, 2), and (2, 4, 3), and the right-hand side is (0, 0, 0). Once you do the row operations to make the upper triangular matrix, you can see that there is a zero in a diagonal element. In that case, this homogeneous linear algebraic equation has infinitely many solutions: C_1 = C_2 = C_3 = 0 is not the only solution, and we can find other values of C_1, C_2, C_3. For example, one possible non-trivial solution is C_1 = 2, C_2 = -1, C_3 = 0. In that case, we can express one vector in terms of another. If we plug these values of C_1, C_2, C_3 into the homogeneous equation, it becomes 2 (1, 1, 2) - (2, 2, 4) + 0 (1, 2, 3) = (0, 0, 0). So the second vector (2, 2, 4) can be expressed in terms of another vector, namely 2 (1, 1, 2). Those vectors are not linearly independent.
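Both worked examples above can be checked numerically. The sketch below is my own illustration, assuming the coefficient matrices written in the lecture: it runs forward elimination to upper triangular form, inspects the pivots, and verifies the non-trivial solution 2 W_1 - W_2 = 0 of the second example.

```python
import numpy as np

def forward_eliminate(A):
    """Reduce A to upper triangular form with row operations
    (no row swapping; fine for these small examples)."""
    U = A.astype(float)
    n = U.shape[0]
    for k in range(n):
        if U[k, k] == 0:  # zero pivot: stop, elimination breaks down here
            break
        for i in range(k + 1, n):
            U[i] -= (U[i, k] / U[k, k]) * U[k]
    return U

# Example 1: rows are (W1_i, W2_i, W3_i) for W1=(1,1,-1), W2=(2,1,-1), W3=(1,0,-1)
U1 = forward_eliminate(np.array([[1, 2, 1], [1, 1, 0], [-1, -1, -1]]))
print(np.diag(U1))  # all pivots nonzero -> only the trivial solution

# Example 2: W1=(1,1,2), W2=(2,2,4), W3=(1,2,3)
U2 = forward_eliminate(np.array([[1, 2, 1], [1, 2, 2], [2, 4, 3]]))
print(np.diag(U2))  # a zero pivot appears -> infinitely many solutions

# Non-trivial solution of example 2: C1 = 2, C2 = -1, C3 = 0
W1, W2, W3 = np.array([1, 1, 2]), np.array([2, 2, 4]), np.array([1, 2, 3])
print(2 * W1 - 1 * W2 + 0 * W3)  # the zero vector
```

The zero pivot in the second example is exactly the zero diagonal element the lecture points to: it is what turns the unique trivial solution into infinitely many solutions.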
Up to here, we have studied linearly independent vectors using the concepts of the homogeneous system of linear algebraic equations, solving that homogeneous system, and the linear combination. In the next session, you are going to study another concept that uses row operations: the inverse of a matrix. Thank you very much.