Let's tackle another common idea in linear algebra: the norm of a vector. Intuitively, you can think of the norm of a vector as its length. Take a general vector, say v, with entries v_1, v_2, up to v_n. The values don't matter for now; we're just giving it hypothetical values and a hypothetical length. For instance, I might have v be the column vector 1, 2, 3, in which case n would be 3, v_1 would be 1, v_2 would be 2, and v_3 would be 3. As we're probably used to by now, this just represents hypothetical values in a hypothetical column vector. In general, the norm of a vector comes from v . v. We want to multiply v by itself in the sense of the dot product we learned last lesson. You can't literally matrix-multiply v times v, because the dimensions don't make sense: that would be an n by 1 times an n by 1. But v . v does make sense, and as we know from last time, it equates to v transpose times v. What that looks like is the row v_1, v_2, up to v_n times the column v_1, v_2, up to v_n. Just like last time, when we multiply these, what are we going to get? This is a 1 by n times an n by 1. The inners match, which we've already learned, so yes, we can multiply. The outsides tell me the dimension of the result: a 1 by 1, which is pretty much just a number. We're doing the exact same thing as last time's simpler example, just with a hypothetical vector. Working through the algebra, just like last time, it's v_1 times v_1, so v_1 squared, plus v_2 times v_2, plus all the way up to v_n squared. This is exactly what the norm of a vector is built from. We usually denote the norm by two vertical lines. But we don't actually want this sum itself; we want its square root. That's the norm of the vector.
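To make the v . v step concrete, here's a small sketch in plain Python using the hypothetical vector v = (1, 2, 3) from the example (just illustrative values, no libraries needed):

```python
# Hypothetical example vector from the lesson: v = [1, 2, 3], so n = 3.
v = [1, 2, 3]

# v . v, i.e. v transpose times v: a (1 x n) row times an (n x 1) column.
# The inner dimensions match, and the result is a 1 x 1 -- just a number:
# v_1*v_1 + v_2*v_2 + ... + v_n*v_n.
v_dot_v = sum(v_i * v_i for v_i in v)

print(v_dot_v)  # 1^2 + 2^2 + 3^2 = 14
```

Note that this number, 14, is not yet the norm; as described above, it's the norm squared.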
That sum is the norm of the vector squared. What we actually want is v with two vertical lines around it: the norm of v. That equals the same term, just without the square. It's the square root of v_1 squared plus v_2 squared plus all the way up to v_n squared. This is why we needed to learn what a dot product was last time: as we go further into this offshoot of linear algebra, a lot of popular ideas come up, and they build on earlier ones. Last time we learned what a transpose was, and we're using it here, because it appears in the definition of the dot product, which we also learned last time. Now that we know the transpose and we know the dot product, if we're curious about the length, or norm, of a given vector, we can calculate it right here. Again, the norm of any vector is just the square root of each of its elements squared independently and then added together.
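Putting it all together, here's a minimal sketch of the norm formula in plain Python (the helper name `norm` and the test vectors are just illustrative choices):

```python
import math

def norm(v):
    # ||v|| = sqrt(v_1^2 + v_2^2 + ... + v_n^2),
    # i.e. the square root of v . v from the previous step.
    return math.sqrt(sum(v_i * v_i for v_i in v))

print(norm([1, 2, 3]))  # sqrt(14), about 3.742
print(norm([3, 4]))     # sqrt(9 + 16) = 5.0
```

The second example is the familiar 3-4-5 right triangle: the norm of the vector (3, 4) is exactly its length as the hypotenuse, which matches the intuition that the norm is the length of the vector.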