
Now, there's another important way to write matrix transformations down. It's called the Einstein summation convention, and it writes down the actual operations on the elements of a matrix, which is useful when you're coding or programming. It also lets us see something neat about the dot product that I want to show you, and it lets us deal with non-square matrices.

When we started, we said that multiplying a matrix by a vector, or by another matrix, is a process of taking every element in each row in turn, multiplying it by the corresponding element in each column of the other matrix, adding them all up, and putting the result in the corresponding place. So let's write that down just to make it concrete.

So I'm going to write down a matrix A here, and I'm going to give it elements. A is an n by n matrix, with elements a11, a21, all the way down to an1; then a12, all the way across to a1n; then a22 here, and so on, until I've filled it all in and I've got ann down in the corner. So the first suffix on each of these elements is the row number, and the second one is the column number.

Now, if I want to multiply A by another matrix B, which is also an n by n matrix, with elements b11, b12 across to b1n, and down to bn1 and across to bnn, then multiplying them together gives another matrix, which I'll call AB. What I'm going to do is take a row of A, multiply it element by element with a column of B, and put the sum in the corresponding place.

So let's do an example. Say I want element two, three of AB. I'm going to get that by taking row two of A, that's a21, a22, and all the others up to a2n, and multiplying it by column three of B, that's b13, b23, all the way down to bn3, and adding all those products up. That gives me this element: row two, column three of AB.

Now, in Einstein's convention, what you say is: this element is the sum over an index j of aij times bjk. If I add these up over all the possible j's, I'm going to get, say, a11 b11 plus a12 b21, and so on; and then I do that for all the possible i's and k's as well. So what Einstein says is: if I've got a repeated index, I won't bother writing the sum, I'll just write it down as aij bjk, and that's equal to the product element abik. So abik is equal to ai1 b1k, plus ai2 b2k, plus ai3 b3k, and so on, until you've done all the possible j's; then you do that for all the possible i's and k's, and that gives you the whole matrix for the product AB.
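
Written out in standard index notation, the rule the lecturer is describing is just the following (this is a restatement of the same formula, not anything new):

```latex
(AB)_{ik} \;=\; \sum_{j=1}^{n} a_{ij}\, b_{jk}
\qquad\text{which the Einstein convention abbreviates to}\qquad
(AB)_{ik} \;=\; a_{ij}\, b_{jk},
```

where the repeated index j is understood to be summed over, and the free indices i and k range over every row and column of the result.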

Now, this is quite nice. If you are coding, you just run three loops over i, j and k, and use an accumulator over the j's to find each element of the product matrix AB. So the summation convention gives you a quick way of coding up these sorts of operations.
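
As a sketch of that triple-loop idea in plain Python (the function name and the small test matrices are my own, not from the lecture):

```python
def matmul(A, B):
    # Matrices are lists of rows. Loop over the free indices i and k,
    # and accumulate over the repeated index j, exactly as in the
    # summation convention: (AB)_ik = a_ij * b_jk summed over j.
    n_rows, n_inner, n_cols = len(A), len(B), len(B[0])
    AB = [[0] * n_cols for _ in range(n_rows)]
    for i in range(n_rows):          # rows of A
        for k in range(n_cols):      # columns of B
            acc = 0                  # accumulator over the repeated index j
            for j in range(n_inner):
                acc += A[i][j] * B[j][k]
            AB[i][k] = acc
    return AB

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(matmul(A, B))  # [[19, 22], [43, 50]]
```

Libraries like NumPy implement the same idea far more efficiently, but the three nested loops are exactly what the index notation is telling you to do.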

Now, we haven't talked about this so far, but now we can see it: as long as the matrices have the same number of entries in j, we can multiply them together even if they're not the same shape. So we can multiply a two by three matrix, something with two rows and three columns, by a three by four matrix, something with three rows and four columns.

And when I multiply these together, I'm going to go row times column. I've got the same number of j's in each case, so I can do that for all four of the columns, and I can do that for both of the rows, and I'm going to get a two by four matrix out. So I can multiply together these non-square matrices if I want to, and in the general case I'll get some other non-square matrix, with the number of rows of the one on the left and the number of columns of the one on the right.
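
A quick numerical sketch of that shape rule, using NumPy (which the lecture itself doesn't use) and arbitrary made-up entries:

```python
import numpy as np

A = np.arange(6).reshape(2, 3)    # a 2x3 matrix: two rows, three columns
B = np.arange(12).reshape(3, 4)   # a 3x4 matrix: three rows, four columns

# The inner dimensions (the number of j's) match: 3 == 3,
# so the product exists, and it inherits A's rows and B's columns.
AB = A @ B
print(AB.shape)  # (2, 4)
```

Trying `B @ A` instead would fail, because then the inner dimensions would be 4 and 2, and there'd be a different number of j's on each side.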

Now, all sorts of matrix properties that you might want, inverses, determinants and so on, start to get messy and mucky, and sometimes you can't even compute them when you're doing this sort of thing. But there are times when you want to do it, and the Einstein summation convention makes it very easy to see how you do it and how it's going to work. As long as you've got the same number of j's, you're good: you can multiply them together.

Now, let's revisit the dot product in light of the summation convention. If we've got two vectors, let's call them u and v, where u is a column vector with elements ui and v is another column vector with elements vi, then when we dot them together, what we're doing is multiplying u1 by v1, adding u2 times v2, and so on all the way up. So in the summation convention, that's just ui vi, where we repeat over all the i's and add.

But this is just like writing u as a row matrix, pushing u over from being a vector to being a matrix with elements u1, u2, all the way up to un, and multiplying it by another matrix with elements v1, v2, all the way up to vn. That's to say, this matrix multiplication is the same thing as the dot product: I just push the u vector over, and then my dot product is just like doing a matrix multiplication, which is sort of neat. So there's some equivalence between a matrix transformation, a matrix multiplication, and the dot product. Let's look at that.
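
To make that equivalence concrete with a small made-up example (again using NumPy, which isn't part of the lecture): lying u down as a 1 by n row matrix and multiplying it by v as an n by 1 column matrix gives the same number as the dot product:

```python
import numpy as np

u = np.array([1.0, 2.0, 3.0])
v = np.array([4.0, 5.0, 6.0])

dot = np.dot(u, v)                             # u1*v1 + u2*v2 + u3*v3 = 32
as_matrix = u.reshape(1, 3) @ v.reshape(3, 1)  # 1x3 matrix times 3x1 matrix

print(dot, as_matrix[0, 0])  # 32.0 32.0
```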

Now, take a unit vector here, let's call him u hat, with components u1 and u2, and let's imagine what happens if I dot him with the axis vectors. So I've got an axis here, e1 hat, which is one, zero, and another axis here, e2 hat, which is zero, one. Now, think about what happens when I dot u hat with e1, that is, when I do the projection of u hat onto e1. When I drop u hat down onto the axis here, I'm going to get a length of just u1, just the x-axis component of u hat.

Now, what happens if I instead project e1 onto u hat? Well, I'm then going to get this other projected length. And the fun thing is, we can actually draw a line of symmetry through the point where these two projections cross, and this little triangle and that little triangle are actually the same. You can go and do a bit of geometry and prove to yourself that that's true. So this projection is the same length as that projection, which is exactly what's implied by the dot product.

If I dot e1 with u hat, the multiplication is symmetric: I can flip it around and get the same answer. So this shows geometrically why that's true. And if we repeat this with the other axes, with e2 here or any other axes there are, we'll get the same result. So this is why projection is symmetric, the dot product is symmetric, and projection is the dot product.
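
A quick numerical check of that symmetry, with a unit vector of my own choosing:

```python
import numpy as np

u_hat = np.array([0.6, 0.8])   # a unit vector: 0.6**2 + 0.8**2 == 1
e1 = np.array([1.0, 0.0])

# Projecting u_hat onto e1 picks out u1 = 0.6; projecting e1 onto
# u_hat gives the same number, because the dot product is symmetric.
print(np.dot(u_hat, e1))  # 0.6
print(np.dot(e1, u_hat))  # 0.6
```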

So there is this connection between a numerical thing, matrix multiplication, and a geometric thing, projection, which is quite beautiful and mind-blowing really. And that's why we talk about multiplying a matrix by a vector as being the projection of that vector onto the vectors composing the matrix, the columns of the matrix.

So what we've done in this video is look at the summation convention, which is a compact and computationally useful, but not very visual, way to write down matrix operations. That's opened up looking at funny-shaped matrices, and it's opened up re-examining the dot product. So that's really nice.
