In this course on Linear Algebra we look at what linear algebra is and how it relates to vectors and matrices. Then we look at what vectors and matrices are and how to work with them, including the knotty problem of eigenvalues and eigenvectors, and how to use these to solve problems. Finally we look at how to use these to do fun things with datasets - like how to rotate images of faces and how to extract eigenvectors to see how the PageRank algorithm works.
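As a taste of that last idea, here is a minimal power-iteration sketch of PageRank in NumPy; the 4-page link matrix and iteration count are illustrative assumptions, not course code:

```python
import numpy as np

# Hypothetical 4-page web: column j holds the outgoing-link
# probabilities of page j (each column sums to 1).
L = np.array([[0,   1/2, 1/3, 0],
              [1/3, 0,   1/3, 1/2],
              [1/3, 1/2, 0,   1/2],
              [1/3, 0,   1/3, 0]])

r = np.ones(4) / 4      # start with equal rank on every page
for _ in range(100):    # repeated multiplication converges towards the
    r = L @ r           # principal eigenvector of L (eigenvalue 1)

print(r)                # the PageRank vector; entries still sum to 1
```

The fixed point of this loop satisfies L r = r, i.e. the ranking is exactly an eigenvector problem - which is why the course treats PageRank as an application of eigenvectors.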

This course is part of the Mathematics for Machine Learning Specialization

**294,647** already enrolled


## Skills you will gain

- Eigenvalues And Eigenvectors
- Basis (Linear Algebra)
- Transformation Matrix
- Linear Algebra

## Offered by

### Imperial College London

Imperial College London is a world top ten university with an international reputation for excellence in science, engineering, medicine and business, located in the heart of London. Imperial is a multidisciplinary space for education, research, translation and commercialisation, harnessing science and innovation to tackle global challenges.

## Syllabus - What you will learn from this course

**2 hours to complete**

## Introduction to Linear Algebra and to Mathematics for Machine Learning

In this first module we look at how linear algebra is relevant to machine learning and data science. Then we'll wind up the module with an initial introduction to vectors. Throughout, we're focussing on developing your mathematical intuition, not on crunching through algebra or doing long pen-and-paper examples. For many of these operations, there are callable functions in Python that can do the adding up - the point is to appreciate what they do and how they work so that, when things go wrong or there are special cases, you can understand why and what to do.
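For instance, a small sketch of the kind of vector arithmetic NumPy handles for you (the vectors here are illustrative):

```python
import numpy as np

a = np.array([3.0, 4.0])
b = np.array([1.0, 2.0])

print(a + b)    # component-wise addition       -> [4. 6.]
print(2 * a)    # scaling by a number           -> [6. 8.]
```

The library does the adding up; the module's goal is that you know what these operations mean geometrically.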

**2 hours to complete**

## Vectors are objects that move around space

In this module, we look at operations we can do with vectors - finding the modulus (size), the angle between vectors (dot or inner product), and the projection of one vector onto another. We can then examine how the entries describing a vector depend on which vectors we use to define the axes - the basis. That will then let us determine whether a proposed set of basis vectors is what's called 'linearly independent'. This will complete our examination of vectors, allowing us to move on to matrices in module 3 and then start to solve linear algebra problems.

**3 hours to complete**

## Matrices in Linear Algebra: Objects that operate on Vectors

Now that we've looked at vectors, we can turn to matrices. First we look at how to use matrices as tools to solve linear algebra problems, and as objects that transform vectors. Then we look at how to solve systems of linear equations using matrices, which takes us on to inverse matrices and determinants, and to thinking about what the determinant really is, intuitively speaking. Finally, we'll look at special matrices whose determinant is zero and which therefore aren't invertible - cases where algorithms that need to invert a matrix will fail.
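A brief sketch of these ideas with NumPy's linear algebra routines; the systems below are made up for illustration:

```python
import numpy as np

# Solve the system  2x + y = 5,  x + 3y = 10,  i.e.  A x = b.
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([5.0, 10.0])

x = np.linalg.solve(A, b)    # -> [1. 3.]
det = np.linalg.det(A)       # 2*3 - 1*1 = 5, nonzero, so A is invertible
A_inv = np.linalg.inv(A)

# A singular matrix: the second row is twice the first, so the
# determinant is zero and np.linalg.inv(S) would raise an error -
# exactly the failure mode described above.
S = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.det(S))      # ~0.0 -> not invertible
```

Geometrically, a zero determinant means the matrix squashes space onto a line (or point), so the transformation can't be undone.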

**7 hours to complete**

## Matrices make linear mappings

In Module 4, we continue our discussion of matrices; first we think about how to code up matrix multiplication and matrix operations using the Einstein Summation Convention, which is a widely used notation in more advanced linear algebra courses. Then, we look at how matrices can transform a description of a vector from one basis (set of axes) to another. This will allow us to, for example, figure out how to apply a reflection to an image and manipulate images. We'll also look at how to construct a convenient basis vector set in order to do such transformations. Then, we'll write some code to do these transformations and apply this work computationally.
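Both ideas can be sketched with NumPy's `einsum`; the matrices and the basis chosen below are illustrative assumptions, not course material:

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])

# Einstein summation convention: C_ik = A_ij B_jk, where the repeated
# index j is implicitly summed over - this is matrix multiplication.
C = np.einsum('ij,jk->ik', A, B)

# Change of basis: if the columns of P are the new basis vectors written
# in the old basis, a transformation T re-expressed in the new basis is
# P^{-1} T P. Here T reflects in the x-axis.
T = np.array([[1.0, 0.0],
              [0.0, -1.0]])
P = np.array([[1.0, 1.0],
              [1.0, -1.0]])   # hypothetical basis along the mirror diagonals
T_new = np.linalg.inv(P) @ T @ P
print(T_new)                  # [[0. 1.], [1. 0.]] - the basis vectors swap
```

Choosing a basis adapted to the transformation (here, the diagonal directions) is exactly the "convenient basis" trick the module describes for manipulating images.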

## Reviews

- 5 stars: 74.66%
- 4 stars: 19.82%
- 3 stars: 3.39%
- 2 stars: 1.15%
- 1 star: 0.95%

### TOP REVIEWS FROM MATHEMATICS FOR MACHINE LEARNING: LINEAR ALGEBRA

Good course with nice lecturer.

Some topics should be explained in more detail, with some further reading / exercises for practice.

Overall, this course is worth the time and money spent.

Great content and direction. Only negative is the sometimes frustrating experience with the Jupyter Notebooks: debugging what has gone wrong is very difficult, due to a lack of good error messages.

Excellent review of Linear Algebra even for those who have taken it at school. Handwriting of the first instructor wasn't always legible, but wasn't too bad. Second instructor's handwriting is better.

Brilliantly explained; loved the use of different markers, which helped with understanding. Only one suggestion: if the summary included the mathematical equations and their Python equivalents, that would be helpful.

## About the Mathematics for Machine Learning Specialization

For a lot of higher level courses in Machine Learning and Data Science, you find you need to freshen up on the basics in mathematics - stuff you may have studied before in school or university, but which was taught in another context, or not very intuitively, such that you struggle to relate it to how it's used in Computer Science. This specialization aims to bridge that gap, getting you up to speed in the underlying mathematics, building an intuitive understanding, and relating it to Machine Learning and Data Science.

## Frequently Asked Questions

When will I have access to the lectures and assignments?

What will I get if I subscribe to this Specialization?

Is financial aid available?

More questions? Visit the Learner Help Center.