
Learner Reviews & Feedback for Mathematics for Machine Learning: Multivariate Calculus by Imperial College London

4.7 stars · 5,642 ratings

About the Course

This course offers a brief introduction to the multivariate calculus required to build many common machine learning techniques. We start at the very beginning with a refresher on the “rise over run” formulation of a slope, before converting this to the formal definition of the gradient of a function. We then start to build up a set of tools for making calculus easier and faster. Next, we learn how to calculate vectors that point uphill on multidimensional surfaces and even put this into action using an interactive game. We take a look at how we can use calculus to build approximations to functions, as well as how to quantify how accurate we should expect those approximations to be. We also spend some time talking about where calculus comes up in the training of neural networks, before finally showing you how it is applied in linear regression models. This course is intended to offer an intuitive understanding of calculus, as well as the language necessary to look concepts up yourself when you get stuck. Hopefully, without going into too much detail, you’ll still come away with the confidence to dive into some more focused machine learning courses in the future.
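For reference, the “rise over run” slope mentioned in the description becomes the formal definition of the gradient (derivative) once the run is taken to zero:

f'(x) = \lim_{\Delta x \to 0} \frac{f(x + \Delta x) - f(x)}{\Delta x}

For a function of several variables, the gradient simply collects one such rate of change per input direction.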

Top reviews

DP

Nov 25, 2018

Great course to develop some understanding and intuition about the basic concepts used in optimization. The last 2 weeks were a bit lower in quality than the rest, in my opinion, but still great.

SS

Aug 3, 2019

Very well explained. Good content and a great explanation of it. Complex topics are also covered in a very accessible way. Very helpful for learning much more complex machine learning topics in the future.


751 - 775 of 1,002 Reviews for Mathematics for Machine Learning: Multivariate Calculus

By Arnoldy C

Mar 28, 2023

good

By Fitrah S

Mar 17, 2023

cool

By Doni S

Mar 27, 2022

Good

By Burra s g

Jan 18, 2022

good

By 李由

Aug 23, 2021

good

By Dwi F D S

Mar 23, 2021

good

By Ahmad H N

Mar 16, 2021

Good

By Habib B K

Mar 12, 2021

Nice

By Indah D S

Feb 27, 2021

cool

By RAGHUVEER S D

Jul 25, 2020

good

By Nat

Mar 6, 2020

good

By Zhao J

Sep 11, 2019

GOOD

By Harsh D

Jun 26, 2018

good

By Amini D P S

Mar 26, 2022

wow

By Roberto

Mar 25, 2021

thx

By Artem G

May 28, 2022

:)

By Angel E E V

Nov 30, 2021

:)

By Omar D

May 5, 2020

gd

By Гончарова П В

May 10, 2022

2

By Aidana P B

Apr 26, 2021

щ

By Bhargava g

Aug 7, 2020

.

By Kaushal K K

Apr 23, 2022

A good, brief overview of the topics in multivariate calculus relevant to machine learning and optimisation. It does not go deep enough to make you an expert at solving the kinds of multivariate calculus problems seen at university level; rather, it goes just deep enough to let you understand how multivariate calculus operates in various machine learning scenarios. Some of these scenarios include:

(1) The process of backpropagation in basic neural networks.

(2) Using the Newton-Raphson method to find the roots of a function in the multivariate case.

(3) Use of the Taylor series to approximate a function in the multivariate case, and how such an approximation can be used for optimisation.

(4) Using gradient descent to reach the nearest minimum point in the parameter space, so as to optimise the parameters in a machine learning model with multiple parameters (a minimal sketch of this idea follows this review).

The quizzes provide a few example problems for us to work on, but as mentioned earlier, they are of the more basic variety; it is quite unlikely that undergraduate courses have examples that are this straightforward. However, I feel that this is a good thing, given that their aim is only to allow us to get a feel for multivariable calculus without bogging us down with needless complexity.

The overall aim of the course is to build intuition, which I think it accomplishes.

However, compared to the previous course in this specialization, it is harder to draw the links between the material covered in one week and the next. It is harder to see how the weeks are related, and how the material for each week fits into the overall picture. This was not the case in the previous course, where the concepts from the previous weeks were seamlessly integrated into those of the current week. There seems to be an unspoken expectation that the course participant will refer to external resources to fill in the blanks and find the coherence in the material by themselves. I feel that the course instructors could do better at integrating the concepts taught across the weeks, so that it does not feel quite so fragmented.
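To make the gradient-descent scenario in point (4) of the review above concrete, here is a minimal sketch in Python. The quadratic loss, starting point, and learning rate are illustrative assumptions, not material taken from the course.

import numpy as np

# Loss for a toy two-parameter model: f(w) = (w0 - 3)^2 + (w1 + 1)^2.
# Its gradient points uphill, so stepping against it moves toward the
# nearest minimum, which sits at w = (3, -1).
def loss(w):
    return (w[0] - 3.0) ** 2 + (w[1] + 1.0) ** 2

def grad(w):
    # Partial derivatives of the loss with respect to each parameter.
    return np.array([2.0 * (w[0] - 3.0), 2.0 * (w[1] + 1.0)])

w = np.array([0.0, 0.0])      # initial guess for the parameters (illustrative)
learning_rate = 0.1           # step size (illustrative choice)

for step in range(100):
    w = w - learning_rate * grad(w)   # move downhill along the negative gradient

print(w, loss(w))   # w approaches (3, -1) and the loss approaches 0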

By Rinat T

Aug 1, 2018

The part about neural networks needs improvement (some more examples of simple networks, and an explanation of how the sigmoid function emerges). Exercises on partial derivatives need to focus more on the various aspects of partial differentiation rather than on taking partial derivatives of complicated functions. I felt there was too much of the latter, which is not very efficient, because the idea of partial differentiation is easy to master but its applications are not always. Just taking partial derivatives of sophisticated functions (be it for the sake of a Jacobian or Hessian calculation) turns into doing lots of algebra whose underlying idea has long been understood. So while some of the existing exercises on partial differentiation, the Jacobian and the Hessian should be retained, about 50 percent or so of them should be replaced with exercises which are not heavy on algebra but rather demonstrate the different ways and/or applications in which partial differentiation is used. Otherwise, all good.

By Yaroslav K

Apr 8, 2020

1) Very British English, with a number of words and phrases that are rarely used globally. 2) The pace of the course is just not suitable for me. If you don't have a strong math or engineering background, you will need to search for explanations elsewhere (Khan Academy is a great resource, etc.). Closer to the end of the course I stopped having a full understanding of what was going on and why. So I could calculate things, but I don't feel that I will be able to do that in 1-2 weeks, because I didn't have the time and opportunity to strengthen the skills I gained. 3) Also, I don't understand why the instructors (especially David) don't visualize what they are saying the way Sal or Grant do, drawing on the board and on the plots and so on. Sometimes it feels like you are just listening to an audiobook about math.

I will take the Stanford ML course after this one and also review what I've learned here with Khan Academy.

By Vitor R C

Sep 18, 2020

Another great introduction to the very hard content that is multivariate calculus, including derivatives, but still accessible enough for someone with very little mathematical background to understand.

One critique I have is the lack of a smooth progression between the examples used in the videos and the ones presented in the quizzes; sometimes the quiz questions are of an entirely different order of difficulty than the ones in the videos.

Another critique is the seeming dip in quality of the video content in the last two "weeks" of the course. You can see that very clearly because these weeks have at most 20 minutes' worth of videos each, even though each is supposed to be completed over an entire week, and the content is very shallow, rushed, and hard to understand.