
Learner Reviews & Feedback for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization by DeepLearning.AI

4.9 stars (56,697 ratings, 6,508 reviews)

About the Course

This course will teach you the "magic" of getting deep learning to work well. Rather than the deep learning process being a black box, you will understand what drives performance, and be able to more systematically get good results. You will also learn TensorFlow. After 3 weeks, you will:

- Understand industry best-practices for building deep learning applications.
- Be able to effectively use the common neural network "tricks", including initialization, L2 and dropout regularization, batch normalization, and gradient checking.
- Be able to implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop and Adam, and check for their convergence.
- Understand new best-practices for the deep learning era of how to set up train/dev/test sets and analyze bias/variance.
- Be able to implement a neural network in TensorFlow.

This is the second course of the Deep Learning Specialization.

Top reviews

AM
Oct 8, 2019

I really enjoyed this course. Many details are given here that are crucial for gaining experience, along with tips on things that look easy at first sight but are important for faster ML project implementation.

AS
Apr 18, 2020

Very good course to give you deep insight into how to enhance your algorithm and neural network and improve its accuracy. It also teaches you TensorFlow. Highly recommended, especially after the 1st course.


76 - 100 of 6,431 Reviews for Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization

By Ali Z

Nov 1, 2018

Small description error in the last project, the TensorFlow tutorial project.

X, Y = create_placeholders(12288, 6)

print ("X = " + str(X))

print ("Y = " + str(Y))

X = Tensor("Placeholder:0", shape=(12288, ?), dtype=float32)

Y = Tensor("Placeholder_1:0", shape=(6, ?), dtype=float32)

Expected Output:

X Tensor("Placeholder_1:0", shape=(12288, ?), dtype=float32) (not necessarily Placeholder_1)

Correct this from Y Tensor("Placeholder_2:0", shape=(10, ?), dtype=float32) (not necessarily Placeholder_2)

to:

Tensor("Placeholder_2:0", shape=(6, ?), dtype=float32) (not necessarily Placeholder_2)
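
For context, a minimal sketch of what the assignment's create_placeholders helper presumably looks like, assuming the TensorFlow 1.x placeholder API (the function name and arguments are taken from the review; the body is an illustration, not the official notebook solution):

import tensorflow as tf

def create_placeholders(n_x, n_y):
    # None leaves the batch dimension unspecified, which prints as "?" in the tensor shape
    X = tf.placeholder(tf.float32, shape=(n_x, None), name="X")
    Y = tf.placeholder(tf.float32, shape=(n_y, None), name="Y")
    return X, Y

X, Y = create_placeholders(12288, 6)
print("X = " + str(X))  # e.g. Tensor("X:0", shape=(12288, ?), dtype=float32)
print("Y = " + str(Y))  # e.g. Tensor("Y:0", shape=(6, ?), dtype=float32)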

By Shahed B S

May 31, 2018

This course goes into the various parameters and hyperparameters of deep neural networks, as well as suggested values for the ones we can use. The course is short in duration, but a lot of content is packed into it. It touches on TensorFlow. The template-based assignments provide great intuition for getting right into the topics being taught; however, I feel there should be scope for more programming assignments where the student writes more of that template as well. All in all, Andrew Ng is a great teacher and it was a pleasure to learn from him.

By Jong H S

Oct 1, 2017

At the time of writing this review, I have completed 3 of the 5 courses. I personally think these 3 courses are not merely courses to fill up the specialization. It is a journey, an incredible one. I will write metaphorically. My journey so far is like becoming a magician: Course 1 taught me how to become one, Course 2 let me learn from the master magicians with their secrets revealed, and Course 3 showed me how to put up a good show in Las Vegas trying to fool Penn and Teller. This specialization is my treasure vault. Great job to Prof Andrew Ng and team.

By Vincent F

Jan 23, 2018

This course provided me with an understanding of the large number of hyperparameters that have to be tuned during a deep learning project. It gave me insight into when different techniques like regularization and (the many different forms of) optimization need to be applied. The only quibble I have is that the material on the choice of the number of layers and the number of hidden units per layer was thin. Given that these values have a great impact on the speed of progress in a deep learning project, I would have liked to see a little more emphasis on them.

By Ernest S

Nov 5, 2017

This course offers grounding in all major concepts of non-recurrent neural networks and is excellent preparation for further exploration of this topic. Lectures cover a broad choice of topics and discuss many problems you might encounter during your journey. Professor Andrew Ng explains the theory in a way which builds good intuition and gives you the building blocks to face the challenges of machine learning. If you are fluent in calculus or have an academic background and expect to discover the math behind the scenes, I think you will be content too. I surely was.

By Aditya B

Jan 12, 2019

The concepts have been explained in a fantastic way. But a few suggestions:

-> After every lesson, I would love to have more pop quizzes. This was the case with Course 1, but I did not get any pop quizzes for this one.

-> In the quiz assignment, it would be nice to have an explanation or justification section explaining why the selected option is correct and why the other options are incorrect. I know we can have the same discussion in the forums, but such an explanation (a one-liner would be fine) can provide a good instant knowledge boost!

By Rob S

Jun 9, 2018

Another very well done course. You do a good job describing the benefits of Batch Norm, a lot more intuitively than presented in Szegedy's paper, which is pretty math heavy. However, I did notice one little ERROR on the TensorFlow project page, albeit an insignificant one. Double-check the expected output for the cell that prints the shapes of the training and test sets. One of the expected outputs says that the test set should have 10 possible classes, when the dataset is for 0-5 fingers. This would be a very strange looking hand ;)
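
For reference, the Batch Norm computation the lectures build intuition for can be sketched in a few lines of NumPy; this is a rough sketch with illustrative variable names, not code from the assignment, and it assumes examples are stored as columns as in the course's notation:

import numpy as np

def batch_norm_forward(z, gamma, beta, eps=1e-8):
    # Normalize each unit's pre-activations over the mini-batch,
    # then rescale and shift with the learnable parameters gamma and beta.
    mu = np.mean(z, axis=1, keepdims=True)
    var = np.var(z, axis=1, keepdims=True)
    z_norm = (z - mu) / np.sqrt(var + eps)
    return gamma * z_norm + beta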

By David M

Aug 31, 2017

This is a practical course on how to work with neural networks. It covers a collection of "tips" and techniques, all grounded on a solid theoretical framework, to make a classifier train faster and be more accurate. The explanations are all engaging and interesting, and the assignments are rather easy.

The knowledge gained from this course is probably what everybody working in machine learning already knows, but if you are new to the field this is a great way to get up to speed fast and start implementing neural networks for your own projects.

By Jairo J P H

Feb 1, 2020

The course is very good; I am particularly grateful to COURSERA for giving me the opportunity to do the five courses of the Deep Learning Specialization with financial aid and for allowing me access to this type of training and certification. Thank you very much!

By Yash P

Dec 29, 2020

The first course was the easiest of all the tutorials I could have found on the internet. Andrew Ng has taught it very well, and it's best suited for beginners. The second course delves deeper into understanding various optimization algorithms and improving deep learning models by tuning hyperparameters and regularization.

I would strongly recommend you take this course. It's a very beginner-friendly course, so no need to worry. If you have guts and passion for it, then what are you waiting for? Just enroll!

By DANTE K

Dec 1, 2020

This course began similarly to the first one in the Specialization, repeating lots of material from Andrew's ML course, but after the first week there's a lot of new material introduced. Andrew shows lots of techniques taken from recent papers that have had much success, which is something you probably won't see in ANY other DL course. Loved the intro to TensorFlow in the last week; really good job at explaining and using the basics without getting too bogged down in the details. Can't wait to do Course 3!

By GEORGE A

Mar 5, 2019

Pretty solid class, learned a lot of basic concepts. The class won't go into a lot of mathematical detail about the algorithms; however, there is enough intuition provided to understand the inner workings of the algorithms and the logic behind them. The only con I have is that some of the programming exercises look outdated with the current versions of the notebook. For example, in my last exercise I couldn't get the NN with TensorFlow to work properly but got 100/100 nevertheless.

By Matei I

Feb 1, 2019

This course covers details about neural network implementations that are extremely useful in practice. In fact, after completing week 1 and learning about vanishing gradients, I was finally able to debug a NN implementation that I had been struggling with. I'm also grateful for the introduction to Tensorflow. As with the previous course in this specialization, expect to be spoon-fed during the programming assignments. The course would be better if it let you think more during assignments.

By Pablo G G

Sep 10, 2020

Awesome introduction and guidance about where to tweak your model... although in my experience Adam is all you are going to need. I missed some teaching about fine-tuning through iterations with schedules! TensorFlow has this function that can adapt your parameters on the go so your optimization can push that loss lower and lower. The Adam optimizer works like a charm with a schedule for the learning rate!! (https://www.tensorflow.org/api_docs/python/tf/keras/optimizers/schedules/LearningRateSchedule)
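
As a concrete illustration of the scheduling idea mentioned in this review, a minimal sketch assuming the TF 2.x Keras API; ExponentialDecay is one built-in subclass of LearningRateSchedule, and the specific numbers here are arbitrary:

import tensorflow as tf

# Decay the learning rate exponentially as training progresses,
# then hand the schedule to Adam in place of a fixed learning rate.
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate=1e-3,
    decay_steps=10000,
    decay_rate=0.96)
optimizer = tf.keras.optimizers.Adam(learning_rate=lr_schedule)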

By 江小鱼

Feb 12, 2019

This time I finished Regularization. I think this was an interesting experience, because you can implement your algorithm step by step. I got to know some magic (not black magic) algorithms, like RMSprop, momentum and Adam. Finally, the most fascinating part was constructing the TensorFlow model, just like a pipeline, step by step, and every step was made with only one line, from forward propagation (without backward) to the full model. TensorFlow is really black magic.

(I have to say TensorFlow is a bit difficult; forgive my poor English, thanks.)

By Nathan Y

Oct 16, 2017

Neural networks are not new. What we learned in this course is some of the critical implementation details/tricks from the past decades of making them work in practice. Going beyond gradient descent to types of regularization and hyperparameter searching, we get a set of robust tools that quickly find good solutions in extremely high dimensional spaces. As Professor Ng says, our understanding of optimization rules of thumb in low dimensional spaces doesn't carry over to deep learning.

By José A

Oct 30, 2017

Seamlessly continues the previous course. If you know the basic structures of neural networks, how to initialize weights, sigmoid and tanh activations, and so forth, this will help you understand terms such as L2 regularization, gradient descent with momentum, RMSprop, Adam, exponentially weighted averages, and many others.

Don't let the 3 weeks put you off. It has a lot of micro-content material that builds on top of the previous work. Thanks to all the mentors for this great course.

By Raimond L

Aug 20, 2017

Really nice course. A lot of good information about how to prepare and divide data for training, hyperparameter optimization strategy, regularization techniques, learning algorithms, mini-batches, batch normalization and more... Very useful information with clear explanations!!! Highly recommended course.

Very positive course, except the TensorFlow practical assignment, which caused some stress, because for me that framework is a bit alienating, forcing me to look into the manual every minute.

By P M K

Nov 26, 2017

Hi, the course content was definitely good and it helped me understand a lot of internals quite easily. I do, however, have one suggestion: the introduction to TensorFlow felt quite fast and could have been done better by giving more slides about TensorFlow before going on to the examples. Please ensure that you correct any errors pointed out by the members taking this course, so that it benefits others, avoids wasted time and reduces frustration.

Regards, PMK

By Sachita N

Jun 18, 2018

Professor Ng explains the most complicated concepts in the most intuitive fashion I have ever seen. The explanations are simple, straightforward and they encompass so many perspectives and alternatives to doing things. The exercises are immensely educational - they strike a great balance between guiding the student and letting them figure stuff out on their own. This is a great specialisation and I would whole-heartedly recommend it for anyone wanting to start with Deep Learning

By kindalin

Jul 31, 2019

This is the best course I have ever seen. Previous MOOC classes gave me some bad impressions, as they seemed to be created by scholars just to meet KPIs. I believe that such a well-designed course will eventually replace the traditional curriculum. This is also a source of hope for us students at lesser-known schools.

The only downside is that the coursework instructions are too detailed, as many people have noted. I can see a lot of good, hard design work in them, but I hope they can take a better form.

By Joppe G

Aug 13, 2017

This course is simply brilliant. You start with implementing the low-level functions that make up a deep learning framework. It's only in the last assignment that you explore TensorFlow. At that point, you have a full understanding of what the API encapsulates.

This really gives you confidence in your capability to get started with your own projects, knowing that you can come back at any time to brush up on some of the lower-level details.

Thank you Andrew and the whole team!

By RAJEEV B

Nov 17, 2017

The assignments are very good. All the parameter update methods are explained in a very good manner. I would recommend it very strongly for anyone who is looking for an in-depth understanding of why we do what we do for tuning, regularization and optimization of NNs. All the implementation in the assignments is also from scratch, so that really helps a lot. I felt this is better than the Stanford CS231n course material; after all, this is a whole course on this specific purpose :).

By Marcel M

Jun 1, 2018

This course offers a practical way of fine-tuning your model in order to improve its performance. Rather than Deep Learning being a so-called black box, it turns out that Machine Learning models are not black boxes; there are proven techniques not only for finding out what happens in them but also for fine-tuning them in a systematic manner in order to improve their results. It is an excellent course for the practical Deep Learning Engineer. Good job and keep it up!

By Artem M

Apr 22, 2018

Found a lot of interesting details about NNs that I did not know. This is a much better course than the first one. Includes TensorFlow exercises, which is useful. Nevertheless, proofs are still omitted for some results like initializations. It is not hard to Google, but I bet the lecturers could explain them much faster than diving into the scientific literature. Otherwise, the intuitive explanation of Adam using exponential smoothing, and the physics analogy for momentum, are just brilliant.
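
For readers curious what that exponential-smoothing view of Adam amounts to, here is a minimal NumPy sketch of a single parameter update; the variable names are illustrative and the hyperparameter values are the common defaults taught in the course, not code from the assignments:

import numpy as np

def adam_update(w, grad, v, s, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    # Exponentially weighted averages of the gradient (momentum) and its square (RMSprop)
    v = beta1 * v + (1 - beta1) * grad
    s = beta2 * s + (1 - beta2) * grad ** 2
    # Bias correction compensates for v and s starting at zero
    v_corr = v / (1 - beta1 ** t)
    s_corr = s / (1 - beta2 ** t)
    # Combine both averages in the update step
    w = w - lr * v_corr / (np.sqrt(s_corr) + eps)
    return w, v, s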