I really enjoyed this course, especially because it combines all the different components (DNN, ConvNet, and RNN) together in one application. I look forward to taking more courses from deeplearning.ai.
It was an amazing experience to learn from such great experts in the field, gain a complete understanding of all the concepts involved, and also develop a thorough grasp of the programming skills.
By Al F N P M•
By Indah D S•
By Ahmad H N•
By shree h•
By RAGHUVEER S D•
By Abdulaziz A J•
By John K•
Very good way to get familiar with TensorFlow: its pluses as well as its minuses.
Good overview of applying tf.keras to this topic. Machine learning is clearly a practical discipline (i.e. theory alone will not get you there), so I appreciated the chance to write some code and read a decent amount of code.
Laurence Moroney is a good, upbeat instructor.
All the courses within the TensorFlow in Practice specialization on Coursera may be most beneficial after first taking Andrew Ng's course on AI (also on Coursera). But if you know something about loss functions, gradient descent, and backpropagation (which can be learned quick-and-dirty online), then you should be fine to go ahead and take this specialization before Professor Ng's course.
My one persistent wish for all four of the courses in this specialization is that significantly more time be spent on understanding the shapes of tensors as they flow through the models. Invariably, the only areas that gave me real problems as I did the coding homework were those where my tensor shape did not match what the model needed to see. Documentation at Tensorflow.org was of little help with this topic. Looking at Stackoverflow, it is apparent that there are certain (unwritten?) facts about the order and count of dimensions for the tensors as they flow through, e.g. batch count is listed first, time step is second, frame is third, or something like that. What if I have twelve dimensions in my tensor? Do certain model layers require a minimum number of dimensions of input or output? etc. etc.
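The dimension-ordering convention the reviewer is describing can be checked in a few lines. Keras recurrent layers (SimpleRNN, LSTM, GRU) do expect 3-D input shaped (batch, timesteps, features), so a univariate series needs a trailing feature axis of size 1. A minimal NumPy sketch (the variable names are illustrative, not from the course):

```python
import numpy as np

# Keras recurrent layers expect 3-D input shaped (batch, timesteps, features).
series = np.arange(100, dtype="float32")   # toy univariate series

window_size = 20
# Build overlapping windows: each row is one training example.
windows = np.stack([series[i:i + window_size]
                    for i in range(len(series) - window_size)])
print(windows.shape)                       # (80, 20) -> (batch, timesteps)

# Add the feature axis so the shape matches what an RNN layer expects.
batch = windows[..., np.newaxis]
print(batch.shape)                         # (80, 20, 1) -> (batch, timesteps, features)
```

Printing shapes like this at each preprocessing step is often the quickest way to catch the shape mismatches the reviewer ran into.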
Finally, this specialization really teaches the tf.keras framework, not TensorFlow itself, which I do not think was explained in the course info, but maybe I missed it. Still, Keras is probably a good way to enter the subject.
All in all, I do know a lot more than I did before, and have acquired new skills. Clearly, there's more to work on, which is good.
By Egor E•
I liked the first and second weeks of the course very much, because they contain dense new theoretical and practical material. The idea of time series forecasting and preparing a windowed dataset was explained very clearly and was very useful for all the following lessons. The difference between statistical and neural network approaches was also very helpful.
I would prefer weeks 3 and 4 to be combined into one, because the experiments with RNN, LSTM, and Conv are very similar, and I actually did them together one after another. I would have liked some explanation and examples of why each type of architecture produces its results, and how the results depend on dataset preparation. In particular, I did not get which architecture works better with seasonality, autocorrelations, and noise.
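The windowed-dataset preparation this reviewer found useful can be sketched without TensorFlow. The course builds windows with tf.data, but the splitting logic itself is simple: each example is a fixed-length slice of the series, and the label is the value that follows it (the function name here is illustrative):

```python
import numpy as np

def windowed_dataset(series, window_size):
    """Split a 1-D series into (features, label) pairs: each example is
    window_size consecutive values, and the label is the next value."""
    X, y = [], []
    for i in range(len(series) - window_size):
        X.append(series[i:i + window_size])
        y.append(series[i + window_size])
    return np.array(X), np.array(y)

series = np.arange(10, dtype="float32")
X, y = windowed_dataset(series, window_size=4)
print(X[0], y[0])   # [0. 1. 2. 3.] 4.0
print(X.shape)      # (6, 4)
```

In the course the same pairs are produced with `tf.data.Dataset.window`, `flat_map`, and `batch`, which additionally handles shuffling and prefetching.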
By Xiang J•
I think overall it is a good course. These are the things I learnt:
First-hand experience with TensorFlow, though with more focus on the basics of Keras
How to preprocess image, text, and time series data to feed into a NN
Basic concepts of Keras layers such as CNN, LSTM, RNN, Conv1D, DNN
Rough techniques for gauging the learning rate
Things to improve:
Fix the typos, such as window[:1]; there are a few posted in the forum
Should introduce more basics of TensorFlow instead of Keras
Should include more links/documentation for the side knowledge, such as padding
Adding some layers seems magical, such as Conv1D before LSTM for time series; what is the logic behind it?
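On the Conv1D-before-LSTM question: a common rationale is that the convolution learns local patterns and suppresses noise, so the recurrent layer sees a cleaner, locally summarised sequence. A NumPy sketch of the effect, with a hand-set moving-average kernel standing in for a learned filter (in a real model the Conv1D weights are learned, not fixed):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(200)
signal = np.sin(2 * np.pi * t / 50)          # underlying seasonal component
noisy = signal + rng.normal(0, 0.5, size=t.shape)

kernel = np.ones(5) / 5                      # fixed smoothing "filter" of width 5
smoothed = np.convolve(noisy, kernel, mode="valid")

# The filtered sequence is closer to the underlying signal than the raw one,
# so a downstream LSTM has an easier input to model.
raw_err = np.abs(noisy[2:-2] - signal[2:-2]).mean()
smooth_err = np.abs(smoothed - signal[2:-2]).mean()
print(raw_err > smooth_err)                  # True
```

This is only an intuition sketch; it does not claim to be the course authors' explanation.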
By José D•
In this final course of the TensorFlow in Practice Specialization, all the pieces come together to solve a real-world example (Kaggle's sunspots dataset) using Keras (TensorFlow's high-level API). This course focuses on time series, using CNNs and RNNs. As explained in the videos, this specialization is an introduction to Deep Learning using Keras. There is no math in any of the courses. As a result, if you want to understand why and how it works under the hood, you want to do the "Deep Learning" Specialization. As I did the Deep Learning one before this one, this whole specialization was like an additional exercise.
By Francisco F•
I wish this course was taught with real-world data, which only happens in week 4. Instead, the course relies on synthetic data, which is not as good at providing perspective as real data and real problems. Also, the audio volume is really low; I don't know why. Other than that, as always, a great course. Great specialization! A pretty good intro to TensorFlow for those who haven't used it before, and a nice recap of the basics for those like me who have been using it and have missed some core concepts here and there.
By Muthiah A•
I enjoyed the thoughtful exercises and the measured, experienced guidance of Laurence (who has been doing this for years now on a big stage). It's a bite-sized introduction to TensorFlow for busy professionals, and while you can "game" the quizzes and earn completion, the onus is really on the learner to spend time on the reading materials, videos, and great Colab exercises. The Google Colab notebooks are the single outstanding reason this whole specialization is compelling to me.
Thanks everyone @ Coursera
By João A J d S•
I think I might say this for every course of this specialisation:
Great content all around!
It has some great Colab examples explaining how to put these models into action on TensorFlow, which I know I'm going to revisit time and again.
There's only one thing that I think might not be quite so good: the evaluation of the course. There isn't any, apart from the quizzes. A few more evaluation steps, as in Andrew's Deep Learning Specialisation, would require more commitment from students.
By Edward T•
It's a shame that the assignments were not graded. What's the incentive to struggle and dive deep when the notebook is just a repetition of the lecture notebooks and the assignment is ungraded? This course would greatly benefit from making those assignments graded and bringing multivariate, multistep forecasting into the mix! Overall, though, I did enjoy it and I learned a thing or two about modelling and signal processing. Need to continue on this journey!
By Mihail Y•
Very interesting course. Thanks to Laurence Moroney for the clear and concise way of presenting complex concepts. The only reason I am giving the course 4 stars is that I was really expecting more information on forecasting using multiple series as features and on forecasting multiple series at once. I think it would be interesting to add more on this to the course, as it is still tricky for me to manage these tasks in TensorFlow.
By Shubham K•
Really nice introduction to time series data analysis for regression and prediction. This course extends what you learn in the rest of the specialisation (NNs, Dense layers, convolutions, RNNs, LSTMs) to univariate time series data. I highly recommend this. It's very easy after you do the rest of the courses in the specialisation. Good luck learning, and kudos to Laurence and Dr Andrew Ng for being lovely instructors and making this accessible to all.
By Suhan A•
I liked the flow of the course, working on synthetic data and then moving to real data. But I also think it would have been better if I had already taken Andrew Ng's Deep Learning course before approaching this one. Plus, since there weren't any graded programming exercises, I didn't feel confident in making my own model. So I'm going back and taking the Deep Learning course.
By Tibor S•
Great course for a brief introduction to time series predictions. One needs to integrate knowledge gained from elsewhere (i.e. the course is not comprehensive, but that is also not expected). What I was missing was clarification from the authors on some of the important questions/comments in the forum. Several things from the course are left unexplained. Otherwise, I recommend it!
By Gerard S S•
First of all, congratulations on the specialization. I feel that I have greatly improved my previous knowledge of Machine Learning and programming with Python and TS. One note for improvement: I felt that this course could go third in the specialization. It goes deeper into CNN and LSTM, which I missed in the previous one :)
Also, it would be great to have a couple of examples of real-world scenarios.
By Dustin Z•
Fun course, like the rest in the series. I hadn't seen neural networks applied to time series, so that was really worthwhile to learn.
There are still some rough edges and a few parts of the labs that aren't addressed in the videos.
I really enjoy the format of the courses which emphasized a lot of experimentation with networks and provided opportunity for trial and error.
By Eric L•
If one has seen LSTMs in the previous course and has been exposed to time series, there is little new conceptual material to learn from this course; but of course the focus is on TensorFlow/Keras programming. Highlights were learning how to include Lambda layers (which allow one to execute arbitrary code in the network) and how to automate the selection of the learning rate.
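The automated learning-rate selection the reviewer mentions works by sweeping the learning rate upward a little every epoch via a callback, then plotting loss against learning rate and picking a value where the loss is still falling steeply. In tf.keras the sweep is typically `tf.keras.callbacks.LearningRateScheduler(lambda epoch: 1e-8 * 10 ** (epoch / 20))`. The schedule itself is just this function (the name `swept_lr` is illustrative):

```python
def swept_lr(epoch, base=1e-8):
    # Ten-fold increase every 20 epochs: epoch 0 -> 1e-8, epoch 20 -> 1e-7, etc.
    return base * 10 ** (epoch / 20)

# Over 100 epochs the learning rate climbs from 1e-8 toward 1e-3.
lrs = [swept_lr(e) for e in range(100)]
print(min(lrs), max(lrs))
```

After one sweep run, you fix the chosen learning rate and retrain from scratch; the sweep is only a probe, not the final training schedule.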
By yuan j•
I learned some time series models and processing methods, but I think this course is too simple and too shallow. For example, there is no prediction for situations that involve complex features, nor any coverage of how to combine autoregressive features with other features. The models in this course are very popular and used by everyone; there is no deep stuff.