Learner Reviews & Feedback for Natural Language Processing with Sequence Models by DeepLearning.AI

4.5
stars
1,094 ratings

About the Course

In Course 3 of the Natural Language Processing Specialization, you will:

a) Train a neural network with GloVe word embeddings to perform sentiment analysis of tweets,
b) Generate synthetic Shakespeare text using a Gated Recurrent Unit (GRU) language model,
c) Train a recurrent neural network to perform named entity recognition (NER) using LSTMs with linear layers, and
d) Use so-called ‘Siamese’ LSTM models to compare questions in a corpus and identify those that are worded differently but have the same meaning.

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.

Top reviews

SA

Sep 27, 2020

Overall it was a great course. A little bit weak in theory, but for practical purposes it was sufficient. The question-duplication detection model was very cool. I enjoyed it a lot.

AB

Nov 11, 2021

This is the third course of the NLP Specialization. This was a great course and the instructors were amazing. I really learned and understood everything they taught, like LSTMs, GRUs, Siamese networks, etc.

201 - 225 of 230 Reviews for Natural Language Processing with Sequence Models

By Jakob U

Dec 18, 2022

Very mediocre course and specialization, in my opinion. All of the topics in this course are covered much better in the Deep Learning Specialization by Andrew Ng. There is no didactic element here; it seems like the lecturer is just reading out bullet points from a script. On the practical side, the examples in courses 3 and 4 all use Trax, which I find a very questionable choice. In real life there are TensorFlow/Keras and PyTorch, so it would make sense to learn how to implement these models in one of those libraries while also getting valuable practical experience with them. While PyTorch has great documentation and TensorFlow's documentation is at least okay, the documentation for Trax is really quite minimal, and you cannot find examples anywhere, neither in the documentation nor on Stack Overflow. The examples are hardly more complex than what you have to do on DataCamp.

By Tanguy d L

Mar 12, 2023

Feels like the whole course is designed to get you through the assignment, not focused on explaining the core concepts or giving intuitions behind the ideas. For example, the lab notebooks (non-graded notebooks) are basically chunks of code giving the answers for the assignment. The assignments are very well done, though.

Andrew Ng does a better job of giving intuitions, perhaps by going back and forth between concepts, referring to research papers, and trying to explain the basic idea behind them. Besides, the course gives no references.

Minus point: the course uses Trax, a deep learning framework developed by Google Brain that is no longer maintained. Keras looks pretty similar and would have been preferable.

By Yaron K

Apr 29, 2022

The 4th week on Siamese networks was well done. The weeks on RNNs, GRUs, and LSTMs basically gave the equations and some intuition, but most of the emphasis was on building a model with them using Google's Trax deep learning framework, which the lecturers believe to be better than TensorFlow 2. At least when it comes to debugging, it isn't: make the smallest error (say, with shape parameters) and you get a mass of error messages that don't really help. For shape errors at least there is no excuse for this, since all that is needed is to run checks on the first batch of the first epoch that pinpoint exactly where the shape discrepancy is.
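The first-batch check this reviewer describes is easy to sketch in a framework-independent way. The names below (`check_batch_shapes`, the example tensor names and shapes) are illustrative, not part of any course or library API:

```python
import numpy as np

def check_batch_shapes(batch, expected):
    """Compare each array's shape in a batch against its expected shape
    and report exactly which entry mismatches.  A None in the expected
    shape is a wildcard (e.g. a variable batch dimension)."""
    problems = []
    for name, arr in batch.items():
        exp = expected[name]
        ok = len(arr.shape) == len(exp) and all(
            e is None or e == a for e, a in zip(exp, arr.shape)
        )
        if not ok:
            problems.append(f"{name}: got {arr.shape}, expected {exp}")
    return problems

# Hypothetical first batch for an NER model: token ids and labels that
# should be padded to the same length, but aren't.
batch = {"tokens": np.zeros((32, 50)), "labels": np.zeros((32, 48))}
expected = {"tokens": (None, 50), "labels": (None, 50)}
print(check_batch_shapes(batch, expected))
# → ['labels: got (32, 48), expected (None, 50)']
```

Running such a check once, before handing the first batch to the training loop, turns a wall of framework stack traces into a one-line message naming the offending tensor.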

By Amlan C

Oct 9, 2020

Despite the theoretical underpinnings, I do not feel this course lets you write an NER algorithm on your own. The majority of these courses have been using data supplied by Coursera, and so is the case with the models. In real life we have to either create this data ourselves or use some open-source data, like from Kaggle. I think it would be better to orient the course around publicly available data, with models trained by students to be used for actual analysis.

By Maury S

Mar 8, 2021

Like some of the other courses in this specialization, this one has promise but comes off as a somewhat careless effort compared to the usual quality of content from Andrew Ng. The lecturers are OK but not great, and it is unclear what the role of Łukasz Kaiser is beyond reading introductions to many of the lectures. There is a strange focus on simplifying with the Google Trax model at the cost of not really teaching the underlying maths.

By Eyal H

Dec 28, 2022

The third course in the NLP specialization has been a bit of a disappointment to me. I feel that the video lectures are very robotic and don't add much to the written lectures: they mainly read out formulas and technical details, which can be tedious, especially when every subscript and superscript in a formula is articulated rather than any intuition provided.

By Petru R

Apr 13, 2022

The course requires a solid background in deep learning; it does not explain LSTMs in detail, or how the programming keeps the weights of the two parts of the Siamese network identical.

Does Trax provide other ways of generating data for Siamese-network training besides writing a custom function?
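For what it's worth, the weight sharing this reviewer asks about usually needs no special mechanism: both branches simply reuse the same layer object, so there is only one set of weights to begin with. A minimal NumPy sketch (`TinyEncoder` is a hypothetical stand-in for the LSTM encoder, not anything from the course):

```python
import numpy as np

class TinyEncoder:
    """A one-layer stand-in for the LSTM branch of a Siamese model."""
    def __init__(self, dim_in, dim_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((dim_in, dim_out))

    def __call__(self, x):
        return np.tanh(x @ self.W)

# ONE encoder object, hence ONE set of weights ...
encoder = TinyEncoder(dim_in=4, dim_out=3)
q1 = np.ones((1, 4))   # toy embedding of question 1
q2 = np.zeros((1, 4))  # toy embedding of question 2

# ... applied to BOTH questions: any update to encoder.W affects both
# branches at once, so the two "halves" can never drift apart.
e1, e2 = encoder(q1), encoder(q2)
```

In Keras or Trax the idea is the same: calling a single layer instance on two inputs (rather than constructing two copies) shares its weights between the branches automatically.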

By Business D

Dec 14, 2020

I regret the lack of proper guidance in the coding exercises, compounded by the incomplete documentation of the Trax library. I also feel we could build models with greater performance; an accuracy of 0.54 for the identification of question duplicates doesn't seem to be the state of the art...

You could do better!

By Rajaseharan R

Mar 9, 2022

Too much focus on the data generator in the assignments; there should be a library function in Trax to do it. You might have to do some data preparation beforehand, but the generator itself should be a standard library function. Also, I had hoped to learn a bit more in depth about entity labelling.
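As a sketch of what such a generator involves (all names here are illustrative, not Trax API): a padded pair-batch generator is only a few lines of plain Python, which is perhaps why the assignments make students write one.

```python
import itertools

def pair_batch_generator(pairs, batch_size, pad_id=0):
    """Yield padded batches of (q1, q2) token-id lists, cycling forever.
    Each batch is padded to the length of its own longest sequence."""
    cycle = itertools.cycle(pairs)
    while True:
        batch = [next(cycle) for _ in range(batch_size)]
        max_len = max(len(t) for q1, q2 in batch for t in (q1, q2))
        pad = lambda t: t + [pad_id] * (max_len - len(t))
        yield ([pad(q1) for q1, q2 in batch],
               [pad(q2) for q1, q2 in batch])

# Two toy question pairs as token-id lists.
pairs = [([1, 2, 3], [1, 2]), ([4, 5], [4, 5, 6, 7])]
b1, b2 = next(pair_batch_generator(pairs, batch_size=2))
# b1 == [[1, 2, 3, 0], [4, 5, 0, 0]]
# b2 == [[1, 2, 0, 0], [4, 5, 6, 7]]
```

The reviewer has a point that frameworks ship equivalents; in TensorFlow, for instance, `tf.data.Dataset.padded_batch` performs this padding-and-batching step as a library call.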

By Huang J

Dec 23, 2020

The course videos are too short to convey the ideas behind the methodology; you need to understand the methodology already before following the course material. Also, the introduction to Trax is fine, but I would prefer to have a version of the assignments in TensorFlow.

By Irakli S

Jul 2, 2022

Good videos; however, the assignments weren't up to par with the videos. Often I had to write code that wasn't very related to the videos, while the material that was actually in the videos was already implemented.

By Vijay A

Nov 13, 2020

Good course teaching the applications of LSTMs/GRUs in language generation, NER, and matching question duplicates using Siamese networks. It would have been more helpful if there were more depth in the topics.

By J N B P

Mar 16, 2021

This course is good for practical knowledge, with really good projects, but it lags in the theoretical part; you must be familiar with the concepts to get the most out of this course.

By Nguyen B L

Jul 5, 2021

I am now confused by too many deep learning frameworks. Also, the content somewhat overlaps with the Deep Learning Specialization.

By shinichiro i

Apr 24, 2021

I just want them to use Keras, since I have no inclination to study a shiny new framework such as Trax.

By YuLin D

Jul 22, 2022

Great course! But it would be better if it used TensorFlow or PyTorch; Trax is not very friendly to Mac users.

By martin k

Apr 26, 2021

Lectures are quite good, but assignments are really bad. Not helpful at all.

By Deleted A

Jan 3, 2021

Assignments were easy and similar. I learned less than expected.

By Alberto S

Nov 1, 2020

Content is interesting, but some details are under-explained.

By Ashim M

Nov 22, 2020

Would've been better with a better documented library.

By Mahsa S

Mar 26, 2021

I would prefer to learn more about NLP in PyTorch.

By Mauricio B

Dec 1, 2023

The assignments and labs are not great :/

By Leon V

Sep 28, 2020

Grader output could be more useful.

By Paul A

Oct 6, 2022

The first two courses were OK, and I was looking forward to doing the next one. The instruction for this one is not bad, although most of it is about neural networks as they apply to NLP. NLP itself is not a major topic, honestly. But you do get an introduction to some concepts in neural networks.

The problem I have dealt with is that the code for this course does not work on Windows. The course uses a library called Trax, which has a dependency that won't work on Windows. I find it very helpful to run all the code locally in my IDE and inspect the data and variables in the debugger instead of attempting to run everything in Jupyter, so being able to run it locally is very important, at least to me.

I have spent a lot of time trying to get it to work: first on my laptop before realizing it won't work, then on an old Linux computer and my newer Mac Mini (it does not work on my Mac either, despite what the first notebook claims). "Install WSL on your Windows." "For M1, all you have to do is build jaxlib from source. And maybe TensorFlow." Etc., etc. Because we all have nothing better to do than dig through online forums trying to figure out which files to download and which commands to run when the last thing we tried didn't work.

In summary, the videos aren't bad, but if you use Windows and like to run the assignments locally, you may want to find another course.

By Hùng N T

Feb 26, 2024

Everything was good except that this course uses Trax. The framework has had no new releases since 2021, and I cannot manage to train deep learning models using Trax on my GPU, not even in Colab. Trax is also very buggy, and it does not have a large community to help. Recommendation for learners: take the course after it is fully ported to TensorFlow, unless you want to take other courses to get useful, working code for NLP.