
Learner Reviews & Feedback for Natural Language Processing with Attention Models by DeepLearning.AI

4.4 stars (984 ratings)

About the Course

In Course 4 of the Natural Language Processing Specialization, you will: a) translate complete English sentences into German using an encoder-decoder attention model, b) build a Transformer model to summarize text, c) use T5 and BERT models to perform question-answering, and d) build a chatbot using a Reformer model.

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!

Learners should have a working knowledge of machine learning and intermediate Python, including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you've completed Course 3 - Natural Language Processing with Sequence Models - before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper...
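All of the architectures named above (encoder-decoder attention models, the Transformer, BERT, T5, the Reformer) are built around scaled dot-product attention. For readers browsing these reviews who want a concrete sense of that mechanism, here is a minimal NumPy sketch; it is not part of the course materials, and the function name and toy shapes are illustrative assumptions.

import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                 # query-key similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over the keys
    return weights @ V                              # weighted sum of values

# Toy example: 2 queries attending over 3 key/value pairs of dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
print(scaled_dot_product_attention(Q, K, V).shape)  # -> (2, 4)

In translation, the queries come from the target-language side while the keys and values come from the source sentence, which is what the question quoted in one review below is getting at.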

Top reviews

JH

Oct 4, 2020

Could the instructors maybe make a video explaining the ungraded labs? That would be useful. Other students also find both LSH attention layer ungraded labs difficult to understand. Thanks

LL

Jun 22, 2021

This course is brilliant; it covers SOTA models such as the Transformer and BERT. It would be better to have a capstone project. And the entire projects can be downloaded easily.


201 - 225 of 240 Reviews for Natural Language Processing with Attention Models

By Anonymous T

Oct 15, 2020

Great course content, but go for this only if you have done the previous courses and have some background knowledge; otherwise you won't be able to relate.

By Qiao D

Nov 4, 2022

The content is great, but it would be even better if it gave us a more in-depth understanding of the material rather than a very quick crash course.

By Moustafa S

Oct 3, 2020

Good course, covers everything I guess. The only downside for me is the Trax portion; I would have preferred if it were on TF maybe, but still a great job.

By Mohan N

Mar 28, 2021

The course covers cutting-edge content and the exercises are well paced. I found the Transformer lessons a bit difficult to understand.

By Rahul J

Sep 29, 2020

Not up to expectations. Needs more explanation of some topics. Some were difficult to understand; examples might have helped!

By veera s

Mar 18, 2022

Need more detailed explanations in the last course of this specialization, especially of the attention and BERT models.

By Vaseekaran V

Sep 20, 2021

It's a really good course to learn about and get introduced to attention models in NLP.

By David M

Oct 25, 2020

An amazing experience throughout, covering state-of-the-art NLP models.

By Roger K

May 17, 2022

The labs required a bit more context to understand.

By Shaojuan L

Dec 18, 2020

The programming assignment is too simple

By Fatih T

Feb 4, 2021

Great explanation of the topic, I guess!

By Sreang R

Dec 22, 2020

Awesome course

By Erik S

Dec 23, 2023

Overall I found this course underwhelming, especially in comparison to the other three courses in this specialization. The lecture videos don't seem to flow coherently, lots of terminology is introduced without being defined, there seems to be hand-waving over details, and it feels a bit infantilizing when videos end with statements like "you now understand the T5 architecture" or "you now know how to fine-tune transformer models" when those concepts are not really explained in any meaningful detail in the videos. There also seems to be a big gap between how complex/advanced these topics are and how trivially easy the programming assignments are, with most of the logic implemented for us; completing most exercises doesn't require much more than reading comments to replace "None" values or copying code from preceding cells. In previous courses in this specialization the assignments felt like truer assessments of what we'd learned. I hope this course gets a refresh for future students!

By Azriel G

Nov 20, 2020

The labs in the last two courses were excellent. However, the lecture videos were not very useful for learning the material. I think the course material deserves a v2 set of videos with more in-depth intuitions and explanations, and details on attention and its many variants, etc. There is no need to oversimplify the video lectures; they should feel at a similar level to the labs (assignments tend to be "too easy", but I understand why that is needed). Thanks for the courses. Azriel Goldschmidt

By Kota M

Aug 23, 2021

This course perhaps gives a good overview of BERT and several other extensions such as T5 and the Reformer. I learned the conceptual framework of the algorithms and understood what we can do with them. However, I think the instructors chose an undesirable mix of rigour and intuition: the lectures are mostly about intuition, whereas the assignments are very detailed and go through each logical step one by one.

By Nunzio V

Apr 7, 2021

Nice course, full of very interesting information. What a pity it did not use TensorFlow. All that knowledge is unfortunately not work-ready, as Trax is not widely used in industry, and it is hardly likely it ever will be, in my opinion.

By Artem A

Aug 9, 2021

The explanation of attention models, including the attention mechanism itself and the other building blocks of Transformers, was very confusing. It was sometimes really hard to understand what the lecturer really meant.

By Michel M

Feb 9, 2021

The presented concepts are quite complex. I would prefer fewer details, as most will not understand them anyway, and more conceptual information about why these models are built the way they are.

By Zeev K

Oct 24, 2021

Not clear enough. The exercises weren't good enough; I didn't learn much from them. It would be a great idea to provide the slides at the end of every week for review.

By Huang J

Dec 23, 2020

The course videos are too short to convey the ideas behind the methodology, and the illustrations are too rough.

By Maury S

Mar 13, 2021

Another less than impressive effort in a specialization from which I expected more.

By Prithviraj J

Dec 21, 2020

Explanations of attention/self-attention & other complex topics are too shallow

By Anurag S

Jan 3, 2021

The course content needs more detailed explanation to be easy to follow.

By ABHISHEK T

Apr 24, 2023

Elaborate more and make it easier to learn.

By Przem G

Feb 18, 2023

I would not have understood much if I hadn't known most of the material beforehand. There is lots of repetition (not bad, just boring) but, worse, bugs as well. Many times the lecturer doesn't know what he's talking about and messes things up. A characteristic moment is when, all of a sudden, he talks about things without definition (like "shared frame" or "adapter"), shows a diagram contradicting the code beside it, or changes subject abruptly.

The grader is terrible, happily returning errors but no explanation. You teach AI and talk about LMs beating humans, yet the tool used for evaluating your students is so primitive it might as well have been written two decades ago. It very likely infuriates everybody except its proud author. Either the code to fill in is trivial (we learn nothing), or it requires mental work that potentially leaves some traces. The effect is that the code works fine, but the grader fails miserably.

Like many of your courses, this one too teaches us more about the author's horizons and expectations than the new knowledge we pay for. This is particularly evident during quizzes, where poorly formulated questions, answerable only in a narrow context, abound. There are also bugs, like "translating French to English" requiring you to mark "keys and values are the French words"...