
Learner Reviews & Feedback for Natural Language Processing in TensorFlow by DeepLearning.AI

4.6
stars
4,745 ratings
732 reviews

About the Course

If you are a software developer who wants to build scalable AI-powered algorithms, you need to understand how to use the tools to build them. This Specialization will teach you best practices for using TensorFlow, a popular open-source framework for machine learning. In Course 3 of the deeplearning.ai TensorFlow Specialization, you will build natural language processing systems using TensorFlow. You will learn to process text, including tokenizing and representing sentences as vectors, so that they can be input to a neural network. You'll also learn to apply RNNs, GRUs, and LSTMs in TensorFlow. Finally, you'll get to train an LSTM on existing text to create original poetry!

The Machine Learning course and Deep Learning Specialization from Andrew Ng teach the most important and foundational principles of Machine Learning and Deep Learning. This new deeplearning.ai TensorFlow Specialization teaches you how to use TensorFlow to implement those principles so that you can start building and applying scalable models to real-world problems. To develop a deeper understanding of how neural networks work, we recommend that you take the Deep Learning Specialization.
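To give a feel for the "tokenizing and representing sentences as vectors" step the blurb describes, here is a minimal plain-Python sketch of what TensorFlow's text preprocessing utilities (such as `Tokenizer` and `pad_sequences`) do under the hood. The function names and the out-of-vocabulary convention below are our own illustration, not the course's actual code.

```python
from collections import Counter

def fit_word_index(sentences):
    # Assign each word an integer id by frequency; id 1 is reserved
    # for out-of-vocabulary words ("<OOV>"), and 0 is left for padding.
    counts = Counter(w for s in sentences for w in s.lower().split())
    return {"<OOV>": 1, **{w: i + 2 for i, (w, _) in enumerate(counts.most_common())}}

def texts_to_padded(sentences, word_index, maxlen):
    # Map each sentence to a list of ids, then post-pad with zeros
    # (or truncate) so every vector has the same length.
    seqs = [[word_index.get(w, 1) for w in s.lower().split()] for s in sentences]
    return [(seq + [0] * maxlen)[:maxlen] for seq in seqs]

sentences = ["I love my dog", "I love my cat"]
wi = fit_word_index(sentences)
print(texts_to_padded(sentences, wi, 5))
# -> [[2, 3, 4, 5, 0], [2, 3, 4, 6, 0]]
```

Fixed-length integer vectors like these are what actually get fed into an embedding layer and then an RNN/GRU/LSTM.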

Top reviews

GS
Aug 26, 2019

Excellent. Isn't Laurence just great! Fantastically deep knowledge, easy learning style, very practical presentation. And funny! A pure joy, highly relevant and extremely useful of course. Thank you!

AS
Jul 21, 2020

Great course for anyone interested in NLP! This course focuses on practical learning instead of overburdening students with theory. Would recommend this to every NLP beginner/enthusiast out there!!


701 - 725 of 729 Reviews for Natural Language Processing in TensorFlow

By Aladdin P

Aug 5, 2020

The material was better in this course than the previous ones, but still lacking depth in my opinion. Also, no graded assignments?? So the focus is then only on the quizzes, and they are not even well done. From week to week the same questions are repeated, and the quizzes don't even include code: how is this teaching code?

By DAVID R M

Oct 4, 2020

This course was quite sloppily presented and superficial overall. There were a couple of longstanding errors that have never been fixed (see the lengthy discussions in forums). One thing that annoyed me was that the important concept of stop-words was not discussed at all, yet it was required for the first assignment.
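For readers unfamiliar with the stop-words concept this reviewer mentions, here is a brief illustrative sketch (the stop-word list below is a small sample of our own; real assignments typically use a longer standard list):

```python
# Stop-words are very common words (the, a, of, ...) that carry little
# signal for many NLP tasks and are often removed before tokenizing.
STOPWORDS = {"a", "an", "the", "and", "or", "of", "to", "in", "is", "it", "for"}

def remove_stopwords(sentence):
    # Keep only the words that are not in the stop-word set.
    return " ".join(w for w in sentence.lower().split() if w not in STOPWORDS)

print(remove_stopwords("The quick brown fox is in the garden"))
# -> quick brown fox garden
```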

By Tal F

Aug 13, 2020

All assignments were optional - probably because of all the problems with the scoring system for the previous course. Quizzes often asked things about the dataset we used (e.g. IMDB) rather than testing that we were learning concepts. Very little meat to the course - mostly links to other resources.

By Hartger

Sep 29, 2020

Overall the video material is fine. The assignments, however, are very unclear and contain bugs. The grader's tests don't match the instructions. It's very frustrating that the assignments clearly haven't been given the same attention as the rest of the course.

By Prosenjit D

Jan 16, 2020

This course is a far cry from Andrew Ng's Deep Learning Specialization and refers to Sequence Models from that specialization at the drop of a hat. In short, there's no use doing this one unless you have done Sequence Models (Course 5) of the Deep Learning Specialization.

By Dominik B

Jun 10, 2020

No grader exercises,

sample code in the lectures isn't always updated and gives errors,

everything is a bit chaotic (e.g. the order of sample code, sample code descriptions, and topic introductions is random; some random parts in the code).

By Venkata S Y T

Apr 4, 2020

The weekly exercises are not graded, and the overall content quality of this course seems a bit poor compared with the previous two in the specialization; it doesn't provide much more learning on the topic.

By Amit K

May 25, 2020

Not clearly explained, and it only uses toy, irrelevant datasets; no real-world, industry-specific examples. Also, the voice quality is very bad for this course.

By Jurica Š

Nov 29, 2019

I would call this entry/beginner level material. There aren't any graded coding challenges, which is a shame. No complex topics are covered in this class.

By jack c

May 26, 2020

It's a bit too basic, and there are not many graded examples to work through, unlike Andrew Ng's course. I feel it could have been more complete and in-depth.

By Graham W

Apr 8, 2020

Disappointing. Laurence is much less able to explain NLP issues than CNN issues. Lots of problems with TF versions in the Colabs wasted far too much time.

By Joey Y

Aug 5, 2019

The quality of the audio recording is worse than courses before. The questions at the end of the chapters are also repetitive.

By Amr K

Apr 23, 2020

Didn't really feel like I strongly grasped the concepts and needed more exercises; also, the lesson notebooks were lacking.

By Maged A

Nov 15, 2020

Too short. Fine as an introduction but not an in-depth course. No assignments except very shallow multiple-choice tests.

By Benoît Q

Apr 27, 2020

Not enough content; far too easy; the whole course should be one week of a good tensorflow course.

By Nirzari D

Jul 6, 2020

The audio quality is very bad! It should be improved so the content is audible to the user.

By Anton Z

May 14, 2020

I wish assignments were provided the same way as in the previous two courses

By Masoud V

Aug 22, 2019

Useful, but shorter and easier than expected and not deep enough for me.

By Jose R

Sep 5, 2020

It is too mechanical, and reinforcement of concepts is very limited.

By Md. M R

May 29, 2020

Good course, but we expected hands-on assignments to learn better.

By Alexander S

Aug 23, 2019

Don't see the value behind predicting words.

By AasaiAlangaram

Dec 11, 2019

Not much information provided.

By Daniel C

Oct 27, 2020

Missing code evaluation.

By Jack P

Oct 17, 2020

Unfortunately, really disappointed with this course.

Having done the previous 2 courses in the specialisation, I have come to realise that the courses are much more of a tutorial and could be seen as quick practice content before going for the TF Developer certificate or something. That in itself is fine; I feel there are other places to learn the maths/intuition behind DL (e.g. the Deep Learning Specialisation), but especially in this 3rd course the content really doesn't justify paying for it.

For starters, the explanations are very fast and hand-wavy, and don't go into any real depth beyond quickly explaining each line of the short notebooks (though this can be useful). There is no discussion of how to improve the models or actually use them beyond just pressing play in the notebook, and at around 17 minutes of video per week the length is really not worth it, especially when better content can be found for free in Kaggle notebooks or on YouTube. There are also no graded exercises, and after the first week they have given up on even providing suggested answers for the ungraded ones. The exercises don't test your TF understanding, just basic Python data loading and whether you can copy from the example workbooks. They also contain inconsistencies and new, untaught content, and are prone to errors you haven't been warned about, which means you waste time being frustrated at not understanding what code you're supposed to add rather than trying to understand the content.

I really like Coursera in general, so this experience won't change that, but given that the instructor has free content on the TF website and YouTube channel, it seems like a waste to pay for this course IMO.

Hoping the 4th course will be better.

By Huet P

Oct 13, 2020

Videos are too short. Unlike Andrew's courses, there was not enough talk about intuition and how to tune the hyperparameters. There are a lot of redundant questions in the quizzes, and not enough explanation of the notebooks. I would prefer graded exams, not ungraded ones with answers. I would also prefer Coursera labs instead of the Google Colab platform, as we cannot access previous work again.