
Learner Reviews & Feedback for Natural Language Processing with Attention Models by DeepLearning.AI

4.4
stars
1,006 ratings

About the Course

In Course 4 of the Natural Language Processing Specialization, you will: a) translate complete English sentences into German using an encoder-decoder attention model, b) build a Transformer model to summarize text, c) use T5 and BERT models to perform question-answering, and d) build a chatbot using a Reformer model.

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! Learners should have a working knowledge of machine learning, intermediate Python including experience with a deep learning framework (e.g., TensorFlow, Keras), and proficiency in calculus, linear algebra, and statistics. Please make sure that you've completed Course 3 - Natural Language Processing with Sequence Models - before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
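(For readers unfamiliar with the attention mechanism this course centers on: scaled dot-product attention computes softmax(QK^T / sqrt(d_k)) V. A minimal NumPy sketch of that formula follows — an illustration only, not course material, and the course itself uses the Trax library rather than raw NumPy.)

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V for 2-D query/key/value arrays."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    return weights @ V                                 # weighted sum of values

# Toy example: 2 queries attending over 3 key/value pairs of dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4)
```

Each output row is a convex combination of the value rows, weighted by how strongly the corresponding query matches each key.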

Top reviews

RJ


Not up to expectations. Needs more explanation on some topics. Some were difficult to understand; examples might have helped!

SB


One of the best courses I have ever taken. The course provides in-depth learning of Transformers from the creators of Transformers.


226 - 246 of 246 Reviews for Natural Language Processing with Attention Models

By Azriel G

Nov 19, 2020

The labs in the last two courses were excellent. However, the lecture videos were not very useful for learning the material. I think the course material deserves a v2 set of videos with more in-depth intuitions and explanations, and details on attention and its many variants, etc. There is no need to oversimplify the video lectures; they should feel at a similar level to the labs (assignments tend to be "too easy", but I understand why that is needed). Thanks for the courses. Azriel Goldschmidt

By Kota M

Aug 23, 2021

This course gives a good overview of BERT and several other extensions such as T5 and the Reformer. I was able to learn the conceptual framework of the algorithms and understood what we can do with them. However, I think the instructors chose an undesirable mix of rigour and intuition: the lectures are mostly about intuition, while the assignments are very detailed and go through each logical step one by one.

By Nunzio V

Apr 7, 2021

Nice course, full of very interesting information. What a pity not having used TensorFlow. All that knowledge is unfortunately not work-ready, as Trax is not widely used in industry and it is unlikely it ever will be. In my opinion.

By Artem A

Aug 9, 2021

The explanation of attention models, the attention mechanism itself, and the other building blocks of the Transformer was very confusing. It was sometimes really hard to understand what the lecturer really meant.

By Michel M

Feb 9, 2021

The presented concepts are quite complex. I would prefer less detail, as most will not understand it anyway, and more conceptual information about why these models are built as they are.

By Zeev K

Oct 24, 2021

Not clear enough. The exercises weren't good enough; I didn't learn much from them. It would be a great idea to provide the slides at the end of every week for review.

By Huang J

Dec 23, 2020

Course videos are too short to convey the ideas behind the methodology. The illustrations are too rough.

By Maury S

Mar 13, 2021

Another less than impressive effort in a specialization from which I expected more.

By Prithviraj J

Dec 21, 2020

Explanations of attention/self-attention & other complex topics are too shallow

By Anurag S

Jan 3, 2021

Course content needs a more detailed explanation.

By ABHISHEK T

Apr 24, 2023

Elaborate more and make it easier to learn.

By Przem G

Feb 18, 2023

I would not have understood much if I hadn't known most of the material beforehand. Lots of repetition (not bad, just boring) but, worse, bugs as well. Many times the lecturer doesn't seem to know what he's talking about and messes things up. A characteristic moment is when, all of a sudden, he talks about things without definition (like "shared frame" or "adapter"), shows a diagram contradicting the code beside it, or changes subject abruptly.

The grader is terrible, happily returning errors but no explanation. You teach AI and talk about LMs beating humans, yet the tool used for evaluating your students is so primitive it might have been written two decades ago. It very likely infuriates everybody except its proud author. Either the code to fill in is trivial (we learn nothing), or it requires mental work which potentially leaves some traces. The effect is that the code works fine, but the grader fails miserably.

Like many of your courses, this one too teaches us more about the author's horizon and expectations than the new knowledge we pay for. This is particularly evident during quizzes, where poorly formulated questions, answerable only in a narrow context, abound. There are also bugs: for example, "translating French to English" requires you to mark "keys and values are the French words"...

By Yue W G

May 24, 2021

The content is good because it covers many aspects of NLP, and there are a lot of illustrations to help students understand the material. However, the assignments are too easy because of the detailed comments provided: students could simply copy and paste the answers from the comments.

One suggestion is to improve the explanation of the materials, because there are lots of details skipped by the instructors; personally, I had to read other blogs to understand some of them. Furthermore, separating the solutions from the code is definitely something that must be done, for instance by presenting the solutions in a separate notebook.

By Vitalii S

Jan 25, 2021

1) Information: 3 out of 5. No in-depth explanations.

2) Quizzes are too easy. I missed the good quizzes from the DL Specialization, with use cases that made me think about what to pick.

3) Home tasks: 1 out of 5.

3.1 First of all, the home tasks are all done in a different manner.

3.2 Some of them require additional checking even when all tests have passed.

3.3 The part with Google Colab is also a little bit strange... I want the home task to be one click away, not to be setting up a third-party environment.

What is good: as a high-level overview, this course is OK. Maybe have two versions of the course: one with in-depth explanations, and one more like this one.

By Arun

Feb 18, 2021

Compared to Andrew Ng's Deep Learning Specialization, this course requires a lot of improvement. Very often disparate facts are put together with not much connection between the ideas, probably because of the enormous amount of content covered. It might make sense to split the course into two. Thank you!

By Steven N

Apr 29, 2021

The course lectures were very confusing, and the course assignments were too easy, so they didn't reinforce the lecture concepts in the same way that assignments from other courses had.

By Mohsen A F

Oct 24, 2020

Like: State-of-the-art NLP problems to be used in the industry.

Dislike: Topics were not well explained and difficult to grasp.

By 一田木

Jul 23, 2023

Trax is no longer in active development.

By 何雪凝

Aug 7, 2024

Unclear explanations everywhere. Homework assignments cannot be graded.

By Ignacio d l S

Jan 8, 2022

Too easy. I can say I almost didn't learn anything.