
Learner Reviews & Feedback for Natural Language Processing with Attention Models by DeepLearning.AI

4.4 stars · 1,006 ratings

About the Course

In Course 4 of the Natural Language Processing Specialization, you will:

a) Translate complete English sentences into German using an encoder-decoder attention model
b) Build a Transformer model to summarize text
c) Use T5 and BERT models to perform question-answering
d) Build a chatbot using a Reformer model

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot!

Learners should have a working knowledge of machine learning, intermediate Python including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you've completed Course 3, Natural Language Processing with Sequence Models, before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and a co-author of TensorFlow, the Tensor2Tensor and Trax libraries, and the Transformer paper.
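To give a flavor of what the course covers: the attention models named above are built on scaled dot-product attention, softmax(QKᵀ/√d_k)·V. The following is a minimal NumPy sketch of that one operation only; the function name and the toy dimensions are illustrative and not taken from the course materials or the Trax library.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights = weights / weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

# Toy example: 2 queries attending over 3 key/value pairs, d_k = 4
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (2, 4): one context vector per query
```

Each row of `w` sums to 1, so every output is a convex combination of the value vectors; the Transformer, T5, BERT, and Reformer models in this course all stack variations of this building block.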

Top reviews

RJ


Not up to expectations. Needs more explanation on some topics. Some were difficult to understand, examples might have helped!!

SB


One of the best courses I have ever taken. The course provides in-depth learning of transformers from the creators of Transformers.


201 - 225 of 246 Reviews for Natural Language Processing with Attention Models

By Ankit K S

Nov 29, 2020

This is really an interesting specialization with lots of things to learn in the domain of NLP, ranging from basic to advanced concepts. It covers the state-of-the-art Transformer architecture in great detail. The only thing I felt uncomfortable with is the use of the Trax library in assignments.

By Vijay A

Nov 20, 2020

Covers the state of the art in NLP! We get an overview and a basic understanding of designing and using attention models. Each week deserves to be a course in itself; the creators could have designed a whole specialization on attention-based models so that we could learn and understand better.

By Alexandre B

May 20, 2023

This course is quite complete, as it presents the major hot NLP tasks with transformers, but unfortunately it presents only one framework: Trax, and not Hugging Face's, which is also really useful and widely used in the field. I would have liked a lesson about ChatGPT-like models.

By Naman B

Apr 28, 2021

It would have been better if we used standard frameworks like PyTorch instead of Trax. Also, the course videos are a bit confusing at times. It would have been great if the math had been taught the way Andrew Ng taught it in the Deep Learning course.

By Cees R

Nov 29, 2020

Not new to NLP, I enjoyed this course and learned things I didn't know before. From an educational perspective, I didn't like that the two "optional" exercises were far harder than the too-easy "fill in x here" assignments.

By Zicong M

Dec 14, 2020

Overall good quality, but it seems a bit short and the content is squeezed.

I don't like the push for Trax either; it has not yet become mainstream, and personally I don't find it helpful for my professional career.

By Jonas B

Apr 11, 2023

A good course that I can recommend without a doubt. I would strongly recommend complementing it by reading the additional resources (see week 4 -> "References") as well as the Hugging Face tutorial for NLP.

By Gonzalo A M

Jan 21, 2021

I think we could have gone deeper in this last course, because you taught a lot of complex concepts but I did not feel confident enough to replicate them. It would have been better to explain transformers in more detail.

By Cornel M

Jun 18, 2023

The lectures need more insight to convey not only the "how" but also a reasonable amount of the "why". Andrew is very good at doing this in his lectures, providing his intuitions and insights.

By Vishwam G

Mar 10, 2024

It could have been better if the Transformers library from Hugging Face were explored more, along with topics like Vision Transformers and the use of Transformers for computer vision.

By CLAUDIA R R

Sep 7, 2021

It's a great course, more difficult than I thought, but very well structured and explained. That said, more didactic free videos from other websites can complement the lessons.

By Anonymous T

Oct 15, 2020

Great course content, but go for this only if you have done the previous courses and have some background knowledge; otherwise you won't be able to relate.

By Qiao D

Nov 4, 2022

The content is great, but it will be even better if we have a more in-depth understanding of the knowledge rather than a very quick crash course.

By Moustafa S

Oct 3, 2020

Good course, covers everything I guess. The only downside for me is the Trax portion; I would have preferred TensorFlow, but still, great job.

By Mohan N

Mar 28, 2021

The course covers cutting edge content and the exercises are well paced. Found the transformer lessons a bit difficult to understand.

By Rahul J

Sep 29, 2020

Not up to expectations. Needs more explanation on some topics. Some were difficult to understand, examples might have helped!!

By veera s

Mar 18, 2022

Need more detailed explanation in the last course of this specialization, especially the Attention and BERT models.

By Vaseekaran V

Sep 20, 2021

It's a really good course to learn from and get introduced to attention models in NLP.

By David M

Oct 25, 2020

An amazing experience through state-of-the-art NLP models

By Roger K

May 17, 2022

Labs required a bit more context to understand.

By Shaojuan L

Dec 18, 2020

The programming assignment is too simple

By Fatih T

Feb 4, 2021

great explanation of the topic I guess!

By Sreang R

Dec 22, 2020

Awesome course

By Mark B

Jul 8, 2024

I hope it's OK to review the specialization as a whole here, as I'd really like to give some overall context in one review. I'm sorry to have to write a negative-sounding review off the back of covering this specialization, given that I appreciate the efforts involved, never mind the background expertise, but if it were not for the high quality of the Jupyter notebooks, I would have requested a refund. I have followed machine learning material on Coursera before, and the overall standard of instruction was significantly higher than what I encountered here. I'm always after a balanced learning experience to help me ramp up on a given topic, and while the notebooks were very comprehensively put together and certainly helpful in completing the programming assignments (even more so for the last two courses, which were clearly trickier, both conceptually and practically, than the first two), the lecture delivery wasn't great overall. Please take a leaf out of Andrew Ng's book here: when you are showing basic slides, annotate as you are talking, or at least make use of animations rather than narrating over static slides alone. And for the non-math majors out there who are well positioned to cover this material once they have the stated prerequisites, use more numbers to move from the concrete to the abstract. I've seen Andrew do this very well, and any learner such as myself who is happy to invest time, money, and a heap of interest frankly deserves a more comprehensive and more consumable delivery. I have also, sad to say, seen far better introductions to LSTMs, for example; to really appreciate their power as well as their shortcomings, I researched and followed better introductions from several other sources, never mind the Transformer-related topics.
I was also disappointed by the overly brief treatment of the truly excellent Hugging Face resources at the time I covered this content. Again, after taking more time out to explore how to work with Hugging Face and consulting much better learning material on it to help me ramp up efficiently, I felt I only got a proper introduction to it outside what was presented in the last course, and that, I'm afraid, also disappointed me, even more so given the expertise behind the material. It was great to have a good fine-tuning lab, but it would have been even better to have a more generic way of preparing a dataset than the one presented, which is not exactly for everyone (or even most people, I suspect). The content at this stage felt like the end of a day at a seminar where the presenter is running out of time and you end up with a sketchy whistle-stop tour of the platform and concepts. I'm sorry if this sounds harsh to some, but given what this course could have been with the clear level of expertise behind it, the material as a whole was quite unbalanced. There were clearly too many individual learning points left to the programming assignments, making some of the explanatory text overly long-winded; it felt really crammed in. I, along with many other students, should have been prepped much better with more code samples, particularly as, when I took the course, I was more familiar with PyTorch and Keras than with working directly with TensorFlow. More instructional support would have been welcome here, given that this wasn't on the prerequisites list when I consulted it. I'm happy to have put in plenty of time and effort, but the instructional content should match up to the student's efforts. To avoid making this review much longer, I won't say anything about the quality of the quiz material; what's been written in the forums bears out the shortcomings more than sufficiently, in my view.
A small point, I know, but there were numerous pop-ups immediately correcting the narrative. I'd always be OK with one or two, but there were several that really interrupted the flow; this is low-hanging fruit, so please fix it and please bear the above points in mind. That said, I did learn and was very happy to do so, so again, I appreciate the efforts overall. The reason I took this course was off the back of machine learning material I followed previously on Coursera, which was of a very high standard and really helped me use my time to research further; for this specialization, though, I had to immediately research elsewhere to get the basics down and compensate for a less effective delivery. I'm really sorry to say all of the above, as I appreciate the experienced minds and the efforts made. While I certainly benefited from completing this specialization, the overall standard of delivery wasn't effective enough for me and, I suspect, for a significant number of folks relatively new to NLP. I would have loved nothing more than to give a positive review, but all of the above is more than fair comment, I'm afraid. This course has the potential to be a fantastic resource with some considered instructional rework.

By Erik S

Dec 23, 2023

Overall I found this course underwhelming, especially in comparison to the other three courses in this specialization. The lecture videos don't seem to flow coherently, lots of terminology is introduced without being defined, there seems to be hand-waving over details, and it feels a bit infantilizing when videos end with statements like "you now understand the T5 architecture" or "you now know how to fine-tune transformer models" when those concepts are not really explained in any meaningful detail in the videos. There also seems to be a big gap between how complex and advanced these topics are and how trivially easy the programming assignments are, with most of the logic implemented for us; completing most exercises doesn't require much more than reading comments to replace "None" values or copying code from preceding cells. In previous courses in this specialization, the assignments felt like truer assessments of what we'd learned. I hope this course gets a refresh for future students!