
Learner Reviews & Feedback for Generative AI with Large Language Models by DeepLearning.AI

4.8
stars
2,270 ratings

About the Course

In Generative AI with Large Language Models (LLMs), you'll learn the fundamentals of how generative AI works and how to deploy it in real-world applications. By taking this course, you'll learn to:

- Deeply understand generative AI, describing the key steps in a typical LLM-based generative AI lifecycle, from data gathering and model selection to performance evaluation and deployment
- Describe in detail the transformer architecture that powers LLMs, how they are trained, and how fine-tuning enables LLMs to be adapted to a variety of specific use cases
- Use empirical scaling laws to optimize the model's objective function across dataset size, compute budget, and inference requirements
- Apply state-of-the-art training, tuning, inference, tools, and deployment methods to maximize the performance of models within the specific constraints of your project
- Discuss the challenges and opportunities that generative AI creates for businesses after hearing stories from industry researchers and practitioners

Developers who have a good foundational understanding of how LLMs work, as well as the best practices behind training and deploying them, will be able to make good decisions for their companies and more quickly build working prototypes. This course will support learners in building practical intuition about how to best utilize this exciting new technology.

This is an intermediate course, so you should have some experience coding in Python to get the most out of it. You should also be familiar with the basics of machine learning, such as supervised and unsupervised learning, loss functions, and splitting data into training, validation, and test sets. If you have taken the Machine Learning Specialization or Deep Learning Specialization from DeepLearning.AI, you'll be ready to take this course and dive deeper into the fundamentals of generative AI.
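As a flavor of the "empirical scaling laws" item above, here is a minimal Python sketch of the widely cited Chinchilla-style rule of thumb. The specific numbers are assumptions, not course material: roughly 20 training tokens per model parameter, and training compute approximated as C ≈ 6 · N · D FLOPs for N parameters and D tokens.

```python
# Hedged sketch of compute-optimal sizing (Chinchilla-style heuristic).
# Assumptions, not taken from the course: ~20 tokens per parameter, and
# training compute C ~= 6 * N * D FLOPs for N params and D tokens.

TOKENS_PER_PARAM = 20     # rule-of-thumb ratio from the Chinchilla paper
FLOPS_PER_PARAM_TOKEN = 6  # common approximation for dense transformers

def compute_optimal_tokens(n_params: float) -> float:
    """Approximate compute-optimal number of training tokens for a model."""
    return TOKENS_PER_PARAM * n_params

def training_flops(n_params: float, n_tokens: float) -> float:
    """Approximate total training compute in FLOPs."""
    return FLOPS_PER_PARAM_TOKEN * n_params * n_tokens

if __name__ == "__main__":
    n = 7e9  # a hypothetical 7B-parameter model
    d = compute_optimal_tokens(n)
    print(f"tokens: {d:.2e}, training compute: {training_flops(n, d):.2e} FLOPs")
```

Under these assumptions, a 7B-parameter model would want on the order of 140B training tokens; the point of the course's scaling-laws material is trading these quantities off against a fixed compute budget.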

Top reviews

OK

Jan 28, 2024

Easily a five star course. You will get a combination of overview of advanced topics and in depth explanation of all necessary concepts. One of the best in this domain. Good work. Thank you teachers!

C

Jul 10, 2023

A very good course covering many different areas, from use cases, to the mathematical underpinnings and the societal impacts. And having the labs to actually get to play around with the algorithms.


476 - 500 of 609 Reviews for Generative AI with Large Language Models

By Oleksandr F

May 23, 2024

Excellent

By Karthik R

May 8, 2024

It's good!

By Anindita D

Sep 23, 2023

Very Good

By Nizamudheen T

Sep 10, 2023

Thank You

By Shuxiang Z

Jul 24, 2023

Loved it!

By Maciej J

Jan 9, 2024

Awesome!

By David G G G

Jun 29, 2023

Amazing!

By Abdullah B

Mar 20, 2024

Perfect

By Vipul C H

Nov 30, 2023

thanks

By Praveen H

Sep 25, 2023

superb

By Justin H

Sep 2, 2023

Brutal

By Николай Б

Jul 30, 2023

Great

By Simone L

Aug 21, 2023

Super

By mehmet o

Aug 6, 2023

great

By Khawla E

Mar 30, 2024

good

By Buri B

Mar 3, 2024

nice

By Pawar N R

Feb 25, 2024

good

By zed a

Jan 24, 2024

good

By Padma M

Dec 11, 2023

good

By Fraz

Dec 10, 2023

All the instructors were good and the delivery was mostly excellent; however, the course was a bit too short and could be improved in several ways. There were very few quizzes in the video lectures, and the ones that were present were too easy or obvious (they did not require much thinking). There should be good, quality quizzes in most video lessons, similar to the OG ML course by Andrew Ng. The inline quizzes in videos help "reinforce" the learning in humans. This is proven by the research yet to be carried out :D Another aspect that I did not like was the Jupyter notebooks used to run the exercises: all solutions were already provided, and it does not help in learning the concepts if all we have to do is press Shift+Enter and merely observe the code and results. Actual learning requires some trial and error as part of the exercises; once again, the OG ML course by Andrew Ng did a good job of accomplishing this with the Octave exercises.

By Jose M L M

Nov 2, 2023

A delightful and very up-to-date (most of the references have been published in the last 2 years) overview of LLMs with hands-on lab sessions in Python. Prompt engineering, zero/one/few-shot inference, instruction fine-tuning (FT), parameter-efficient FT (PEFT), Low-Rank Adaptation (LoRA), RL from human feedback, program-aided language (PAL) models, retrieval-augmented generation (RAG), etc. In short, everything you need to know about the state of the art in LLMs in 2023. There are a couple of things that disappointed me, though. The first is that, unlike other Coursera courses, there isn't any discussion forum to exchange ideas with other students or post questions. The second is that there isn't any clear contact (either the course's instructors or Coursera) for questions about problems with the AWS platform when working on the labs.
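For readers unfamiliar with LoRA, one of the techniques listed in the review above, the core idea is to freeze a weight matrix W and train a low-rank update B·A in its place. A minimal pure-Python sketch (illustrative only; not the course's lab code, and the layer dimensions are assumed) of why this saves trainable parameters:

```python
# Hedged sketch of the LoRA idea: instead of fine-tuning a full
# d_out x d_in weight matrix W, keep W frozen and train a low-rank
# update W + B @ A, where B is d_out x r and A is r x d_in, r << d.
# Pure Python so the example stays self-contained; shapes are illustrative.

def full_finetune_params(d_out: int, d_in: int) -> int:
    """Trainable parameters when updating the full matrix W directly."""
    return d_out * d_in

def lora_params(d_out: int, d_in: int, r: int) -> int:
    """Trainable parameters for the factors B (d_out x r) and A (r x d_in)."""
    return d_out * r + r * d_in

if __name__ == "__main__":
    d_out, d_in, r = 4096, 4096, 8  # a transformer-sized layer, small rank
    full = full_finetune_params(d_out, d_in)
    lora = lora_params(d_out, d_in, r)
    print(f"full: {full:,}  lora: {lora:,}  reduction: {full // lora}x")
```

With these assumed shapes the low-rank factors hold 65,536 parameters versus 16,777,216 for the full matrix, a 256x reduction, which is the kind of saving that makes fine-tuning feasible in the course labs' modest compute budget.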

By Sun X

Sep 15, 2023

Good entry-level course in general. Thanks to the course team for bringing us one of a few online courses on this timely topic.

I really like the lab sessions, although they could be further improved by adding some exercises, like writing the code for a whole LLM task.

The Proximal Policy Optimization lecture by Dr. Ehsan Kamalinejad is fantastic. It helped me read the PPO paper with both quantitative and intuitive understanding. In comparison, the sections on some important LLM architectures, such as the Transformer and InstructGPT, are a bit too intuitive.

The final week is way too packed. Students need to know more than just the names and a short intro of new LLM techniques or architectures. It would be better to have a separate lab for each topic (such as PTQ, RAG, etc.) for learners to REALLY understand what's going on.

By Rohith K

Dec 31, 2023

Good overview of the different stages of developing an LLM application. I felt it gave me enough knowledge to understand the current research and applications being developed for large language models. The course gives you links to relevant papers that you can read for more in-depth coverage of how some of the latest LLMs are constructed and trained. I would have liked the labs to be more hands-on: you basically run pre-built lab notebooks that use existing model implementations from widely available libraries like HuggingFace. There was no requirement to write any code.

By Mike R

Aug 14, 2023

This is a good introduction. The lectures are very good and cover many critical aspects of training LLMs, such as PEFT, RLHF, and challenges with scaling and deployment. The lab notebooks go through the lecture ideas, though you just need to run the notebook; there are no exercises in the labs for you to do, which is why I have given 4 stars, as usually the labs are where the learning happens.

I hope that this turns into a specialisation with exercises, as the teaching team really know their stuff and there are almost no other ways to get curated learning in LLMs right now.

By Eliu M M

Mar 1, 2024

A course with a lot of new information, very well explained even for people with no prior knowledge of the subject. It's not merely an introductory course on the theory; there's a lot of technical explanation of the LLM core. In the end, I can say that I understand much better what LLMs are, how they are structured, their architecture, how they are trained, and how they are aligned to strive for improved capabilities. The labs are very supportive, providing enough guidance to understand the direction of the programming and what needs to be done, though admittedly I could not have programmed any of it on my own.