IBM
Generative AI Language Modeling with Transformers

Taught in English

Course

Gain insight into a topic and learn the fundamentals

Instructors: Joseph Santarcangelo, Fateme Akbari, Adrian Wang

Intermediate level

Recommended experience

8 hours to complete
3 weeks at 2 hours a week
Flexible schedule
Learn at your own pace

What you'll learn

  • Explain the concept of attention mechanisms in transformers, including their role in capturing contextual information.

  • Describe language modeling with the decoder-based GPT and encoder-based BERT.

  • Implement positional encoding, masking, the attention mechanism, and document classification, and create LLMs like GPT and BERT.

  • Use transformer-based models and PyTorch functions for text classification, language translation, and modeling.
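The attention mechanism named in the outcomes above can be sketched in a few lines of PyTorch. This is a minimal illustration of scaled dot-product attention with a causal mask, not the course's lab code; the tensor sizes are arbitrary toy choices:

```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, seq_len, d_k); mask: (seq_len, seq_len), 0 = blocked position
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)   # query-key similarity, scaled
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)                 # each row sums to 1
    return weights @ v, weights

x = torch.randn(1, 4, 8)                 # toy batch: 4 tokens, 8-dim embeddings
causal = torch.tril(torch.ones(4, 4))    # lower-triangular mask for language modeling
out, w = scaled_dot_product_attention(x, x, x, causal)
print(out.shape)   # torch.Size([1, 4, 8])
```

With the causal mask in place, each token can only attend to itself and earlier tokens, which is what lets a decoder predict the next token.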

Details to know

Shareable certificate

Add to your LinkedIn profile

Recently updated!

May 2024

Assessments

6 assignments


Earn a career certificate

Add this credential to your LinkedIn profile, resume, or CV

Share it on social media and in your performance review


There are 2 modules in this course

In this module, you will learn techniques for positional encoding and how to implement positional encoding in PyTorch. You will learn how the attention mechanism works and how to apply it to word embeddings and sequences. You will also learn how self-attention mechanisms help in simple language modeling to predict the next token. In addition, you will learn about the scaled dot-product attention mechanism with multiple heads and how the transformer architecture enhances the efficiency of attention mechanisms. You will also learn how to implement a series of encoder layer instances in PyTorch. Finally, you will learn how to use transformer-based models for text classification, including creating the text pipeline, building the model, and training it.
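As an illustration of the positional-encoding topic this module covers, here is a minimal PyTorch sketch of the standard sinusoidal scheme; the `max_len` and `d_model` values are arbitrary toy choices, not taken from the course labs:

```python
import math
import torch

def positional_encoding(max_len, d_model):
    # Sinusoidal scheme: even dimensions use sine, odd dimensions use cosine,
    # at geometrically spaced frequencies, so each position gets a unique signature.
    pe = torch.zeros(max_len, d_model)
    position = torch.arange(max_len, dtype=torch.float).unsqueeze(1)
    div_term = torch.exp(torch.arange(0, d_model, 2, dtype=torch.float)
                         * (-math.log(10000.0) / d_model))
    pe[:, 0::2] = torch.sin(position * div_term)
    pe[:, 1::2] = torch.cos(position * div_term)
    return pe

pe = positional_encoding(max_len=50, d_model=16)
embeddings = torch.randn(50, 16)   # toy word embeddings
x = embeddings + pe                # positions are injected by simple addition
```

Because the encoding is added to the embeddings rather than concatenated, the model's dimensionality stays unchanged while attention layers gain access to token order.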

What's included

6 videos • 3 readings • 2 assignments • 2 app items • 1 plugin

In this module, you will learn about decoders and GPT-like models for language translation, train the models, and implement them using PyTorch. You will also gain knowledge of encoder models with Bidirectional Encoder Representations from Transformers (BERT) and pretrain them using masked language modeling (MLM) and next sentence prediction (NSP). You will also perform data preparation for BERT using PyTorch. Finally, you will learn about the applications of transformers for translation by understanding the transformer architecture and performing its PyTorch implementation. The hands-on labs in this module will give you solid practice in using the decoder model, encoder model, and transformers for real-world applications.
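The MLM data preparation this module covers can be sketched roughly as follows. The 80/10/10 corruption split follows the original BERT recipe; the mask token id 103 and vocabulary size 30522 are assumptions matching the common bert-base-uncased tokenizer, not values taken from the course labs:

```python
import torch

def mask_tokens(input_ids, mask_token_id=103, vocab_size=30522, mlm_prob=0.15):
    # BERT-style MLM corruption: sample ~15% of positions as prediction targets;
    # of those, ~80% become [MASK], ~10% a random token, ~10% stay unchanged.
    labels = input_ids.clone()
    masked = torch.bernoulli(torch.full(labels.shape, mlm_prob)).bool()
    labels[~masked] = -100                                # ignored by the loss
    corrupted = input_ids.clone()
    replace = torch.bernoulli(torch.full(labels.shape, 0.8)).bool() & masked
    corrupted[replace] = mask_token_id
    random_tok = (torch.bernoulli(torch.full(labels.shape, 0.5)).bool()
                  & masked & ~replace)
    corrupted[random_tok] = torch.randint(vocab_size, labels.shape)[random_tok]
    return corrupted, labels

ids = torch.randint(5, 1000, (2, 12))   # toy batch of token ids
corrupted, labels = mask_tokens(ids)
```

The label value -100 marks positions the cross-entropy loss should skip, so the model is only graded on the tokens it was asked to reconstruct.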

What's included

10 videos • 6 readings • 4 assignments • 4 app items • 2 plugins

Instructors

Joseph Santarcangelo
IBM
28 Courses • 1,394,394 learners

Offered by

IBM

Why people choose Coursera for their career

Felipe M.
Learner since 2018
"To be able to take courses at my own pace and rhythm has been an amazing experience. I can learn whenever it fits my schedule and mood."
Jennifer J.
Learner since 2020
"I directly applied the concepts and skills I learned from my courses to an exciting new project at work."
Larry W.
Learner since 2021
"When I need courses on topics that my university doesn't offer, Coursera is one of the best places to go."
Chaitanya A.
"Learning isn't just about being better at your job: it's so much more than that. Coursera allows me to learn without limits."

New to Machine Learning? Start here.


Open new doors with Coursera Plus

Unlimited access to 7,000+ world-class courses, hands-on projects, and job-ready certificate programs - all included in your subscription

Advance your career with an online degree

Earn a degree from world-class universities - 100% online

Join over 3,400 global companies that choose Coursera for Business

Upskill your employees to excel in the digital economy

Frequently asked questions