This course covers the fundamentals and advanced applications of BERT and GPT models. You will learn how BERT processes text, including tokenization and vectorization, and practice fine-tuning BERT for tasks such as sequence classification, token classification, and question answering. The course also explains how GPT generates text, adapts to different writing styles, and can be fine-tuned for tasks like translating English to code. Additional topics include semantic search using Siamese BERT and multi-task learning with GPT through prompt engineering. By the end of the course, you will have the practical skills and theoretical understanding needed to apply BERT and GPT to various natural language processing problems.
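The tokenization step mentioned above can be illustrated with a toy version of the WordPiece scheme BERT uses: greedy longest-match-first segmentation of each word into subword units. This is a simplified sketch with a made-up vocabulary, not BERT's real tokenizer (the course presumably uses a library implementation such as Hugging Face's).

```python
# Simplified sketch of WordPiece-style subword tokenization, the scheme
# BERT uses to split words into vocabulary units before vectorization.
# The tiny vocabulary below is illustrative, not BERT's actual vocabulary.

def wordpiece_tokenize(word, vocab):
    """Greedy longest-match-first segmentation of a single word."""
    tokens = []
    start = 0
    while start < len(word):
        end = len(word)
        piece = None
        while start < end:
            candidate = word[start:end]
            if start > 0:
                candidate = "##" + candidate  # continuation pieces get a ## prefix
            if candidate in vocab:
                piece = candidate
                break
            end -= 1
        if piece is None:
            return ["[UNK]"]  # no vocabulary piece matches: whole word is unknown
        tokens.append(piece)
        start = end
    return tokens

vocab = {"play", "##ing", "##ed", "un", "token", "##ize", "##r"}
print(wordpiece_tokenize("playing", vocab))    # ['play', '##ing']
print(wordpiece_tokenize("tokenizer", vocab))  # ['token', '##ize', '##r']
```

The resulting subword IDs are what BERT actually embeds into vectors; rare words decompose into known pieces instead of being discarded.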

Introduction to Transformer Models for NLP: Unit 2
This course is part of the Introduction to Transformer Models for NLP Specialization

Instructor: Pearson
What you'll learn
Master the architectures and core mechanisms of BERT and GPT for natural language understanding and generation.
Fine-tune pre-trained models for advanced NLP tasks such as classification, question answering, and semantic search.
Apply hands-on techniques to customize BERT and GPT for specific domains and writing styles.
Utilize prompt engineering and few-shot learning to solve multiple NLP tasks efficiently.
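The prompt-engineering and few-shot ideas in the list above amount to steering one GPT-style model toward different tasks purely through the examples placed in its input. A minimal sketch of assembling such a prompt (the task and examples here are illustrative, not from the course materials):

```python
# Sketch of few-shot prompt construction: an instruction, a handful of
# worked examples, and a new query are concatenated into a single prompt
# that a GPT-style model would then complete.

def build_few_shot_prompt(instruction, examples, query):
    """Assemble instruction, worked examples, and a new query into one prompt."""
    lines = [instruction, ""]
    for inp, out in examples:
        lines.append(f"Input: {inp}")
        lines.append(f"Output: {out}")
        lines.append("")
    lines.append(f"Input: {query}")
    lines.append("Output:")  # the model's completion supplies the answer
    return "\n".join(lines)

prompt = build_few_shot_prompt(
    "Translate English to French.",
    [("cat", "chat"), ("dog", "chien")],
    "bird",
)
print(prompt)
```

Swapping the instruction and examples retargets the same model to a different task with no fine-tuning, which is the sense in which prompting enables multi-task use.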
Skills you'll gain
Details to know
6 assignments
August 2025
Build your subject-matter expertise
- Learn new concepts from industry experts
- Gain a foundational understanding of a subject or tool
- Develop job-relevant skills with hands-on projects
- Earn a shareable career certificate

There is 1 module in this course
This module provides a comprehensive exploration of modern transformer-based models for natural language processing. It covers the foundational architectures and mechanisms of BERT and GPT, delving into their pre-training, fine-tuning, and practical applications. Through hands-on lessons, learners engage with real-world tasks such as sequence and token classification, question answering, semantic search, and text generation. The module emphasizes both theoretical understanding and practical skills, enabling students to leverage BERT and GPT for a wide range of NLP challenges, including multi-task learning and adapting models to new domains or writing styles.
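The semantic search task described above is typically done with a Siamese (bi-encoder) BERT: queries and documents are embedded into one vector space by the same encoder, then ranked by cosine similarity. The sketch below uses toy vectors standing in for real embeddings (a production system would obtain them from a model such as Sentence-BERT):

```python
# Sketch of semantic search with a Siamese/bi-encoder setup: all texts are
# embedded into one vector space, and relevance is cosine similarity.
# The toy 3-d vectors below stand in for real BERT sentence embeddings.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def semantic_search(query_vec, doc_vecs):
    """Return (doc_index, score) pairs sorted best-first."""
    scores = [(i, cosine(query_vec, v)) for i, v in enumerate(doc_vecs)]
    return sorted(scores, key=lambda s: s[1], reverse=True)

docs = [[0.9, 0.1, 0.0], [0.1, 0.8, 0.3], [0.2, 0.2, 0.9]]  # pretend embeddings
query = [0.8, 0.2, 0.1]
ranking = semantic_search(query, docs)
print(ranking[0][0])  # index of the best-matching document
```

Because both sides are encoded independently, document embeddings can be computed once and cached, which is what makes this architecture practical for search.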
What's included
24 videos · 6 assignments
Earn a career certificate
Add this credential to your LinkedIn profile, resume, or CV. Share it on social media and in your performance review.