This course covers the development of natural language processing (NLP), starting with basic concepts and moving to modern transformer architectures. You will learn about attention mechanisms and their impact on language modeling, as well as the details of transformer models, including scaled dot product attention and multi-headed attention. The course includes practical exercises in transfer learning using pre-trained models such as BERT and GPT, with instruction on fine-tuning these models for specific NLP tasks in PyTorch. By the end, you will understand the theory behind current NLP models and gain practical experience in applying them to real-world problems.



Introduction to Transformer Models for NLP: Unit 1
This course is part of the Introduction to Transformer Models for NLP Specialization

Instructor: Pearson
What you'll learn
Understand the evolution of NLP architectures and the transformative impact of attention mechanisms.
Analyze the structure and mathematical foundations of transformer models, including scaled dot-product and multi-headed attention (a brief sketch follows this list).
Apply transfer learning techniques using pre-trained language models such as BERT and GPT.
Gain practical experience with PyTorch to fine-tune NLP models for custom tasks.
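To give a flavor of the attention mechanism named above, here is a minimal, illustrative PyTorch sketch of scaled dot-product attention. The function name and tensor shapes are assumptions for the example, not course material.

```python
import math
import torch

def scaled_dot_product_attention(query, key, value, mask=None):
    """Minimal sketch of scaled dot-product attention (illustrative only).

    query, key, value: tensors of shape (batch, heads, seq_len, d_k).
    """
    d_k = query.size(-1)
    # Similarity scores, scaled by sqrt(d_k) to keep the softmax well-behaved.
    scores = torch.matmul(query, key.transpose(-2, -1)) / math.sqrt(d_k)
    if mask is not None:
        # Positions where mask == 0 are excluded from attention.
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = torch.softmax(scores, dim=-1)
    # Weighted sum of values; multi-headed attention runs this per head in parallel.
    return torch.matmul(weights, value), weights
```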

There is 1 module in this course
This module explores the evolution of natural language processing (NLP) through the development and application of attention mechanisms and transformer architectures. Beginning with the history and foundational concepts of attention in language models, it delves into the transformative impact of transformers and their unique attention mechanisms. The module concludes with practical instruction on transfer learning, demonstrating how to fine-tune state-of-the-art pre-trained models like BERT and GPT using PyTorch to achieve advanced NLP results.
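As a rough illustration of the fine-tuning workflow described above, the sketch below adapts a pre-trained BERT checkpoint to a toy two-class task in PyTorch. It assumes the Hugging Face transformers library, which the course may or may not use; the checkpoint name, data, and hyperparameters are placeholders.

```python
import torch
from torch.optim import AdamW
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Placeholder checkpoint and toy data; a real task would use a proper dataset and DataLoader.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased", num_labels=2)

texts = ["a thoughtful, engaging film", "a slow and tedious watch"]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

optimizer = AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few passes over the tiny placeholder batch
    outputs = model(**batch, labels=labels)  # the model returns a loss when labels are supplied
    outputs.loss.backward()
    optimizer.step()
    optimizer.zero_grad()
```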
What's included
14 videos, 3 assignments