University of Glasgow

Generative Pre-trained Transformers (GPT)

Large language models such as GPT-3.5, which powers ChatGPT, are changing how humans interact with computers and how computers can process text. This course introduces the fundamental ideas of natural language processing and language modelling that underpin these large language models. We will explore the basics of how language models work and the specifics of how newer neural approaches are built. We will examine the key innovations that have enabled Transformer-based large language models to become dominant in solving a variety of language tasks. Finally, we will consider the challenges of applying these models to real problems, including the ethical issues involved in their construction and use. Through hands-on labs, we will study the building blocks of Transformers and apply them to generate new text. These Python exercises step you through applying a smaller language model and understanding how it can be evaluated and applied to various problems. Regular practice quizzes will help reinforce the knowledge and prepare you for the graded assessments.

Skills: LLM Application · Deep Learning
Intermediate · Course · 13 hours

Featured reviews

RH

5.0 · Reviewed Jan 20, 2024

I liked the course. It was informative, with a few coding assignments, though the coding assignments could be a bit more in depth.

CP

5.0 · Reviewed Feb 28, 2024

Great overview of GPT, with some labs and very recent information. Prior deep learning training is recommended.

All reviews


christophe poujol
5.0
Reviewed Feb 29, 2024
Joris Coddé
4.0
Reviewed Dec 21, 2023
Anders Lyhne Christensen
4.0
Reviewed Oct 5, 2023
Milad Afshari
1.0
Reviewed Nov 10, 2023
Daniel Medeiros Rocha
5.0
Reviewed May 14, 2024
Roel Heremans
5.0
Reviewed Jan 21, 2024
Sabri Dehar
5.0
Reviewed Dec 3, 2023
marco sera
4.0
Reviewed Mar 9, 2024
John DeSanto
2.0
Reviewed Oct 28, 2024