
About the Course
This hands-on course is designed to demystify the inner workings of Large Language Models (LLMs) by guiding you through building one from the ground up — without relying on HuggingFace or pre-trained models. Over 2 weeks, you’ll implement every core component using Python, NumPy, and PyTorch (or JAX), gaining a deep understanding of tokenization, attention mechanisms, training loops, and inference logic. By the end, you’ll have your very own TinyGPT model that you can train, generate text with, and deploy!
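To give a flavor of the from-scratch approach, here is what one of those components can look like. A character-level tokenizer, the simplest tokenization scheme, fits in a few lines of plain Python. This is a minimal illustrative sketch, not the course's actual code; the CharTokenizer class and its method names are hypothetical.

    # A minimal character-level tokenizer sketch in plain Python.
    # Hypothetical names (CharTokenizer, encode, decode), not the course's actual API.

    class CharTokenizer:
        """Maps each unique character in a corpus to an integer id."""

        def __init__(self, text: str):
            chars = sorted(set(text))                           # deterministic vocabulary
            self.stoi = {ch: i for i, ch in enumerate(chars)}   # char -> id
            self.itos = {i: ch for ch, i in self.stoi.items()}  # id -> char

        def encode(self, s: str) -> list[int]:
            return [self.stoi[ch] for ch in s]

        def decode(self, ids: list[int]) -> str:
            return "".join(self.itos[i] for i in ids)

    tok = CharTokenizer("hello world")
    ids = tok.encode("hello")
    print(ids)              # [3, 2, 4, 4, 5] for this corpus
    print(tok.decode(ids))  # "hello"

Production LLMs typically use subword tokenizers (for example, byte-pair encoding), but the encode/decode interface stays the same.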
Course Content
Day 1 – LLM Foundations & Math Prerequisites
- What is an LLM and How Does It Work?
- Mathematical Essentials for Transformers (see the sketch below)
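As a preview of those mathematical essentials: the core operation of a transformer is scaled dot-product attention, out = softmax(Q K^T / sqrt(d_k)) V, where Q, K, and V are the query, key, and value matrices. It requires nothing beyond matrix multiplication and a softmax, as the NumPy sketch below shows. The function names and toy shapes are my own assumptions for illustration, not course material.

    # Scaled dot-product attention in NumPy: softmax(Q K^T / sqrt(d_k)) V.
    # A minimal single-head sketch; variable names and shapes are illustrative.
    import numpy as np

    def softmax(x, axis=-1):
        x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
        e = np.exp(x)
        return e / e.sum(axis=axis, keepdims=True)

    def attention(Q, K, V):
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)    # (seq_q, seq_k) similarity matrix
        weights = softmax(scores, axis=-1) # each query's weights sum to 1
        return weights @ V                 # weighted average of value vectors

    rng = np.random.default_rng(0)
    Q = rng.normal(size=(4, 8))  # 4 query positions, model dimension 8
    K = rng.normal(size=(4, 8))
    V = rng.normal(size=(4, 8))
    print(attention(Q, K, V).shape)  # (4, 8)

The sqrt(d_k) scaling keeps the dot products from growing with dimension, which would otherwise push the softmax into near one-hot saturation.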