About Course

This hands-on course demystifies the inner workings of Large Language Models (LLMs) by guiding you through building one from the ground up, without relying on HuggingFace or pre-trained models. Over two weeks, you’ll implement every core component using Python, NumPy, and PyTorch (or JAX), gaining a deep understanding of tokenization, attention mechanisms, training loops, and inference logic. By the end, you’ll have your own TinyGPT model that you can train, generate text with, and deploy.

What Will You Learn?

  • Core architecture of transformer-based LLMs
  • How to build a tokenizer from scratch (character or BPE)
  • Manual implementation of attention, positional encoding, and residual layers
  • Training loop construction using PyTorch or NumPy
  • Text generation using sampling and greedy decoding methods (see the sketch after this list)
  • Deployment of your LLM as a command-line or web app
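
For a taste of the generation methods covered, here is a minimal sketch of greedy and temperature-based sampling decoding in PyTorch. The `model` interface (token ids in, logits of shape `(batch, seq, vocab)` out) is an assumption for illustration, not the course's exact API.

```python
import torch

@torch.no_grad()
def generate(model, ids, steps, temperature=0.0):
    # ids: (1, seq) tensor of token ids; `model` is a hypothetical decoder
    # returning (1, seq, vocab) logits.
    for _ in range(steps):
        logits = model(ids)[:, -1, :]      # logits at the last position
        if temperature == 0.0:             # greedy: pick the most likely token
            next_id = logits.argmax(dim=-1, keepdim=True)
        else:                              # sampling: soften logits, then draw
            probs = torch.softmax(logits / temperature, dim=-1)
            next_id = torch.multinomial(probs, num_samples=1)
        ids = torch.cat([ids, next_id], dim=-1)
    return ids
```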

Course Content

Day 1 – LLM Foundations & Math Prerequisites

  • What is an LLM and How Does It Work?
  • Mathematical Essentials for Transformers

Day 2 – Tokenization and Vocabulary Building
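
As a preview of Day 2, a minimal character-level tokenizer sketch in plain Python (the BPE variant is built during the course):

```python
class CharTokenizer:
    """Minimal character-level tokenizer: one id per unique character."""

    def __init__(self, text):
        chars = sorted(set(text))                  # fixed vocabulary from the corpus
        self.stoi = {ch: i for i, ch in enumerate(chars)}
        self.itos = {i: ch for ch, i in self.stoi.items()}

    def encode(self, s):
        return [self.stoi[ch] for ch in s]         # string -> list of ids

    def decode(self, ids):
        return "".join(self.itos[i] for i in ids)  # list of ids -> string

tok = CharTokenizer("hello world")
assert tok.decode(tok.encode("hello")) == "hello"
```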

Day 3 – Attention Mechanism from Scratch
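
The core of Day 3 is scaled dot-product attention; a minimal NumPy sketch, assuming Q, K, and V each have shape `(seq, d_k)`:

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # (seq, seq) scaled similarity scores
    weights = softmax(scores, axis=-1)       # each row sums to 1
    return weights @ V                       # weighted sum of value vectors
```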

Day 4 – Multi-Head Attention & Positional Encoding
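
One standard choice for Day 4's positional encoding is the sinusoidal scheme from the original Transformer paper; a NumPy sketch (assumes an even `d_model`):

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    # pe[pos, 2i]   = sin(pos / 10000^(2i/d_model))
    # pe[pos, 2i+1] = cos(pos / 10000^(2i/d_model))
    pos = np.arange(seq_len)[:, None]
    i = np.arange(0, d_model, 2)[None, :]
    angles = pos / np.power(10000.0, i / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)
    pe[:, 1::2] = np.cos(angles)
    return pe  # added to token embeddings before the first block
```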

Day 5 – Build a Mini Transformer Block

Day 6 – Create Training Data & Setup Training Loop
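
A minimal sketch of the next-token training step assembled on Days 6–7, assuming a PyTorch `model` that maps `(batch, seq)` token ids to `(batch, seq, vocab)` logits (names are illustrative):

```python
import torch.nn.functional as F

def train_step(model, optimizer, batch):
    # batch: (B, T+1) token ids; inputs are the targets shifted one step left.
    inputs, targets = batch[:, :-1], batch[:, 1:]
    logits = model(inputs)                       # (B, T, vocab)
    loss = F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                           targets.reshape(-1))  # next-token cross-entropy
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```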

Day 7 – Train Your First LLM (TinyGPT)

Day 8 – Add Masking & Padding for Efficiency
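
Day 8's causal and padding masks can be sketched in PyTorch like this (`pad_id` is an assumed padding token id):

```python
import torch

def build_masks(ids, pad_id=0):
    # ids: (B, T) token ids. The causal mask hides future positions;
    # the padding mask hides pad tokens in the keys.
    B, T = ids.shape
    causal = torch.tril(torch.ones(T, T, dtype=torch.bool))  # (T, T) lower triangle
    padding = (ids != pad_id)[:, None, :]                    # (B, 1, T) key mask
    return causal[None, :, :] & padding                      # (B, T, T), False = blocked

# Positions marked False get -inf added to their attention scores before softmax.
```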

Day 9 – Model Optimization & Parameter Scaling

Day 10 – Capstone: Build & Showcase a Working Mini-LLM

Available in: English
Price: ₹10,000.00

Material Includes

  • tokenizer.py – Custom tokenizer (character or BPE-based)
  • transformer_block.py – Transformer decoder block from scratch
  • train.py, generate.py – Complete training and generation scripts
  • Trained TinyGPT model – A minimal working LLM
  • CLI or Gradio-based demo app for text generation
  • Clean, well-documented codebase for further development

Requirements

  • Strong Python fundamentals (data structures, control flow, functions)
  • Understanding of basic linear algebra (dot products, matrices, softmax)
  • Familiarity with NumPy and either PyTorch or JAX
  • Laptop with Python 3.8+ and your preferred editor (VS Code or Jupyter)
  • Pre-installed packages: numpy, torch, matplotlib, gradio
  • Reliable internet connection and access to Zoom for live sessions

Audience

  • Developers with strong Python skills and an interest in AI
  • ML and NLP enthusiasts seeking to understand transformer internals
  • Engineers who want to move beyond frameworks like HuggingFace
  • Students or researchers aiming to deepen their LLM expertise
  • Professionals exploring how ChatGPT-like models work under the hood
