Day 1 – LLM Foundations & Math Prerequisites (2 lessons)
Day 2 – Tokenization and Vocabulary Building (3 lessons)
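As a preview of what Day 2 builds toward, here is a minimal character-level tokenizer sketch. The corpus string and the `encode`/`decode` helper names are illustrative assumptions, not part of the course materials:

```python
# Build a character-level vocabulary from a toy corpus, then map
# text to integer IDs and back. Real courses graduate to BPE/subwords.
corpus = "hello world"                        # made-up example corpus
vocab = sorted(set(corpus))                   # unique chars, sorted for determinism
stoi = {ch: i for i, ch in enumerate(vocab)}  # string -> id
itos = {i: ch for ch, i in stoi.items()}      # id -> string

def encode(text):
    return [stoi[ch] for ch in text]

def decode(ids):
    return "".join(itos[i] for i in ids)

ids = encode("hello")
assert decode(ids) == "hello"                 # round-trip is lossless
```

A round-trip `decode(encode(text))` check like the one above is a useful sanity test before moving on to subword schemes.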
Day 3 – Attention Mechanism from Scratch (2 lessons)
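The core of Day 3 is scaled dot-product attention, softmax(QK^T / sqrt(d_k)) V. A minimal NumPy sketch, assuming a single head and unbatched inputs (the random Q, K, V here are placeholders for learned projections):

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)   # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)           # (seq, seq) similarity scores
    weights = softmax(scores, axis=-1)        # each row sums to 1
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(4, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out, w = attention(Q, K, V)
# out has shape (4, 8); every row of w is a probability distribution
```

The 1/sqrt(d_k) scaling keeps the dot products from growing with dimension, which would otherwise push the softmax into near-one-hot saturation.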
Day 4 – Multi-Head Attention & Positional Encoding (3 lessons): incorporate sequence order without recurrence
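Day 4's positional-encoding topic is usually taught via the sinusoidal scheme from "Attention Is All You Need": PE[pos, 2i] = sin(pos / 10000^(2i/d_model)) and PE[pos, 2i+1] = cos(...). A sketch:

```python
import numpy as np

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encodings, shape (seq_len, d_model)."""
    pos = np.arange(seq_len)[:, None]              # (seq_len, 1) positions
    i = np.arange(0, d_model, 2)[None, :]          # even feature indices
    angles = pos / np.power(10000.0, i / d_model)  # (seq_len, d_model/2)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles)                   # even dims: sine
    pe[:, 1::2] = np.cos(angles)                   # odd dims: cosine
    return pe

pe = positional_encoding(16, 32)
# pe is added to the token embeddings before the first transformer block
```

Because each dimension oscillates at a different wavelength, every position gets a distinct, bounded pattern without any learned parameters or recurrence.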
Day 5 – Build a Mini Transformer Block (3 lessons)
Day 6 – Create Training Data & Set Up the Training Loop (3 lessons)
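Day 6's training-data step typically means building shifted (input, target) pairs for next-token prediction. A sketch with a stand-in token stream (the `get_batch` name and the `arange` stream are illustrative assumptions):

```python
import numpy as np

# Toy token stream; in the course this would come from Day 2's tokenizer.
tokens = np.arange(20)          # stand-in for encoded text
block_size = 4                  # context length fed to the model

def get_batch(tokens, block_size, batch_size, rng):
    """Sample random (input, target) pairs; targets are inputs shifted by one."""
    starts = rng.integers(0, len(tokens) - block_size - 1, size=batch_size)
    x = np.stack([tokens[s:s + block_size] for s in starts])
    y = np.stack([tokens[s + 1:s + block_size + 1] for s in starts])
    return x, y

rng = np.random.default_rng(0)
x, y = get_batch(tokens, block_size, batch_size=8, rng=rng)
# each target row is the corresponding input row shifted one token ahead
```

The training loop then just repeats: sample a batch, compute the cross-entropy between the model's predictions and `y`, and take an optimizer step.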
Day 7 – Train Your First LLM (TinyGPT) (3 lessons)
Day 8 – Add Masking & Padding for Efficiency (2 lessons)
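Day 8's causal mask can be sketched as a lower-triangular boolean matrix applied to the attention scores before the softmax, so each position only attends to itself and earlier positions (the large negative fill value is a common convention, not a fixed requirement):

```python
import numpy as np

def causal_mask(seq_len):
    """Lower-triangular mask: position i may attend only to positions <= i."""
    return np.tril(np.ones((seq_len, seq_len), dtype=bool))

def masked_softmax(scores, mask):
    scores = np.where(mask, scores, -1e9)      # effectively zero out future positions
    scores = scores - scores.max(axis=-1, keepdims=True)
    e = np.exp(scores)
    return e / e.sum(axis=-1, keepdims=True)

scores = np.zeros((4, 4))                      # uniform scores for illustration
w = masked_softmax(scores, causal_mask(4))
# first row attends only to itself; last row attends uniformly to all four
```

Padding masks work the same way, except the blocked positions come from the batch's pad tokens rather than from the triangular structure.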
Day 9 – Model Optimization & Parameter Scaling (2 lessons)
Day 10 – Capstone: Build & Showcase a Working Mini-LLM (3 lessons)