Foundations
Reference explainers for the concepts that underpin modern ML systems. Each entry covers the math, the implementation, and the intuition — built to be bookmarked, not scrolled past.
Unlike one-off blog posts, these entries are maintained over time as understanding deepens; check the 'last updated' date on each.
Transformer Internals
A ground-up walkthrough of the transformer architecture — from the math of attention through positional encoding to a complete forward pass, with tested PyTorch implementations at every step.
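As a taste of what the walkthrough covers, here is a minimal sketch of scaled dot-product attention in PyTorch. This is an illustrative snippet, not an excerpt from the entry itself; the function name and shapes are assumptions chosen for clarity.

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v):
    # q, k, v: (batch, seq_len, d_k); illustrative single-head attention
    d_k = q.size(-1)
    # similarity scores, scaled by sqrt(d_k) to keep softmax gradients stable
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    # each output row is a convex combination of the value vectors
    weights = F.softmax(scores, dim=-1)
    return weights @ v

x = torch.randn(1, 4, 8)          # self-attention: queries, keys, values share input
out = scaled_dot_product_attention(x, x, x)   # shape (1, 4, 8)
```

The output has the same shape as the input, which is what lets attention layers stack cleanly inside a transformer block.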