About

This site hosts line-by-line annotations of research papers I've found worth understanding deeply. Each post interleaves the paper's prose, the relevant math, and a faithful minimal reimplementation of the algorithm — in the tradition of The Annotated Transformer.

The code blocks aren't pseudocode: they come from real, runnable reimplementations I've written specifically for pedagogical clarity. Variable names match the paper's notation, comments cite the supplement's algorithm numbers inline, and the whole thing is covered by unit and integration tests. For the AlphaFold2 annotation, that code lives in minAlphaFold2: ~8,000 lines of PyTorch covering ~95% of the paper.

Each post pins the referenced repo to a specific commit, so the code you read here is exactly the code the site builds against. You can clone the repo at that commit, follow along, and re-run every example locally.

Chris Hayduk