This is a brief guide to my new art project microgpt, a single file of 200 lines of pure Python with no dependencies that trains and inferences a GPT. This file contains the full algorithmic content of what is needed: dataset of documents, tokenizer, autograd engine, a GPT-2-like neural network architecture, the Adam optimizer, training loop, and inference loop. Everything else is just efficiency. I cannot simplify this any further. This script is the culmination of multiple projects (micrograd, makemore, nanogpt, etc.) and a decade-long obsession to simplify LLMs to their bare essentials, and I think it is beautiful 🥹. It even breaks perfectly across 3 columns.
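To give a flavor of one of the components listed above, here is a minimal sketch of a scalar autograd engine in the spirit of micrograd. This is purely illustrative; the class name, operations, and structure here are my assumptions and are not the actual microgpt source.

```python
# Illustrative micrograd-style autograd sketch; NOT the actual microgpt code.
import math

class Value:
    """A scalar that records its computation graph for backpropagation."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            # d(a+b)/da = 1, d(a+b)/db = 1
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            # d(a*b)/da = b, d(a*b)/db = a
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            # d(tanh x)/dx = 1 - tanh(x)^2
            self.grad += (1 - t * t) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# Usage: for out = a*b + c, d(out)/da = b and d(out)/db = a.
a, b, c = Value(2.0), Value(-3.0), Value(10.0)
out = a * b + c
out.backward()
print(out.data, a.grad, b.grad)  # 4.0 -3.0 2.0
```

The same pattern (a value, its grad, and a closure that propagates gradients to its parents) scales from these toy scalars all the way up to the tensor operations inside a real GPT; everything else, as the post says, is efficiency.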
Where to find it:
- This GitHub gist has the full source code
- It’s also available on this web page: https://karpathy.ai/microgpt.html
- Also available as a Google Colab notebook
The following is my guide, stepping an interested reader through the code.
Read more at karpathy.github.io.