This is a great read.

Conclusion

It's really remarkable how you can distill so many decades of progress in machine learning into just a few thousand bytes. Essentially nothing you need to run a state-of-the-art neural network is missing here (except the actual model weights). While I mostly put this together for fun, it's a nice demonstration of how simple neural networks actually are.
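The post's actual code isn't reproduced here, but as a minimal sketch of that point: the core of a neural-network forward pass is just a matrix multiply, a bias add, and a nonlinearity, which fits in a few lines of plain Python (the function names below are illustrative, not from the original).

```python
# Minimal sketch of a multilayer-perceptron forward pass,
# using plain nested lists instead of a tensor library.

def matmul(A, B):
    # (n x m) @ (m x p) -> (n x p)
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def linear(X, W, b):
    # affine layer: X @ W + b
    return [[v + bias for v, bias in zip(row, b)] for row in matmul(X, W)]

def relu(M):
    # elementwise nonlinearity
    return [[max(0.0, v) for v in row] for row in M]

def mlp(x, layers):
    # layers: list of (W, b) pairs; ReLU between layers, none after the last
    out = [x]
    for i, (W, b) in enumerate(layers):
        out = linear(out, W, b)
        if i < len(layers) - 1:
            out = relu(out)
    return out[0]

# Example: a tiny 2 -> 3 -> 1 network with arbitrary weights
W1 = [[0.5, -0.2, 0.1], [0.3, 0.8, -0.5]]
b1 = [0.0, 0.1, -0.1]
W2 = [[1.0], [-1.0], [0.5]]
b2 = [0.2]
print(mlp([1.0, 2.0], [(W1, b1), (W2, b2)]))
```

Everything else a real model adds (more layer types, attention, normalization) is a variation on these same primitives, which is why the whole runtime can stay so small once the weights live elsewhere.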