I have frequently heard criticisms of quantum computing progress that rest on how little the factoring records have advanced.
If quantum computers were such an imminent threat, why haven’t they factored any number larger than 15 yet?
This fellow points out that this may not be a good way to think about it:
> It’s a very catchy critique indeed: wouldn’t we expect at least some progress? How much should we be concerned by these very expensive machines that are easily outperformed by an 8-bit Home Computer, an Abacus, and a Dog? It’s certainly funny, but the argument doesn’t hold water on closer inspection.
He says that a lot of the advances are in error correction, which are not necessarily visible in the size of the numbers being factored, because quantum error correction "has a significant baseline overhead."
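To get a feel for what that baseline overhead means, here is a rough back-of-envelope sketch (my own illustration, not the article's numbers), assuming the textbook rotated-surface-code scaling: roughly 2d² physical qubits per logical qubit at code distance d, with the logical error rate suppressed like A(p/p_th)^((d+1)/2):

```python
# Back-of-envelope: why error-correction progress doesn't show up as
# bigger factored numbers. All constants are illustrative assumptions,
# not the article's figures: a rotated surface code needs roughly
# 2 * d**2 physical qubits per logical qubit at code distance d, and
# suppresses the logical error rate per round to about
# A * (p / P_TH) ** ((d + 1) // 2).

P_TH = 1e-2  # assumed surface-code threshold error rate
A = 0.1      # assumed prefactor

def logical_error_rate(p: float, d: int) -> float:
    """Approximate logical error rate per round at physical error rate p."""
    return A * (p / P_TH) ** ((d + 1) // 2)

def distance_needed(p: float, target: float) -> int:
    """Smallest odd code distance that pushes the logical rate below target."""
    d = 3
    while logical_error_rate(p, d) > target:
        d += 2
    return d

# A long factoring run needs very low logical error rates; 1e-12 is a
# stand-in target, not a measured requirement.
for p in (5e-3, 1e-3, 1e-4):
    d = distance_needed(p, target=1e-12)
    print(f"p={p:.0e}: distance {d:2d} -> ~{2 * d * d} physical qubits per logical qubit")
```

Better hardware shrinks the overhead dramatically, yet even at very good physical error rates you still pay hundreds of physical qubits per logical qubit before any factoring record can move.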
He adds:

> The day a quantum computer beats a classical computer on factoring, heck the day it factors a 32-bit number, we’re uncomfortably close to Q-day already. So: don’t wait for quantum computer factoring records; you’ll be too late.
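The reason a 32-bit factorization would already be alarming is the asymptotics. Here is a toy comparison, assuming a textbook ~n³ gate count for Shor's algorithm and the standard GNFS heuristic for the best known classical attack (both assumptions of mine, not the article's figures; only the growth rates matter):

```python
import math

# Toy scaling comparison (my assumptions, not the article's numbers):
# Shor's algorithm uses polynomially many gates, roughly O(n^3) for an
# n-bit modulus, while the best known classical attack (GNFS) runs in
# subexponential time exp((64/9)^(1/3) * (ln N)^(1/3) * (ln ln N)^(2/3)).

def shor_gates(bits: int) -> float:
    """Assumed textbook gate count for Shor on an n-bit modulus."""
    return float(bits) ** 3

def gnfs_ops(bits: int) -> float:
    """Heuristic GNFS operation count for an n-bit modulus."""
    ln_n = bits * math.log(2)  # ln N for an n-bit N
    return math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * math.log(ln_n) ** (2 / 3))

for bits in (32, 256, 1024, 2048):
    print(f"{bits:5d}-bit: quantum ~{shor_gates(bits):.1e} gates, classical ~{gnfs_ops(bits):.1e} ops")

# Quantum cost from 32-bit to 2048-bit grows by only (2048/32)**3 = 262144x;
# the classical cost grows by roughly 30 orders of magnitude.
```

From 32 to 2048 bits the quantum cost grows by a fixed polynomial factor of about 2.6 × 10⁵, while the classical cost grows by roughly thirty orders of magnitude, which is why there is so little runway between the first real quantum factoring record and RSA-sized targets.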
The article has a number of great links.
In the field, this is an entirely uncontroversial opinion, see e.g.:
- Craig Gidney’s “Why haven’t quantum computers factored 21 yet?”
- Slide 19 of Adam Zalcman’s RWPQC 2026 talk
- Sam Jaques’ PQCrypto 2025 talk
- Scott Aaronson compares it to expecting a small nuclear explosion.
From Sam Jaques' page:
I'm sorry....wHaT?!
I read the top suggested article last year when it was linked in yet another article, and now that I've read it again I find it far more remarkable than I did last year. The circuit-size growth demonstration between 15 and 21 is enlightening.
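For anyone who hasn't read it, the flavor of that demonstration (my sketch of the underlying idea, not Gidney's code) is that 15 has special structure which makes the quantum circuit almost trivial, while 21 does not: multiplying by 2 mod 15 is just a cyclic rotation of 4 bits, because 2⁴ = 16 ≡ 1 (mod 15), and nothing similar holds mod 21.

```python
# Why factoring 15 is "too easy": multiplying by 2 mod 15 is a cyclic
# rotation of 4 bits, since 2**4 == 16 == 1 (mod 15) and 15 == 2**4 - 1.
# On a quantum computer that is a few qubit swaps, essentially free.

def rotate_left_4bits(x: int) -> int:
    """Cyclic left rotation of a 4-bit value."""
    return ((x << 1) | (x >> 3)) & 0b1111

for x in range(1, 15):
    assert rotate_left_4bits(x) == (2 * x) % 15  # holds for every residue

# Mod 21 there is no such shortcut: 21 is not of the form 2**k - 1, so
# doubling is not a bit rotation and the circuit needs real modular
# arithmetic. Count how often the 5-bit rotation trick fails mod 21:
mismatches = [x for x in range(1, 21)
              if ((x << 1) | (x >> 4)) & 0b11111 != (2 * x) % 21]
print(f"bit-rotation trick fails for {len(mismatches)} of 20 residues mod 21")
```

On a quantum computer a bit rotation is just a relabeling or a few swaps, while a general modular multiplication needs real adder circuitry, which is where the gate counts explode.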