
I have frequently heard criticisms of quantum computing advances that point to the seemingly meager progress in their ability to factor numbers.

If quantum computers were such an imminent threat, why haven’t they factored any number larger than 15 yet?

This fellow points out that this may not be a good way to think about it:

It’s a very catchy critique indeed: wouldn’t we expect at least some progress? How much should we be concerned by these very expensive machines that are easily outperformed by an 8-bit Home Computer, an Abacus, and a Dog? It’s certainly funny, but the argument doesn’t hold water on closer inspection.

He says that a lot of the advances are advances in error correction, which are not necessarily visible in the size of numbers being factored. He says this is because quantum error "correction has a significant baseline overhead."

The day a quantum computer beats a classical computer on factoring, heck the day it factors a 32-bit number, we’re uncomfortably close to Q-day already. So: don’t wait for quantum computer factoring records; you’ll be too late.
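For context on why 15 and 21 keep showing up as the benchmarks: Shor's algorithm only uses the quantum computer for one step, finding the period of a^x mod N. Everything around that step is classical. Here is a minimal sketch of that classical reduction, with the period found by brute force standing in for the quantum part:

```python
from math import gcd

def find_period(a, n):
    """Brute-force the order r of a mod n (the one step a quantum
    computer would replace with quantum period finding)."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_reduction(n, a):
    """Given a base a coprime to n, try to split n using the period of a mod n."""
    assert gcd(a, n) == 1
    r = find_period(a, n)
    if r % 2 == 1:
        return None  # odd period: retry with another base
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None  # trivial square root of 1: retry with another base
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_reduction(15, 7))  # period of 7 mod 15 is 4 -> factors (3, 5)
print(shor_reduction(21, 2))  # period of 2 mod 21 is 6 -> factors (7, 3)
```

The classical brute force here is exponentially expensive in general; the entire quantum speedup lives in replacing `find_period`, which is why circuit size for that step, not the headline number factored, is the meaningful progress metric.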

The article has a number of great links.

In the field, this is an entirely uncontroversial opinion; see e.g.:
Below is an extrapolation of Sam Jaques’ extrapolation (of that famous graph) of Gidney’s algorithm attacking RSA-2048

From Sam Jaques' page:

When I reached out to Craig to ask if it could be extrapolated, he said "I would be wary of a simple extrapolation [...] There's probably a simple rule but I wouldn't trust it until simulations confirmed it." I ignored this advice and invented a simple rule just to make a nice chart
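To illustrate the kind of "simple rule" being joked about here (this is not Jaques' actual rule, and the data points below are hypothetical round numbers, not real hardware figures), a naive extrapolation usually means fitting a straight line to the logarithm of some capability metric and reading off where it crosses a target:

```python
import math

# Hypothetical (year, qubit count) points -- placeholders only,
# chosen to double every two years. NOT real hardware data.
points = [(2020, 50), (2022, 100), (2024, 200)]

# Least-squares fit of log2(qubits) against year.
n = len(points)
xs = [x for x, _ in points]
ys = [math.log2(y) for _, y in points]
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
intercept = my - slope * mx

def projected_qubits(year):
    """Read the fitted trend at a given year."""
    return 2 ** (slope * year + intercept)

# Solve for the year the trend would cross an arbitrary target (here 1M qubits):
target_year = (math.log2(1_000_000) - intercept) / slope
```

The quoted caveat is exactly the problem with this: nothing guarantees the real curve follows any such rule, which is why Gidney said he wouldn't trust it without simulations.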

I'm sorry....wHaT?!


I read the top suggested article last year when it was linked in yet another article, and now that I've read it again I find it a lot more remarkable than I did last year. The demonstration of circuit-size growth between factoring 15 and 21 is enlightening.
