When Nic Carter had his freak-out about quantum advances earlier this week, he mentioned two papers: one was by Google (#1462657) and the other was by a group of researchers at CalTech.
Here is a thread explaining the CalTech paper and how it is different from the paper published by Google:
The thread is by a guy named Alex Pruden, CEO of the quantum-readiness company ProjectEleven, in which Carter has invested. Both certainly have an incentive to make quantum computers seem like a significant threat to Bitcoin, so you might read this as a worst-case-scenario view of current developments in quantum computing.
First thing to note: the team is S-Tier. Founded out of CalTech (where Richard Feynman first proposed quantum computing), the paper's co-authors include @DolevBluvstein, author of several other papers demonstrating the viability of neutral atom platforms; John Preskill (@preskill), whom many consider the godfather of quantum error correction; and several other physics & QC heavy hitters.

Next, the headline: Shor's algorithm on ECDSA is possible with only 10k neutral atom qubits. This is by far the lowest of the resource estimates, 100x lower than @CraigGidney's 2025 estimate and 10x lower than the Pinnacle estimate.
To give a sense of relative scale, the CalTech team has already demonstrated an array of 6100 entangled atoms, effectively "proto-qubits".
To clarify, the paper describes a range of qubit counts that represent a time-space trade-off. The 10k count is for a space-efficient circuit that takes over a year to run (assuming a 1 ms clock speed), but a time-efficient variant using 22k qubits could run in 9 days...well within the window of making "at-rest" public key cryptography obsolete.
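To put those two operating points side by side, here's a back-of-the-envelope sketch. The qubit counts, runtimes, and 1 ms clock are the figures quoted above; the implied cycle counts and the qubit-days comparison are my own arithmetic, not numbers from the paper.

```python
# Rough comparison of the two operating points described in the thread.
# Qubit counts and runtimes are from the thread; implied cycle counts
# and the qubit-days totals are back-calculated here as an illustration.

SECONDS_PER_DAY = 86_400

def implied_cycles(runtime_days: float, cycle_time_s: float) -> float:
    """Sequential logical cycles implied by a runtime and a clock speed."""
    return runtime_days * SECONDS_PER_DAY / cycle_time_s

# Space-efficient variant: ~10k qubits, over a year at a 1 ms cycle time.
space_eff_cycles = implied_cycles(365, 1e-3)   # ~3.2e10 cycles
space_eff_qubit_days = 10_000 * 365            # 3.65e6 qubit-days

# Time-efficient variant: ~22k qubits, ~9 days at the same clock.
time_eff_cycles = implied_cycles(9, 1e-3)      # ~7.8e8 cycles
time_eff_qubit_days = 22_000 * 9               # 1.98e5 qubit-days

print(f"space-efficient: {space_eff_cycles:.1e} cycles, {space_eff_qubit_days:.2e} qubit-days")
print(f"time-efficient:  {time_eff_cycles:.1e} cycles, {time_eff_qubit_days:.2e} qubit-days")
```

Note the asymmetry: roughly doubling the qubit count cuts the total qubit-time by about 18x, which is why the time-efficient variant is the more alarming one for "at-rest" keys.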
Many skeptics have responded to the Google paper by pointing out that it doesn't really represent an advance in quantum computers; that the result is mainly theoretical. Indeed, the Google paper is (perhaps intentionally) a bit hand-wavy on details.
The @TeamOratomic paper is much more specific in terms of architecture. The key assumptions and areas requiring future engineering and scientific effort are called out explicitly.
In particular, with regard to the feasibility of the architecture described, many components of the overall system have been demonstrated individually in other papers, including:
- continuous operation and reloading of qubits
- lattice surgery
- magic state cultivation

The most impactful innovation is the application of a new class of high-rate error-correcting codes (specifically, quasi-cyclic lifted product LP codes) that are massively more efficient than previously demonstrated surface codes.
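For intuition on why high-rate codes matter so much, here's a toy overhead comparison between surface codes and a high-rate code family like the lifted product codes mentioned above. The specific distance and rate values are illustrative assumptions, not figures from the paper.

```python
# Toy qubit-overhead comparison: surface codes vs. a high-rate code family.
# The distance d=25 and the n/k = 20 rate below are assumed for illustration.

def surface_code_qubits(logical: int, d: int) -> int:
    """Physical qubits for `logical` logical qubits on distance-d surface codes.
    Each surface-code patch uses roughly 2*d^2 physical qubits (data + ancilla)."""
    return logical * 2 * d * d

def high_rate_code_qubits(logical: int, n_over_k: int) -> int:
    """Physical qubits for a code family encoding k logical qubits
    into n physical qubits at a constant inverse rate n/k."""
    return logical * n_over_k

logical = 1_000
print(surface_code_qubits(logical, d=25))        # 1,250,000 physical qubits
print(high_rate_code_qubits(logical, n_over_k=20))  # 20,000 physical qubits
```

Under these assumed numbers the high-rate code needs ~60x fewer physical qubits for the same logical count, which is the kind of saving that pulls a resource estimate from millions of qubits down to tens of thousands.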
Neutral atom platforms are ideal for applying these higher-rate codes, due to the dynamic reconfigurability of the system, which can enable "all-to-all" connectivity between qubits in different parts of the system.
Traditional bottlenecks for neutral atoms (such as long cycle times and slow measurements) are addressed by proposals based on other recently published research: moving the atomic qubits at constant velocity, rastering them past the control lasers, and leveraging recent, faster techniques to reduce measurement times from ~1 ms to ~1 µs (a 1000x speedup).
That said, considerable challenges remain in realizing this system. For example, real-time decoding of the new LP codes is explicitly called out as a system bottleneck. In addition, lattice surgery on large code patches is computationally infeasible, motivating a split between quantum "memory" and "processor" parts of the system (which in turn adds some complexity).
So again, this paper does not mean that a cryptographically relevant quantum computer (CRQC) exists today, or that digital assets like Bitcoin are under immediate threat.
It's also important to note that even with the proposed optimizations, this machine would fall into the "slow-clock" regime as categorized in the Google paper, meaning it would threaten "at-rest" assets rather than enable on-spend attacks.
But it does nonetheless underscore the conclusion of the Google paper: the need for blockchains (and all systems relying on classical public key cryptography) to begin migrating to post-quantum crypto as soon as possible.
Even though Google's paper hit hard with the revelation that on-spend attacks on Bitcoin might be possible, my money would still be on neutral atom machines (like @TeamOratomic's) to be first past the post in the race for utility-scale quantum computing (& cryptographic relevance).
Here is a link to the CalTech paper: https://arxiv.org/abs/2603.28627
Scott Aaronson also discusses it some in his recent blog post (#1463651).
https://twiiit.com/apruden08/status/2039706233960271952