Somewhat related is the work done in "Falling with Style: Factoring up to 255 “with” a Quantum Computer" published in the proceedings of SIGBOVIK 2025 [1]. The author, Craig Gidney [2], successfully factored all odd composite numbers up to 255 using Shor's algorithm, even though the quantum circuits involved were such that any meaningful output would be overwhelmed by noise (and indeed, performance was maintained when the circuits were replaced by a random number generator).
> To my knowledge, no one has cheated at factoring in this way before. Given the shenanigans pulled by past factoring experiments, that’s remarkable.
[2] Who has previous experience in cheating at quantum factoring: see "Factoring the largest number ever with a quantum computer", posted April Fools' Day 2020 at https://algassert.com/post/2000
"Just as a thought experiment, what's the most gutless device that could
perform this "factorisation"? There's an isqrt() implementation that uses
three temporaries so you could possibly do the square root part on a ZX81, but
with 1k of RAM I don't think you can do the verification of the guess unless
you can maybe swap the values out to tape and load new code for the multiply
part. A VIC20 with 4k RAM should be able to do it... is there a programmable
calculator that does arbitrary-precision maths? A quick google just turns up
a lot of apps that do it but not much on physical devices.
You can verify in limited memory by repeatedly checking the product modulo a few small integers. If every check passes, then by the Chinese remainder theorem the main result also holds (provided the moduli are coprime and their product exceeds the number being factored).
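A minimal sketch of that trick (function name and choice of moduli are my own illustration, not from the thread): to confirm p·q = N without a full multi-precision multiply, check the congruence modulo small primes whose product is known to exceed any 8-bit semiprime.

```python
# Limited-memory check that p * q == n: verify the product modulo
# several small coprime moduli instead of doing one big multiplication.
# If the congruence holds for coprime moduli whose product exceeds both
# p*q and n, the Chinese remainder theorem forces p*q == n exactly.

def verify_factors(p, q, n, moduli):
    prod = 1
    for m in moduli:
        prod *= m
    # Guard via bit lengths, so we never compute p*q directly:
    # p*q < 2**(bits(p) + bits(q)) and 2**(bits(prod) - 1) <= prod.
    if p.bit_length() + q.bit_length() > prod.bit_length() - 1:
        return False
    if n >= prod:
        return False
    return all((p % m) * (q % m) % m == n % m for m in moduli)

# The primes up to 31 have product ~2**37, plenty for 8-bit semiprimes.
small_primes = (3, 5, 7, 11, 13, 17, 19, 23, 29, 31)
```

Each residue fits comfortably in a machine word, which is the point: a ZX81-class machine could run the per-modulus checks one at a time.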
Yeah, there's a reason the quantum computing field has moved away from attempting factorisations. Not that there isn't still hype and misleading claims being pushed, but the hardware has improved a ton since 2001 and is getting ever closer to actually useful quantum computation (such as large-scale quantum chemistry calculations).
Are those useful computations in the room with us right now? No, but seriously, I feel like factorization is the one application that could justify those massive investments QC is receiving (even though it would probably make the world strictly worse...).
All those other applications, no matter how neat, I feel are quite niche. Like, "simulate pairs of electrons in the Ising model". Cool. Is that a multi-billion-dollar industry, though?
Ground state and activation energy estimation for chemistry would be really useful. I know chemists are looking specifically at nitrogen fixation as one useful example.
Or as another example, I'm currently at a conference listening to a PhD student's research on biomolecular structure prediction (for protein design).
If results from methods with higher electronic-structure accuracy than DFT (MP2, coupled cluster) can be made cheap enough, it would hugely disrupt industrial chemistry, medical experimentation, pharmaceuticals, the energy sector, etc.
> In the Callas Normal Form, the factors are integers p = 2^{n-1} and q = 2^{m+1}, where n ≤ m, and p and q are ideally prime, but don’t have to be.
The paper's formatting clearly went wrong here, as it should have read p = 2^n - 1 and q = 2^m + 1.
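With the corrected form N = (2^n − 1)(2^m + 1), such numbers are trivially factored classically by scanning small exponents, which is part of the joke: a benchmark number with this structure proves nothing. A sketch (the function name is mine, not from the paper):

```python
# Numbers in "Callas Normal Form" are N = (2**n - 1) * (2**m + 1)
# with n <= m. They can be factored by brute-forcing the tiny
# exponent n and checking whether the cofactor has the right shape.

def callas_factor(N, max_bits=64):
    for n in range(2, max_bits):
        p = 2**n - 1
        if N % p == 0:
            q = N // p
            m = (q - 1).bit_length() - 1  # candidate m if q == 2**m + 1
            if q == 2**m + 1 and n <= m:
                return p, q
    return None
```

For example, 15 = (2^2 − 1)(2^2 + 1) and 119 = (2^3 − 1)(2^4 + 1) both fall out after a handful of trial divisions.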
The "Proposed Quantum Factorisation Evaluation Criteria" are excellent, but for measuring progress, the required minimum factor size of 64 bits is too large. A good milestone would be a quantum circuit that can factor the product of any pair of 5-bit primes {17,19,23,29,31}.
[1] https://sigbovik.org/2025/; standalone paper is also available in the code repository https://github.com/strilanc/falling-with-style
I really hope he eventually gets the recognition he deserves, outside of just experts in the field.
It starts here: https://www.metzdowd.com/pipermail/cryptography/2025-Februar...
This part is from farther down thread:
"Just as a thought experiment, what's the most gutless device that could perform this "factorisation"? There's an isqrt() implementation that uses three temporaries so you could possibly do the square root part on a ZX81, but with 1k of RAM I don't think you can do the verification of the guess unless you can maybe swap the values out to tape and load new code for the multiply part. A VIC20 with 4k RAM should be able to do it... is there a programmable calculator that does arbitrary-precision maths? A quick google just turns up a lot of apps that do it but not much on physical devices.
Peter."
It's a device that makes and analyzes at the same time; check out this primer:
https://warwick.ac.uk/fac/sci/chemistry/research/oconnor/oco...
The "Proposed Quantum Factorisation Evaluation Criteria" are excellent, but for measuring progress, the required minimum factor size of 64 bits is too large. A good milestone would be a quantum circuit that can factor the product of any pair of 5-bit primes {17,19,23,29,31}.
The dog is funny, but it just means: pick genuinely "random" numbers from a bigger range than the staged, phony numbers quantum factorisation demos use.
Brilliant.