I'm having a hard time understanding this article.
First of all, just to clarify the title: a quantum annealer is not a universal quantum computer.
Then, it seems like they are comparing a simulation of p-computers to a physical realization of a quantum annealer (likely D-Wave, though not named outright for some reason).
If this is true, it doesn't seem like a very relevant comparison, because D-Wave systems actually exist, while their p-computer sounds like it is just a design.
But I may have misunderstood, because at times they make it sound like the p-computer actually exists.
Also, they talk about how p-computers can be scaled up with TSMC semiconductor technology. From what I know, this is also true for semiconductor-based (universal) quantum computers.
University press releases should not be posted on HN. A press release is just a published paper + PR spin. If the PR spin were true, it would be in the paper. Just link to the paper.
This makes me wonder: would it be possible to implement an equivalent of Shor's algorithm on a p-computer? Maybe the quantumness isn't necessary at all.
The power of quantum computing is constructing the solution to a problem out of an interference pattern. Classical probabilities don’t interfere, but quantum probabilities do. Loosely, quantum probabilities can be constructed to cancel, since their amplitudes can be negative.
Shor’s algorithm works on the quantum Fourier transform. The quantum Fourier transform works because you can pick a frequency out of a signal using a “test wave.” The test wave can select out the amplitude of interest because the information of the test wave constructively interferes, whereas every other frequency cancels. This is the interference effect that can only happen with complex/negative probability amplitudes.
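You can see the classical skeleton of that "test wave" effect in an ordinary DFT. A minimal numpy sketch (the signal, sizes, and frequencies are just illustrative, not from the article):

```python
import numpy as np

# A signal with one "hidden" period r (here r = 4, i.e. DFT frequency N/r).
N, r = 64, 4
n = np.arange(N)
signal = np.exp(2j * np.pi * n / r)

# "Test wave" at frequency k: correlate the signal against exp(-2*pi*i*k*n/N).
# At the matching frequency every term has the same phase and they add up;
# at any other frequency the phases wrap around the unit circle and cancel.
for k in (N // r, N // r + 1):
    test_wave = np.exp(-2j * np.pi * k * n / N)
    print(k, round(abs(np.sum(signal * test_wave)) / N, 6))
# k = 16 -> 1.0 (constructive), k = 17 -> 0.0 (destructive)
```

The quantum version does this over exponentially many amplitudes at once, which a classical machine can't.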
That's a cool thought! For those who may not know, Shor's algorithm is fundamentally quantum because it relies on the interference of probability amplitudes, which can be both positive and negative. It could not be directly implemented on a p-computer because you could only simulate this interference, which removes the exponential advantage.
It's possible that an entirely different approach is made possible by p-computers, but this would be tricky to find. Furthermore, it seems that the main advantage of p-computers is sampling from a Boltzmann-like distribution, and I'm not aware that this is the bottleneck in any known factorisation algorithm.
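For reference, that Boltzmann-sampling core is essentially Gibbs sampling over an Ising model, which is what a p-bit implements in hardware. A minimal sketch of the update rule (the two-spin instance and all parameters are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-spin Ising instance: E(s) = -h.s - sum_{i<j} J_ij s_i s_j
J = np.array([[0.0, 1.0],
              [1.0, 0.0]])  # ferromagnetic coupling: spins prefer to align
h = np.zeros(2)
beta = 1.0                  # inverse temperature
s = rng.choice([-1, 1], size=2)

aligned = 0
sweeps = 10_000
for _ in range(sweeps):
    for i in range(len(s)):
        # A hardware p-bit performs exactly this stochastic sigmoid update,
        # just with device noise in place of rng.random().
        local_field = h[i] + J[i] @ s
        p_up = 1.0 / (1.0 + np.exp(-2.0 * beta * local_field))
        s[i] = 1 if rng.random() < p_up else -1
    aligned += s[0] == s[1]

# The chain samples ~ exp(-beta * E); expect e^2/(e^2 + 1) ≈ 0.88 aligned.
print(aligned / sweeps)
```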
A direct equivalent, no, as stated in the introduction of https://www.nature.com/articles/s41467-025-64235-y:
"Notably, while probabilistic computers can emulate quantum interference with polynomial resources, their convergence is in general believed to require exponential time [10]. This challenge is known as the sign problem in Monte Carlo algorithms [11]."
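A toy illustration of why signed weights kill Monte Carlo convergence (my own example, not from the paper): the relative error of a sample mean scales like std/(mean·√n), and cancellation drives the mean toward zero while the std stays O(1).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# All-positive weights: relative error of the sample mean falls as 1/sqrt(n).
w_pos = rng.random(n)
print(w_pos.std() / (abs(w_pos.mean()) * np.sqrt(n)))        # ~0.002

# Signed weights that nearly cancel (true mean 0.01, std ~1): the same
# estimator is ~100x noisier, and recovering the old accuracy costs
# ~(std/mean)^2 more samples. When the cancellation grows with system
# size, that cost grows exponentially -- the sign problem.
w_signed = rng.choice([-1.0, 1.0], size=n) + 0.01
print(w_signed.std() / (abs(w_signed.mean()) * np.sqrt(n)))  # ~0.3
```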
I doubt it. Shor's algorithm relies on the quantum Fourier transform, which requires the complex phase information encoded in the quantum wavefunctions. The quantum probability norm (L2) accounts for interference between the complex amplitudes of these wavefunctions; the classical L1 probability norm does not.
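A two-path toy calculation of the L1 vs. L2 difference (illustrative, not from the article):

```python
import numpy as np

# Two paths into the same outcome, with amplitudes 1/sqrt(2) and -1/sqrt(2).
amps = np.array([1.0, -1.0]) / np.sqrt(2)

# Quantum (L2): amplitudes add *before* squaring, so the paths can cancel.
print(abs(amps.sum()) ** 2)       # 0.0 -- destructive interference

# Classical (L1): each path contributes a non-negative probability |amp|^2,
# and non-negative numbers can only add, never cancel.
print(np.sum(np.abs(amps) ** 2))  # 1.0
```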
I'm not sure that it's just L1 vs L2, since the Wigner formulation of quantum mechanics uses real-valued quasi-probabilities, but ones which can take negative values.
Oh, and also, if you swap out h-bar in Wigner's equations with some wavelength \lambda, you can interpret it in terms of classical wave optics... somehow. I'm not sure.
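For the negativity point, one concrete check (standard textbook result, code is mine): the Wigner function of the n = 1 harmonic-oscillator Fock state, taking ħ = 1, is real-valued, negative at the origin, and still normalizes to 1:

```python
import numpy as np

# Known closed form (hbar = 1) for the n = 1 Fock state:
# W1(x, p) = (1/pi) * (2*(x^2 + p^2) - 1) * exp(-(x^2 + p^2))
def wigner_n1(x, p):
    r2 = x ** 2 + p ** 2
    return (2.0 * r2 - 1.0) * np.exp(-r2) / np.pi

print(wigner_n1(0.0, 0.0))  # -1/pi ≈ -0.318: a negative quasi-probability

# Yet it integrates to 1 over phase space, like a genuine distribution:
xs = np.linspace(-6, 6, 601)
X, P = np.meshgrid(xs, xs)
dx = xs[1] - xs[0]
print(wigner_n1(X, P).sum() * dx * dx)  # ~1.0
```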
The paper compares p-computers with D-Wave's quantum annealing machine, which is limited to solving only certain problems (as opposed to a universal QC such as Google's or IonQ's, which could in theory implement Shor's).
I'm confused. Do p-computers have any complexity theoretic advantage over classical computers, similar to how quantum computers have such an advantage in some areas? Or are they just normal computers in the end?
By complexity class, that would be the consensus; the argument for building BPP systems is that the energy cost is orders of magnitude lower, and perhaps there is also some polynomial speedup.
https://www.nature.com/articles/s41467-025-64235-y
Title: "Pushing the boundary of quantum advantage in hard combinatorial optimization with probabilistic computers"
Abstract: "Adaptive parallel tempering [...] scales more favorably and outperforms simulated quantum annealing"
HN title should be changed to match the paper title or abstract.
I'm not sure how this compares to quantum computers with their dozens to hundreds of qubits.
- https://www.nature.com/articles/s41928-025-01439-6 (link text: "In one study")
- https://www.nature.com/articles/s41467-025-64235-y (link text: "In the most recent paper")