The Register

Quantum code breaking? You’d get further with an 8-bit computer, an abacus, and a dog

The US National Institute of Standards and Technology (NIST) has been pushing for the development of post-quantum cryptographic algorithms since 2016.

“If large-scale quantum computers are ever built, they will be able to break many of the public-key cryptosystems currently in use,” NIST explains in its summary of Post-Quantum Cryptography (PQC).

Peter Gutmann, a professor of computer science at the University of Auckland in New Zealand, thinks PQC is bollocks – “nonsense” for our American readers – and said as much in a 2024 presentation [PDF], “Why Quantum Cryptanalysis is Bollocks.”

Gutmann’s argument is simple: to this day, quantum computers – which he regards as “physics experiments” rather than pending products – haven’t managed to factor any number greater than 21 without cheating.

Quantum computers and PQC are both enormously complex, but the task at the heart of cracking RSA public key encryption is simple to state: given the public modulus n of a key pair, you have to factor it into two prime numbers, p and q, such that n = p * q.

When n is 21, p could be 3 and q could be 7, for a not-very-secure 5-bit RSA implementation. But when n is a 1,024-bit or 2,048-bit number, finding its two prime factors requires a tremendous amount of computational power.
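For a sense of scale, here’s a minimal Python sketch of ours – not anything NIST or Gutmann published – that cracks the record-setting quantum modulus by brute force:

```python
# Illustrative only: naive trial division. It shreds toy moduli like
# 15 and 21 instantly, but is hopeless against a real 1,024-bit or
# 2,048-bit RSA modulus.

def factor(n: int) -> tuple[int, int]:
    """Return (p, q) with p * q == n, found by brute force."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    raise ValueError(f"{n} has no nontrivial factors")

print(factor(21))  # (3, 7) -- the current quantum factorization record
```

Run against a 2,048-bit n, that loop would still be going when the Sun burns out – which is the entire point of RSA.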

NIST’s concern, shared by many computer scientists, is that a quantum computer might one day be capable of running Shor’s algorithm, which is essentially a shortcut for finding the prime factors of a large integer. Were that to happen, data protected by insufficiently strong encryption keys would be at risk of exposure.

To avoid that purported possibility, the US standards organization has been shepherding the development of various quantum-resistant encryption algorithms, such as HQC (Hamming Quasi-Cyclic), CRYSTALS-Kyber, CRYSTALS-Dilithium, SPHINCS+, and FALCON. These are being positioned as replacements for current algorithms like RSA.
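The cleverness in Shor’s algorithm is confined to a single step – finding the period of the sequence a^x mod n – which a sufficiently large quantum computer could in principle do quickly. Everything around that step is ordinary classical number theory, as this toy Python sketch of ours illustrates (not anyone’s reference implementation; the brute-force find_period() stands in for the quantum part and only works on playground-sized numbers):

```python
# Toy sketch of the number theory behind Shor's algorithm. A quantum
# computer's only job is the find_period() step; everything around it
# is classical.
from math import gcd
from random import randrange

def find_period(a: int, n: int) -> int:
    """Smallest r with a^r = 1 (mod n) -- the quantum step, brute-forced."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor(n: int) -> tuple[int, int]:
    while True:
        a = randrange(2, n)
        d = gcd(a, n)
        if d > 1:                    # lucky guess already shares a factor
            return d, n // d
        r = find_period(a, n)
        if r % 2:                    # need an even period; retry otherwise
            continue
        y = pow(a, r // 2, n)
        if y == n - 1:               # trivial square root; retry
            continue
        p = gcd(y - 1, n)
        if 1 < p < n:
            return p, n // p

print(shor(15))  # (3, 5) -- IBM's 2001 demonstration
print(shor(21))  # (3, 7) or (7, 3) -- the 2012 record
```

Replace that brute-force search with a quantum period-finder and you have Shor’s algorithm; the catch, as Gutmann notes, is that no physics experiment has honestly managed that step for anything bigger than 21.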

NIST, which did not immediately have someone available for comment, says some engineers claim quantum code-cracking could be a thing within two decades. Since that’s about as long as the deployment of modern public key infrastructure has taken, the standards body argues it’s time to start preparing.

Judging by the present state of quantum computers, the case for new algorithms looks less compelling. In a tongue-in-cheek paper [PDF] released in March, Gutmann and co-author Stephan Neuhaus, a senior lecturer at Zurich University of Applied Sciences, argue that it’s possible to replicate the code-cracking capabilities of current quantum computers with a VIC-20 8-bit home computer from 1981, an abacus, and a dog.

PQC…isn’t mathematics or engineering, it’s augury: ‘A great machine shall arise, and it will cast aside all existing cryptography, there shall be Famine, Plague, War, and a long arable field.’

The paper notes that IBM in 2001 implemented Shor’s algorithm on a seven-qubit quantum computer, demonstrating the factorization of the number 15. A decade later, researchers managed to use a quantum computer to factor the number 21. IBM tried to factor 35 in 2019 [PDF] but essentially failed – thanks to rampant qubit errors, the algorithm succeeded only 14 percent of the time.

Researchers affiliated with Shanghai University claim to have used a quantum computer from D-Wave – which specializes in quantum annealing machines tuned for specific optimization problems – to factor a 2,048-bit RSA integer.

But according to Gutmann and Neuhaus, the RSA number evaluated was the product of two prime factors that were too close together.

The number, like a parlor magician’s deck stacked for a card trick, was chosen to cooperate. As the computer scientists explain in their paper: “Quantum factorization is performed using sleight-of-hand numbers that have been selected to make them very easy to factorize using a physics experiment and, by extension, a VIC-20, an abacus, and a dog.”

“Since n (public key) = p * q, the square root of n will give you p and q to within one or two bits if there’s only one or two bits difference between them,” Gutmann told The Register. “This is why standards for RSA, like FIPS 186 which the paper references, require that they differ by at least 100 bits, i.e. that they’re 2^100 (1.3 x 10^30, which Google tells me is called a nonillion) or more apart so you can’t get an approximation to them using a square root operation.”
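In other words, a modulus built from near-identical primes practically factors itself, no quantum hardware required. Here’s an illustrative Python sketch of that square-root attack – essentially Fermat’s factorization method; the consecutive demo primes are our choice, generated with sympy, not the Shanghai team’s values:

```python
# Illustrative only: Fermat-style factorization, which is fast exactly
# when p and q are close together -- the "sleight-of-hand" case Gutmann
# describes. The minimum prime-gap rule in FIPS 186 that he cites
# exists precisely to defeat this shortcut.
from math import isqrt
from sympy import nextprime   # used only to generate the demo primes

def factor_close_primes(n: int) -> tuple[int, int]:
    a = isqrt(n)
    if a * a < n:
        a += 1                      # start at ceil(sqrt(n))
    while True:
        b2 = a * a - n              # hunt for a^2 - n = b^2 ...
        b = isqrt(b2)
        if b * b == b2:
            return a - b, a + b     # ... because then n = (a - b)(a + b)
        a += 1

# Two consecutive -- hence absurdly close -- 512-bit primes (our choice
# for the demo, not the values from the Shanghai paper):
p = int(nextprime(2**511))
q = int(nextprime(p))
n = p * q                           # a 1,024-bit "RSA" modulus
print(factor_close_primes(n) == (p, q))   # True, in milliseconds
```

With the much larger prime gap that FIPS 186 actually mandates at real RSA key sizes, the search in that loop blows back up to astronomically infeasible lengths.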

An analog in the AI world would be touting the benchmark testing prowess of an AI model trained on the questions in benchmark tests.

Trevor Lanting, chief development officer at D-Wave, told The Register: “Based on our assessment, this research does not represent a new fundamental breakthrough in capability, it’s an exploration of some previous work in using annealing QC to factor small numbers. The research explores factoring capability, which we’ve long said is a problem set that both annealing and gate model quantum systems could address.

“Breaking modern encryption would require quantum processors many orders of magnitude larger than today’s scale: there will be no threat to encryption for many years. Moreover, there are post-quantum encryption protocols available. D-Wave does not specifically focus on cryptography, but our technology has been used to power intrusion and threat detection applications.”

Amid claims of “quantum supremacy” by Google, Microsoft’s disputed Majorana breakthrough, University of Illinois Chicago computer science professor Daniel Bernstein’s arguments that quantum computers shouldn’t be written off, and University of Texas computer scientist Scott Aaronson’s view that quantum computing “is on the threshold of becoming real,” Gutmann remains skeptical that anyone will be doing any meaningful code cracking with “physics experiments” any time soon.

The Register asked Gutmann to elaborate on when the implausibility of quantum cryptanalysis became apparent.

“There wasn’t really any specific time, although it has become more and more obvious over time, with the failure of any genuine (non-sleight-of-hand) quantum cryptanalysis to appear, that it’s about as real as fusion-powered electricity too cheap to meter and all the other decades-old tech pipe dreams,” Gutmann explained.

“I’m an empirical gnostic, and with standard crypto that works well, it’s based on mathematics and engineering, we can look at the math and look at the engineering (computing power and so on) and draw a line through the data points on a graph and say ‘this will be good until about this time.’

“PQC on the other hand isn’t mathematics or engineering, it’s augury: ‘A great machine shall arise, and it will cast aside all existing cryptography, there shall be Famine, Plague, War, and a long arable field.’

“The Bollocks talk drew a line through the two PQC data points we have which indicate that we’d get to the same level of code breaking that we have today with standard computers in about 2,000 years’ time, but even those data points are from sleight-of-hand factorizations, not legitimate applications of Shor’s algorithm to recover two unknown factors as needed to break RSA (this is why in the paper we suggest evaluation criteria for quantum cryptanalysis claims that should be resistant to sleight-of-hand tricks).

“In practice we have zero data points, which is pretty good evidence that we’re not getting anywhere with physics experiment-based cryptanalysis.”

Gutmann added as an aside that we also have the same number of data points for faster-than-light space travel, Star Trek-style transporters, and any number of other high-tech dreams.

Asked whether his skepticism of PQC extends to quantum computers in general, Gutmann said the Australian Strategic Policy Institute had already addressed the matter better than he could.

We inquired how IT security professionals should interpret the move to “Post-Quantum Cryptography” and whether there’s anything to be gained from the transition.

“No, in fact there’s a lot to be lost,” Gutmann replied. “We currently have a multibillion-dollar global cybercrime industry that’s built on the failure of encryption to provide the protection that it’s supposed to, and instead of fixing that problem we’re investing a vast amount of effort into swapping out our crypto for new stuff that’s inefficient and difficult to work with and that offers no more protection than the old stuff (there’s a reason why it had been ignored for decades before quantum cryptanalysis came along to give it a reason to exist, it’s really not very practical or usable). Switching to PQC is just a distraction from having to fix the actual hard problem.” ®