Can you really ignore the number of quantum processing steps needed for Shor’s algorithm? [migrated]

Answers to the question RSA key length vs. Shor’s algorithm suggest that, for example, 2048-bit RSA encryption would be trivially broken by a 4099-qubit quantum computer running Shor’s algorithm (the best known implementation of the algorithm requiring 2n+3 qubits).

Is this really true? If I’ve understood correctly, the number of gates (logical quantum operations) needed would be around log(2^2048)² × log(log(2^2048)) × log(log(log(2^2048))), which is roughly 2.9×10⁷. Considering that not even classical computers can push a single piece of input data through a chain of 2.9×10⁷ gates in negligible time, it really doesn’t make sense to assume that such a high number of gates could be executed by a quantum computer in trivial time.
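
As a rough sanity check, both figures can be reproduced in a few lines of Python; natural logarithms are assumed in the gate-count formula, since that is what yields the ≈2.9×10⁷ value (with base-2 logarithms it would come out around 1.6×10⁸):

    import math

    n = 2048                            # RSA modulus size in bits
    qubits = 2 * n + 3                  # qubit count of the 2n+3 construction
    m = n * math.log(2)                 # ln(2^2048)
    gates = m**2 * math.log(m) * math.log(math.log(m))   # ~ n^2 * log n * log log n

    print(qubits)                       # 4099
    print(f"{gates:.1e}")               # ~2.9e+07 sequential gate operations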

I would assume that for a quantum computer to execute Shor’s algorithm, it would need to (logically) pass one input through all those gates, which would be analogous to a classical computer executing enough code to push one 2048-bit input through 2.9×10⁷ gates. Because information cannot travel faster than the speed of light and gates have non-zero dimensions, this cannot happen in trivial time. And if you use photons as qubits in the quantum computer, the wavelength probably sets a minimum size for a gate regardless of manufacturing abilities.
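
To put hypothetical numbers on that, here is a simple latency estimate; the per-gate times and the 10 µm of signal path per gate are assumptions for illustration, not measured hardware figures:

    C = 3.0e8                  # speed of light, m/s
    DEPTH = 2.9e7              # sequential gates from the estimate above

    # Assumed per-gate execution times for two hypothetical technologies
    for name, gate_time in [("assumed 100 ns/gate", 100e-9),
                            ("assumed 10 us/gate", 10e-6)]:
        print(f"{name}: ~{DEPTH * gate_time:.0f} s for one run")

    # Pure light-travel bound, assuming 10 micrometres of path per gate
    path_per_gate = 10e-6      # metres (assumption)
    print(f"light-travel bound: ~{DEPTH * path_per_gate / C:.1e} s")

Even under these assumptions a single run takes seconds to minutes, so the circuit depth clearly isn’t free, although the speed-of-light term itself appears tiny compared with plausible gate times.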

And if you need any error correction between the gates, that will require extra space and hence increase latency, too.
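
A very rough sketch of that overhead, using textbook surface-code approximations (≈2·d² physical qubits per logical qubit, one logical time step per d syndrome rounds); the code distance and cycle time below are pure assumptions:

    d = 27                        # assumed code distance
    cycle = 1e-6                  # assumed syndrome-measurement cycle, seconds
    logical_qubits = 4099

    physical_qubits = logical_qubits * 2 * d**2        # ~6.0 million physical qubits
    logical_step = d * cycle                           # ~27 us per logical time step

    print(physical_qubits, f"{logical_step * 1e6:.0f} us")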

In addition, if I’ve understood correctly, to actually factor big numbers with Shor’s algorithm you need a classical computer to generate a random guess, and Shor’s algorithm will then use that guess to (maybe) emit the data needed to compute the factors. How many guesses would you actually need on average to factor the numbers used in 2048-bit RSA?
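
If I understand the structure correctly, the classical outer loop looks roughly like the sketch below; find_order stands in for the quantum order-finding subroutine (the only quantum part), and brute_force_order is just a toy stand-in usable for tiny moduli:

    import math
    import random

    def shor_classical_wrapper(N, find_order):
        """Classical outer loop of Shor's algorithm; find_order(a, N) is the
        quantum order-finding subroutine."""
        while True:
            a = random.randrange(2, N)            # the random guess
            g = math.gcd(a, N)
            if g > 1:
                return g                          # lucky guess already shares a factor
            r = find_order(a, N)                  # quantum step: order of a modulo N
            if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
                p = math.gcd(pow(a, r // 2, N) - 1, N)
                if 1 < p < N:
                    return p
            # otherwise pick a new random a and run the quantum part again

    def brute_force_order(a, N):
        """Toy classical stand-in for the quantum subroutine (tiny N only)."""
        r, x = 1, a % N
        while x != 1:
            x = (x * a) % N
            r += 1
        return r

    print(shor_classical_wrapper(15, brute_force_order))   # prints 3 or 5

The textbook analysis gives a success probability of at least 1/2 per random guess, so the expected number of guesses should be small and essentially independent of key size, but each failed guess costs another full quantum run.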

Has there been research on the potential practical runtime of a large physical quantum computer executing Shor’s algorithm on big numbers? Does it really support the interpretation that you can simply ignore the processing time regardless of the size of the numbers?


Breaking RSA through Insufficiently Random Primes

Basically, the SafeZone library doesn’t sufficiently randomize the two prime numbers it used to generate RSA keys. They’re too close to each other, which makes them vulnerable to recovery.

There aren’t many weak keys out there, but there are some:

So far, Böck has identified only a handful of keys in the wild that are vulnerable to the factorization attack. Some of the keys are from printers from two manufacturers, Canon and Fujifilm (originally branded as Fuji Xerox). Printer users can use the keys to generate a Certificate Signing Request. The creation date for all the weak keys was 2020 or later. The weak Canon keys are tracked as CVE-2022-26351…
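
The “too close together” weakness is the classic precondition for Fermat’s factorization method, which recovers both primes almost instantly when they share most of their high-order bits; a minimal sketch with toy values:

    import math

    def fermat_factor(N):
        """Fermat factorization: write N = a^2 - b^2 = (a - b)(a + b).
        When p and q are close, a stays near sqrt(N) and the loop ends quickly."""
        a = math.isqrt(N)
        if a * a < N:
            a += 1
        while True:
            b2 = a * a - N
            b = math.isqrt(b2)
            if b * b == b2:
                return a - b, a + b
            a += 1

    # Toy example: two deliberately close primes (illustrative values only)
    p, q = 1000003, 1000033
    print(fermat_factor(p * q))        # (1000003, 1000033)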
