Quantum tunneling could drive random DNA mutations, says new study

DNA is known to mutate regularly, for better or worse, driving both evolution and disease. Researchers at the University of Surrey have now found evidence that some of these spontaneous mutations could be caused by the spooky realm of quantum mechanics…

One-way superconducting diode has massive implications for electronics

A TU Delft team has demonstrated a one-way superconductor that gives zero resistance in one direction but blocks current completely in the other. The effect, long thought impossible, heralds a 400x leap in computing speed and huge energy savings.

Can you really ignore number of quantum processing steps needed for Shor’s algorithm? [migrated]

Answers to the question "RSA key length vs. Shor's algorithm" suggest that, e.g., 2048-bit RSA encryption would be trivially broken by a 4,099-qubit quantum computer running Shor's algorithm (the best known implementation of the algorithm requires 2n+3 qubits).

Is this really true? If I've understood correctly, the number of gates (logical quantum operations) needed would be around log(2^2048)² × log(log(2^2048)) × log(log(log(2^2048))) (natural logarithms), which is roughly 2.9×10⁷. Considering that not even classical computers execute an operation that passes a single piece of input data through 2.9×10⁷ gates, it really doesn't make sense to assume that such a high number of gates could be operated by a quantum computer in trivial time.
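As a sanity check on that figure, here is a short Python sketch of the arithmetic. The ~2.9×10⁷ value only comes out if the logarithms are taken as natural logs, which is the assumption made here:

```python
import math

# Back-of-envelope gate count for Shor's algorithm on a 2048-bit modulus,
# following the n^2 * log(n) * log(log(n))-style estimate in the question.
# Assumption: all logarithms are natural logs (this is what reproduces
# the ~2.9e7 figure; base-2 logs would give a larger number).
n_bits = 2048
ln_N = n_bits * math.log(2)  # ln(2^2048) ≈ 1419.6

gates = ln_N**2 * math.log(ln_N) * math.log(math.log(ln_N))
print(f"estimated gate count: {gates:.2e}")  # roughly 2.9e7
```

With base-2 logarithms the same formula gives roughly 1.6×10⁸, so the exact figure depends on which convention the original estimate intended.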

I would assume that for a quantum computer to execute one step of Shor's algorithm, it would need to pass (logically) one input through all those gates, which would be analogous to a classical computer executing enough code to pass one 2048-bit input through 2.9×10⁷ gates. Because information cannot travel faster than the speed of light and gates have non-zero dimensions, this cannot happen in trivial time. And if you use photons as qubits in the quantum computer, the wavelength probably sets a minimum gate size regardless of manufacturing abilities.
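To put a number on the speed-of-light argument, here is a rough Python sketch. The 1 µm gate pitch is a purely illustrative assumption, not a figure for any real hardware:

```python
# Lower bound on traversal time for a signal crossing 2.9e7 gates in
# series, limited only by the speed of light. The gate pitch is a
# made-up illustrative value; real gate dimensions and switching times
# would differ.
C = 299_792_458          # speed of light in vacuum, m/s
GATES = 2.9e7            # serial gate count from the question's estimate
GATE_PITCH = 1e-6        # assumed physical extent per gate, metres

path_length = GATES * GATE_PITCH   # total serial signal path, metres
latency = path_length / C          # light-travel lower bound, seconds
print(f"{path_length:.0f} m of path, {latency * 1e9:.0f} ns minimum")
```

Under this assumption the pure light-travel bound per pass comes out on the order of a hundred nanoseconds; in practice, actual gate operation times and error-correction overhead would dominate well before this bound matters.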

And if you need any error correction between the gates, that will require extra space and hence increase latency, too.

In addition, if I’ve understood correctly, to actually factor big numbers with Shor’s algorithm you need a classical computer to generate a random guess, and Shor’s algorithm then uses that guess to (maybe) emit the data needed to compute the factors. How many guesses, on average, would you actually need to factor the numbers used in 2048-bit RSA?
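The classical wrapper described here can be sketched in Python, with the quantum order-finding step replaced by a brute-force loop that only works for toy moduli like 15. The standard analysis says each random guess yields a usable order with probability at least 1/2, so only a handful of guesses are needed on average, independent of key size:

```python
import math
import random

def shor_classical_side(N, rng=random.Random(0)):
    """Classical scaffolding of Shor's algorithm. The quantum
    order-finding subroutine is simulated by brute force here, which
    is only feasible for tiny N. Returns (nontrivial factor, guesses)."""
    guesses = 0
    while True:
        guesses += 1
        a = rng.randrange(2, N)
        g = math.gcd(a, N)
        if g > 1:                     # lucky guess: a shares a factor with N
            return g, guesses
        # "Quantum" step, simulated: find the order r of a modulo N.
        r, x = 1, a % N
        while x != 1:
            x = (x * a) % N
            r += 1
        if r % 2:                     # odd order: draw a new guess
            continue
        y = pow(a, r // 2, N)
        if y == N - 1:                # trivial square root: new guess
            continue
        return math.gcd(y + 1, N), guesses

factor, tries = shor_classical_side(15)
print(f"found factor {factor} of 15 in {tries} guess(es)")
```

On a real 2048-bit modulus, only the order-finding loop would run on the quantum computer; the guessing, gcd, and post-processing steps stay classical exactly as above.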

Has there been research on the potential practical runtime of a large physical quantum computer executing Shor’s algorithm on big numbers? Does it really support the interpretation that the processing time can simply be ignored regardless of the size of the numbers?


Scientists calculate absolute quantum speed limit for electronics

It often feels like electronics will continue to get faster forever, but at some point the laws of physics will intervene to put a stop to that. Now scientists have calculated the ultimate speed limit – the point at which quantum mechanics prevents mic…

Record-setting hybrid atom array could power quantum computer RAM and CPU

Researchers at the University of Chicago have demonstrated a key technology that could help scale up quantum computers, and used it to create a model with a record-breaking 512 qubits. The team combined atoms of two elements into an array, so that one…

Physicists measure gravitational time warp to within one millimeter

The flow of time isn’t as consistent as we might think – gravity slows it down, so clocks on the surface of Earth tick slower than those in space. Now researchers have measured time passing at different speeds across just one millimeter, the smallest d…

Quasiparticles used to generate millions of truly random numbers a second

Random numbers are crucial for computing, but our current algorithms aren’t truly random. Researchers at Brown University have now found a way to tap into the fluctuations of quasiparticles to generate millions of truly random numbers per second.

Silicon quantum computing surpasses 99% accuracy in three studies

Three teams of scientists from around the world have achieved a major milestone in quantum computing. All three groups demonstrated better than 99 percent accuracy in silicon-based quantum devices, paving the way for practical, scalable quantum compute…

Proof of concept verifies physics that could enable quantum batteries

Quantum batteries could one day revolutionize energy storage through what seems like a paradox – the bigger the battery, the faster it charges. For the first time, a team of scientists has now demonstrated the quantum mechanical principle of superabsor…

“Quantum tornadoes” mark crossover from classical to quantum physics

The universe is governed by two sets of seemingly incompatible laws of physics – there’s the classical physics we’re used to on our scale, and the spooky world of quantum physics on the atomic scale. MIT physicists have now observed the moment atoms sw…