Quantum-inspired algorithm could enable better weather forecasts


Simulating turbulent air flow accurately is vital for weather forecasts (Image: EUMETSAT/ESA)

Quantum-inspired algorithms can simulate turbulent fluid flows on a classical computer much faster than existing tools, slashing computation times from several days on a large supercomputer to just hours on a regular laptop. This could improve weather forecasts and increase the efficiency of industrial processes, say researchers.

Turbulence in liquids or air involves numerous interacting eddies that quickly become so chaotically complex that precise simulation is impossible even for the most powerful computers. Quantum computers promise to improve matters, but even the most advanced machines are currently incapable of anything beyond rudimentary demonstrations.


These turbulence simulations can be simplified by replacing precise calculations with probabilities. But even this approximation leaves scientists with computations that are infeasibly demanding to solve.

Nikita Gourianov at the University of Oxford and his colleagues have now developed a new approach that uses quantum computer-inspired algorithms called tensor networks to represent turbulence probability distributions.

Tensor networks originated in physics and came into common use in the early 2000s. They now offer a promising path to eke out much more performance from existing classical computers before truly useful quantum machines are available.


“The algorithms and the way of thinking comes from the world of quantum simulation, and these algorithms are very close to what quantum computers do,” says Gourianov. “We’re seeing quite a drastic speed-up, both in theory and in practice.”

In just a few hours, the team ran a simulation on a laptop that had previously taken several days on a supercomputer. The new algorithm achieved a 1000-fold reduction in processing demand and a million-fold reduction in memory demand. While this simulation was only a simple test, the same type of problem, at larger scales, underlies weather forecasting, aerodynamic analysis of aircraft and the analysis of industrial chemical processes.

The turbulence problem, which has data in five dimensions, gets extremely difficult without using tensors, says Gunnar Möller at the University of Kent, UK. “Computationally, it’s a nightmare,” he says. “You could maybe do it in limited cases, when you have a supercomputer and are happy to run it for a month or two.”

Tensor networks work by, in effect, reducing the amount of data a simulation requires, drastically cutting the computational power required to run it. The amount and nature of the data removed can be carefully controlled by dialling the level of precision up or down.
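The compression idea can be illustrated with the simplest related technique, a truncated singular value decomposition: keep only the largest components of a data array, with the cut-off acting as the "precision dial" described above. This is a minimal sketch of that general principle, not the paper's actual method, which applies tensor-network decompositions to turbulence probability distributions.

```python
import numpy as np

def compress(data, keep):
    """Low-rank approximation of a 2D array, keeping `keep` singular values.

    `keep` is the precision dial: larger values retain more data and
    give a more accurate (but more expensive) representation.
    """
    u, s, vt = np.linalg.svd(data, full_matrices=False)
    return u[:, :keep] @ np.diag(s[:keep]) @ vt[:keep, :]

# A smooth, structured 100x100 field built from two wave patterns.
# Structured data like this is exactly what compresses well.
x = np.linspace(0, 1, 100)
field = (np.outer(np.sin(2 * np.pi * x), np.cos(2 * np.pi * x))
         + 0.5 * np.outer(np.sin(4 * np.pi * x), np.cos(4 * np.pi * x)))

# Keep just 2 of 100 components: a 50-fold reduction in stored data.
approx = compress(field, keep=2)
error = np.linalg.norm(field - approx) / np.linalg.norm(field)
print(f"relative error with 2 of 100 components: {error:.2e}")
```

Because the example field is built from only two wave patterns, two components reproduce it almost exactly; turbulent flows are far richer, but the same trade-off between retained components and accuracy applies.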


These mathematical tools have already been used in the cat-and-mouse game between quantum computer developers and classical computer scientists. Google announced in 2019 that a quantum processor called Sycamore had achieved “quantum supremacy” – the point at which a quantum computer can complete a task that would be, for all intents and purposes, impossible for ordinary computers.

However, tensor networks running on large clusters of conventional graphics processing units later simulated the same problem in just over 14 seconds, undermining Google's claim. Google has since pulled ahead once more with its new Willow quantum machine.

Large, fault-tolerant quantum computers, once they are built, will be able to run tensor networks at much larger scales and with much greater precision than classical computers, but Möller says he is excited by what might be achieved in the meantime.

“With a laptop, the authors of this paper could beat what’s possible on a supercomputer, just because they have a smarter algorithm,” he says. “If you use this algorithm on a supercomputer, you may go way further than you could using any direct computational approach. It immediately has a tremendous benefit, and I don’t have to wait another 10 years to have the perfect quantum computer.”

Journal reference:

Science Advances DOI: 10.1126/sciadv.ads5990

Topics:

  • quantum computing