Can floating data centres meet AI’s huge energy demand?


A prototype of Panthalassa’s floating data centre (Image: Panthalassa)

The data centres powering the AI boom already use more electricity than some small countries, and the International Energy Agency projects that their demand could reach 945 terawatt-hours a year – more than Japan’s entire electricity consumption – by 2030. AI is so power-hungry that companies are exploring the idea of putting data centres in space, where they could draw on constant solar energy. But one start-up thinks the solution is here on Earth – just not on land. Panthalassa is building autonomous floating data centres that will put computing power out in the middle of the ocean.

The Oregon-based company, which announced $140 million in funding last week, says its platforms could bypass overwhelmed electrical grids and deliver carbon-free computing in international waters. But beyond the technical and engineering challenges involved, it is unclear whether moving computing power offshore would actually ease data centres’ biggest bottlenecks – doing so may just replace familiar problems with far more expensive ones.


“Wave power is an old technology and it can work, but the ocean is a harsh environment,” says Jonathan Koomey, a former researcher at Lawrence Berkeley National Laboratory in California and an expert in data centre energy consumption. “The salt and the waves are effective at causing trouble for machinery.”

Shaped like a golf ball sitting on a tee, Panthalassa’s floating data centres are 85 metres tall – about the height of Big Ben – and made of plate steel. They are hauled into the water by a boat, then self-propel to their designated locations. There, they generate their own electricity and run AI workloads without a grid connection, emissions or engines.

The “tee” portion of the platform contains a long tube that is open at the bottom. As waves lift and drop the structure, sea water is pushed through the tube and up into the “ball” portion, which is hollow and mostly filled with air to make it float. The moving water spins turbines that generate electricity, which powers onboard graphics processing units, other computing hardware and satellite communications equipment.


Ordinary data centres use vast amounts of water to cool their AI hardware. Since Panthalassa’s servers are housed in sealed modules below the water surface, the container wall itself will act as a heat exchanger, with heat dissipating into the surrounding cold water. Ocean currents and mixing will disperse the waste heat, though potential effects on nearby marine ecosystems remain unclear.

Panthalassa is attempting something few data centre operators have tried before: running critical computing infrastructure beyond the easy reach of human technicians. “Our data consistently names power and networking as the top two root causes of data centre outages,” says Jacqueline Davis at the Uptime Institute, a global authority for data centre performance. “These can each be uniquely difficult to manage in a remote environment with little to no staff.” Panthalassa didn’t respond to New Scientist’s questions before this article went to press.

According to Davis, automation in data centre environments is largely limited to monitoring and analytics, with human physical intervention still being quite common, “especially in abnormal incidents, like when cooling compressors require manual restarts”.


This could prove to be one of Panthalassa’s biggest challenges. Latency will be another. The data processed in the floating platforms will be transmitted back to users on land by Starlink satellites, which offer limited bandwidth and higher latency compared with fibre-optic cables. This makes the nodes most practical for AI workloads that receive a job, let it run for hours or days, then return the result – like training advanced models or running scientific simulations. But most AI applications used by consumers, like chatbots and search assistants, depend on fast response times and constant network communication.

“Today’s power constraints are landing most acutely on the large AI training data centres,” says Davis. Panthalassa’s approach will be more viable if the total power needs of running trained AIs grow enough to rival those of AI training, she says. Until that happens, floating data centres are likely to have a hard time competing with those on land.

Although Panthalassa’s technology is unique, its idea of moving data centres off land isn’t. Aikido Technologies is developing floating data centres integrated into offshore wind platforms, and Mitsui O.S.K. is studying ship-based computing systems powered by marine energy sources. Earlier experiments, including Microsoft’s underwater Project Natick, tested whether placing servers in or near water could improve cooling and efficiency.

For now, though, offshore computing remains largely experimental. Beyond the engineering challenges, companies must still prove that ocean-based systems can compete economically with conventional data centres connected to power grids and fibre networks. “There are economies of scale to building data centres, which is why they are getting so large nowadays,” says Koomey. “They build big ones to spread fixed costs over more compute. It’s a lot harder and more risky to build big compute installations on the water than on land.”

Topics:

  • oceans
  • AI