
Is it a good idea to run data centers underwater?

Adriano MarquesNovember 24, 2020

Running a large-scale data center poses major infrastructure challenges. One is providing enough electric power to keep the facility running: a data center with tens of thousands of servers consumes roughly 10 megawatts. Servers not only consume vast amounts of energy; they also generate a lot of heat.
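A quick back-of-envelope check puts those figures in perspective. The server count, per-server wattage, and cooling overhead below are illustrative assumptions, not numbers from the article:

```python
# Rough sanity check of the ~10 MW figure quoted above.
# Assumptions (illustrative): 25,000 servers drawing ~300 W each,
# plus cooling and other overhead expressed as PUE
# (power usage effectiveness = total facility power / IT power).

servers = 25_000
watts_per_server = 300   # typical server under load (assumption)
pue = 1.4                # a middling PUE for a conventional facility

it_power_mw = servers * watts_per_server / 1e6
total_power_mw = it_power_mw * pue

print(f"IT load:        {it_power_mw:.1f} MW")      # 7.5 MW
print(f"Total facility: {total_power_mw:.1f} MW")   # 10.5 MW
```

Note how much of the total is cooling overhead: with a PUE of 1.4, roughly 3 MW of the facility's draw goes to non-IT load, which is exactly the cost that free cooling (Arctic air, or seawater) attacks.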

Without cooling, the air inside a data center becomes sweltering, and servers cannot function reliably at high temperatures. The cooling solution must be both highly effective at carrying heat away from the hardware and efficient in its own power consumption; otherwise it substantially increases operating costs.

The servers also take up lots of space and need to be housed in a suitable facility. And finally, you need to hire people to service the equipment.

There are various creative solutions to the challenge of efficiently cooling data center hardware. One is using cold Arctic air: data centers have been built near the Arctic Circle in Iceland and Scandinavia, where the facility can effectively be cooled by simply opening some windows. Another attractive feature of Iceland is its abundant sources of relatively cheap electric power. However, building data centers in remote regions introduces latency concerns.

The demand for compute power keeps growing as more and more workloads run in the cloud and at the edge of the network. One way to meet this demand is to build many smaller data centers closer to the clients. About half of the world's population lives in coastal areas, so locating data centers in coastal waters could reduce network latency.
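The latency argument comes down to propagation delay: light in optical fiber travels at roughly two-thirds the speed of light in vacuum, about 200,000 km/s, so distance translates directly into milliseconds. The distances below are illustrative assumptions, not figures from the article:

```python
# One-way network latency from fiber propagation delay alone
# (ignoring routing, queuing, and processing delays).
# Light in fiber covers ~200,000 km/s, i.e. ~200 km per millisecond.

FIBER_KM_PER_MS = 200.0

def propagation_delay_ms(distance_km: float) -> float:
    """One-way propagation delay in milliseconds over optical fiber."""
    return distance_km / FIBER_KM_PER_MS

# A coastal pod 100 km from its users vs. a remote Arctic
# facility 3,000 km away (both distances are assumptions):
print(f"coastal pod: {propagation_delay_ms(100):.1f} ms one-way")   # 0.5 ms
print(f"remote site: {propagation_delay_ms(3000):.1f} ms one-way")  # 15.0 ms
```

Real round-trip times are higher once routing and protocol overhead are added, but the proportional advantage of a nearby coastal pod remains.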

In 2014, Microsoft launched a research effort named Project Natick with the goal of eventually deploying production-grade, sustainable underwater data centers. The team designed a watertight, tube-shaped shell; racks of servers were installed inside, and the capsule was lowered to the seabed a mile off the US Pacific coast for about four months. Heat exchangers attached to the outer shell cooled the servers inside.

To make its underwater data centers fully sustainable, Microsoft plans to power them with wind turbines and tidal electric generators at the surface. In Phase 2 of Project Natick, 864 servers on twelve racks were installed in a shell 40 feet long and 9 feet in diameter. After the servers were installed, the air inside the capsule was pumped out and replaced with dry nitrogen to prevent corrosion of the hardware and eliminate the risk of fire.

The new capsule, named Northern Isles, was submerged 117 feet below the surface near the European Marine Energy Centre, a research center focused on generating power from ocean waves and tides, located in the Orkney Islands off the northern tip of Great Britain. Northern Isles remained under water for two years and resurfaced in July 2020. Microsoft announced that its undersea data center was up to eight times more reliable than comparable data centers on land. The researchers hypothesize that the dry nitrogen atmosphere inside the pod and the absence of accidental damage by service technicians contributed to the improved hardware reliability.

In Phase 3 of Project Natick, Microsoft plans to build a docking station that would house twelve data pods and be lowered and raised using ballast air tanks. The pods are designed for a 20-year service life; in future deployments, Microsoft plans to surface them every five years to upgrade the hardware. In this mode of operation, the servers would not require a warranty, which could lower hardware costs.

There are security concerns about data stored in a capsule resting in ocean waters. A large data pod would probably prove difficult to physically snatch away, but an adversary with sufficient resources could attempt to destroy an underwater data center, and someone wishing merely to cause a disturbance could attempt to damage the connecting cables onshore. Would this worry the owners of the data stored underwater? Would state governments feel safe storing critically important data that way? To address some of these data security concerns, research is also under way in post-quantum encryption technology.

With all these concerns and challenges in mind, the results of the Project Natick experiment are impressive and promising, and in the future we might see substantial amounts of data stored and processed in fully sustainable, lights-out underwater data centers.