Project Natick - Microsoft's underwater data center
The Project Natick team (left to right) who created Microsoft's underwater data center: Eric Peterson, Spencer Fowers, Norm Whitaker, Ben Cutler and Jeff Kramer. Microsoft

It may not be as deep as Jules Verne envisioned in his iconic work "20,000 Leagues Under The Sea," but Microsoft is embarking on a deep sea adventure by placing data centers at the bottom of the ocean in a bid to save money — and the environment.

In Verne's book, the figure of 20,000 refers to the distance traveled under the water, not depth. The author does, however, mention a depth of four leagues — about 16 km (10 miles), the deepest the submarine in Verne's novel travels below sea level — and even that is far deeper than anything Microsoft is planning with Project Natick.

Microsoft's vision is to place self-contained data centers hundreds of feet below sea level, removing one of the biggest costs of running these operations on land: air conditioning. By using the naturally cold environment of the ocean, Microsoft hopes to continue boosting the adoption of cloud computing by businesses around the globe.

"The vision of operating containerized datacenters offshore near major population centers anticipates a highly interactive future requiring data resources located close to users," Microsoft says on its Project Natick website. "Deepwater deployment offers ready access to cooling, renewable power sources, and a controlled environment."

According to figures from market research company IDC published last week, worldwide spending on public cloud services is forecast to grow at a 19.4 percent compound annual rate over the next four years, from $70 billion in 2015 to $141 billion in 2019.
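IDC's projection can be sanity-checked with the standard compound-annual-growth-rate formula, CAGR = (end/start)^(1/years) − 1. A minimal sketch — note it uses the rounded dollar figures quoted above, so the result lands slightly below the reported 19.4 percent:

```python
# Sanity-check IDC's projected growth in public cloud spending.
# Dollar figures are the rounded values quoted in the article.
start = 70.0   # billions USD, 2015
end = 141.0    # billions USD, 2019
years = 4      # 2015 -> 2019

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~19.1%, close to IDC's reported 19.4%
```

The small gap from the headline 19.4 percent comes from rounding in the endpoint figures.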

Microsoft, which is competing with Amazon, IBM and Google for enterprise customers with its Azure public cloud offering, began Project Natick in 2014, with the first data center being deployed in August last year. Dubbed “Leona Philpot” after a character from the popular Halo video game series, the capsule — which was filled with pressurized nitrogen — featured a single rack of servers and heat exchangers clamped to the hull in order to regulate the temperature inside. The capsule was sunk off the California coast near San Luis Obispo and operated for more than 100 days at a depth of 30 feet.

While the initial trial drew its power through a cable from shore, the company is already looking at powering the capsules with renewable energy, making them completely self-sustaining. Microsoft is now planning trials near existing wave energy generation sites.

While the environmental and economic benefits of underwater data centers are obvious, another big benefit of launching data centers off the coast would be reduced latency — the time it takes data to travel between its source and destination. “Half of the world’s population lives within 200km of the ocean so placing datacenter offshore increases the proximity of the datacenter to the population dramatically reducing latency and providing better responsiveness,” Microsoft said.
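Microsoft's 200 km figure can be made concrete with a back-of-the-envelope propagation-delay calculation. A rough sketch, assuming light in optical fiber travels at about two-thirds of the speed of light in a vacuum (real-world latency adds routing, switching and server processing time on top of this floor):

```python
# Rough one-way propagation delay over fiber for a user 200 km
# from an offshore datacenter. This is only the physical floor;
# actual latency includes routing and processing overhead.
SPEED_OF_LIGHT_KM_S = 299_792.458
FIBER_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3  # ~2/3 c in glass

distance_km = 200
one_way_ms = distance_km / FIBER_SPEED_KM_S * 1000
print(f"One-way propagation delay: {one_way_ms:.2f} ms")  # ~1 ms
```

On that basis, a datacenter 200 km away contributes only about a millisecond each way, which is why proximity to population centers matters so much for responsiveness.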

The Redmond-based company foresees these capsules — which are fully recyclable — typically remaining underwater for five years at a time. They would then be brought to the surface, have the servers replaced, and sunk again for another five years. The entire lifespan of the capsule is estimated to be 20 years, after which it would be recycled and a new capsule would take its place.

Microsoft says that Moore's Law — which predicted that the number of transistors on computer chips would double every two years — is coming to an end, and that the update cadence of server hardware will therefore slow, potentially allowing its underwater capsules to remain untouched and out of sight for up to a decade. "We see this as an opportunity to field long-lived, resilient datacenters that operate 'lights out' — nobody on site — with very high reliability for the entire life of the deployment, possibly as long as 10 years," Microsoft said.
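The reasoning rests on simple doubling arithmetic: at Moore's Law's historical cadence, a sealed capsule left untouched for a decade would fall behind current chips by a factor of 2^(10/2) = 32 in transistor count, which is why a ten-year deployment only makes sense if that cadence slows. A quick illustration:

```python
# How far behind would sealed hardware fall under classic
# Moore's Law scaling (transistor count doubling every 2 years)?
doubling_period_years = 2
deployment_years = 10

factor = 2 ** (deployment_years / doubling_period_years)
print(f"Transistor-count gap after {deployment_years} years: {factor:.0f}x")  # 32x
```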