Underwater datacentres on the menu as Microsoft announces success of Northern Isles

Earlier this summer, marine specialists reeled up a shipping-container-size datacentre coated in algae, barnacles, and sea anemones from the seafloor off Scotland’s Orkney Islands.

The retrieval launched the final phase of a years-long effort that proved the underwater datacentre concept is not only feasible but also logistically, environmentally, and economically practical.

Microsoft’s Project Natick team deployed the Northern Isles datacentre 117 feet deep to the seafloor in spring 2018. For the next two years, team members tested and monitored the performance and reliability of the datacentre’s servers.

The team hypothesized that a sealed container on the ocean floor could provide ways to improve the overall reliability of datacentres. On land, corrosion from oxygen and humidity, temperature fluctuations, and bumps and jostles from people replacing broken components are all variables that can contribute to equipment failure.

The Northern Isles deployment confirmed their hypothesis, which could have implications for datacentres on land.

Lessons learned from Project Natick are also informing Microsoft’s datacentre sustainability strategy around energy, waste, and water, according to Ben Cutler, a project manager in Microsoft’s Special Projects research group who leads Project Natick. What is more, he added, the proven reliability of underwater datacentres has prompted discussions with a Microsoft team in Azure that is looking to serve customers who need to deploy and operate tactical and critical datacentres anywhere in the world.

“We are populating the globe with edge devices, large and small,” William Chappell, vice president of mission systems for Azure, said. “To learn how to make datacentres reliable enough not to need human touch is a dream of ours.”

Proof of concept

The underwater datacentre concept splashed onto the scene at Microsoft in 2014 during ThinkWeek, an event that gathers employees to share out-of-the-box ideas. The concept was considered a potential way to provide lightning-quick cloud services to coastal populations and save energy.

More than half the world’s population lives within 120 miles of the coast. By putting datacentres underwater near coastal cities, data would have a short distance to travel, leading to fast and smooth web surfing, video streaming and game playing.
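The latency benefit of proximity is easy to see with a rough back-of-the-envelope estimate. The sketch below is illustrative, not from the article: it assumes light travels at roughly two-thirds of its vacuum speed in optical fibre (about 200,000 km/s), which sets a theoretical floor on round-trip time before any routing or processing overhead.

```python
# Rough estimate of the minimum round-trip fibre latency as a function
# of distance. Assumes signal speed in fibre of ~200,000 km/s (about
# 2/3 of c); real latency adds routing and processing overhead on top.

SPEED_IN_FIBRE_KM_PER_MS = 200.0  # ~200,000 km/s expressed in km per millisecond

def round_trip_ms(distance_km: float) -> float:
    """Theoretical minimum round-trip time over fibre, in milliseconds."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_PER_MS

# A coastal datacentre ~200 km (about 120 miles) away versus a
# warehouse datacentre ~2,000 km inland:
print(round_trip_ms(200))    # 2.0 ms
print(round_trip_ms(2000))   # 20.0 ms
```

Even this idealised floor shows an order-of-magnitude difference, which matters for interactive workloads such as gaming and video conferencing.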

The consistently cool subsurface seas also allow for energy-efficient datacentre designs. For example, they can leverage heat-exchange plumbing such as that found on submarines.

Microsoft’s Project Natick team proved the underwater datacentre concept was feasible during a 105-day deployment in the Pacific Ocean in 2015. Phase II of the project included contracting with marine specialists in logistics, ship building and renewable energy to show that the concept is also practical. “We are now at the point of trying to harness what we have done as opposed to feeling the need to go and prove out some more,” Cutler said. “We have done what we need to do. Natick is a key building block for the company to use if it is appropriate.”

Algae, barnacles, and sea anemones

The Northern Isles underwater datacentre was manufactured by Naval Group and its subsidiary Naval Energies, experts in naval defence and marine renewable energy. Green Marine, an Orkney Island-based firm, supported Naval Group and Microsoft on the deployment, maintenance, monitoring and retrieval of the datacentre, which Microsoft’s Special Projects team operated for two years.

The Northern Isles was deployed at the European Marine Energy Centre, a test site for tidal turbines and wave energy converters. Tidal currents there travel up to nine miles per hour at peak intensity and the sea surface roils with waves that reach more than 60 feet in stormy conditions. The deployment and retrieval of the Northern Isles underwater datacentre required atypically calm seas and a choreographed dance of robots and winches that played out between the pontoons of a gantry barge. The procedure took a full day on each end.

The Northern Isles was gleaming white when deployed. Two years underwater provided time for a thin coat of algae and barnacles to form, and for sea anemones to grow to cantaloupe size in the sheltered nooks of its ballast-filled base.

“We were pretty impressed with how clean it was, actually,” said Spencer Fowers, a principal member of technical staff for Microsoft’s Special Projects research group. “It did not have a lot of hardened marine growth on it; it was mostly sea scum.”

Power wash and data collection

Once it was hauled up from the seafloor and prior to transportation off the Orkney Islands, the Green Marine team power washed the water-tight steel tube that encased the Northern Isles’ 864 servers and related cooling system infrastructure. The researchers then inserted test tubes through a valve at the top of the vessel to collect air samples for analysis at Microsoft headquarters in Redmond, Washington.

“We left it filled with dry nitrogen, so the environment is pretty benign in there,” Fowers said.

The question, he added, is how gases that are normally released from cables and other equipment may have altered the operating environment for the computers.

The cleaned and air-sampled datacentre was loaded onto a truck and driven to Global Energy Group’s Nigg Energy Park facility in the North of Scotland. There, Naval Group unbolted the endcap and slid out the server racks as Fowers and his team performed health checks and collected components to send to Redmond for analysis.

Among the components crated up and sent to Redmond are a handful of failed servers and related cables. The researchers think this hardware will help them understand why the servers in the underwater datacentre are eight times more reliable than those on land.

“We are like, ‘Hey this looks really good,’” Fowers said. “We have to figure out what exactly gives us this benefit.” The team hypothesizes that the atmosphere of nitrogen, which is less corrosive than oxygen, and the absence of people to bump and jostle components, are the primary reasons for the difference. If the analysis proves this correct, the team may be able to translate the findings to land datacentres.

“Our failure rate in the water is one-eighth of what we see on land,” Cutler said. “I have an economic model that says if I lose so many servers per unit of time, I am at least at parity with land,” he added. “We are considerably better than that.”
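Cutler’s economic model is not published, but the comparison he describes can be sketched in a few lines. In the hypothetical example below, the land-side annual failure rate and the break-even logic are illustrative assumptions, not Microsoft figures; only the one-eighth factor and the 864-server count come from the article.

```python
# Hypothetical sketch of the reliability comparison Cutler describes.
# LAND_ANNUAL_FAILURE_RATE is an assumed figure for illustration only;
# the 1/8 factor and 864-server count are reported in the article.

LAND_ANNUAL_FAILURE_RATE = 0.04   # assumption: 4% of land servers fail per year
UNDERWATER_FACTOR = 1 / 8         # observed: 1/8 the land failure rate
SERVERS = 864                     # servers in the Northern Isles vessel
YEARS = 5                         # lights-out deployment lifetime

def expected_failures(annual_rate: float, servers: int, years: int) -> float:
    """Expected cumulative server failures over the deployment."""
    return annual_rate * servers * years

land = expected_failures(LAND_ANNUAL_FAILURE_RATE, SERVERS, YEARS)
sea = expected_failures(LAND_ANNUAL_FAILURE_RATE * UNDERWATER_FACTOR, SERVERS, YEARS)
print(round(land, 1), round(sea, 1))  # 172.8 vs 21.6 expected failures
```

Under these assumed numbers, a sealed five-year deployment would lose roughly 22 servers instead of roughly 173, which is why failed units can simply be taken offline rather than replaced.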

Energy, waste, and water

Other lessons learned from Project Natick are already informing conversations about how to make datacentres use energy more sustainably, according to the researchers. For example, the Project Natick team selected the Orkney Islands for the Northern Isles deployment in part because the grid there is supplied 100 per cent by wind and solar as well as experimental green energy technologies under development at the European Marine Energy Centre.

“We have been able to run really well on what most land-based datacentres consider an unreliable grid,” Fowers said. “We are hopeful that we can look at our findings and say maybe we don’t need to have quite as much infrastructure focused on power and reliability.”

Cutler is already thinking of scenarios such as co-locating an underwater datacentre with an offshore windfarm. Even in light winds, there would likely be enough power for the datacentre. As a last resort, a powerline from shore could be bundled with the fibre optic cabling needed to transport data.

Other sustainability related benefits may include eliminating the need to use replacement parts. In a lights-out datacentre, all servers would be swapped out about once every five years. The high reliability of the servers means that the few that fail early are simply taken offline.

In addition, Project Natick has shown that datacentres can be operated and kept cool without tapping freshwater resources that are vital to people, agriculture and wildlife, Cutler noted.

“Now Microsoft is going down the path of finding ways to do this for land datacentres,” he said.

Go anywhere

Early conversations about the potential future of Project Natick centred on how to scale up underwater datacentres to power the full suite of Microsoft Azure cloud services, which may require linking together a dozen or more vessels the size of the Northern Isles.

“As we are moving from generic cloud computing to cloud and edge computing, we are seeing more and more need to have smaller datacentres located closer to customers instead of these large warehouse datacentres out in the middle of nowhere,” Fowers said.

That’s one of the reasons Chappell’s group in Azure is keeping an eye on the progress of Project Natick, including tests of post-quantum encryption technology that could secure data from sensitive and critical sectors. The ability to protect data is core to the mission of Azure in multiple industries.

“The fact that they were very quickly able to deploy it and it has worked as long as it has and it has the level of encryption on the signals going to it combines to tell a pretty compelling vision of the future,” Chappell said.

