Chassis-level immersion cooled solution meets requirements for edge data processing


The global market for edge data centres is expected to more than triple, from $4 billion in 2017 to $13.5 billion in 2024, according to research from PwC. The potential for these smaller, locally sited data centres to reduce latency, overcome intermittent connections, and store and compute data close to the end user is gaining traction with the growth of smart devices, with the pace of adoption hastened by the arrival of 5G.

Edge data centres are smaller facilities located close to the populations they serve that deliver cloud computing resources and cached content to end users. They typically connect to a larger central data centre or multiple data centres. By processing data and services as close to the end user as possible, edge computing allows organisations to reduce latency and improve the customer experience.

Two of the growing use cases for edge data centres are autonomous vehicles and smart cities. Self-driving vehicles can collect, process, and share data in real time, making transportation safer. In smart cities, the real-time gathering and analysis of data on traffic, utilities, and infrastructure allows city officials to respond immediately to problems.

Cooling challenges for edge data centres

Currently, about ten per cent of data is created and processed outside a centralised data centre or cloud; by 2025, this figure will reach 75 per cent, according to Gartner. This growth of smaller, localised processing would appear to be at odds with the sustainability drive being met through efficiencies gained at large-scale facilities. One of the most significant challenges these smaller facilities face is cooling.

“The typical edge deployment may not have the infrastructure in place to adequately cool IT equipment inside a limited number (1 – 20ish) of enclosures (cabinets),” Herb Villa, senior applications engineer, Rittal, says. “While it may be possible to install humidity controls and air filtering systems into the space, if the space even exists, to better manage the environment, the cost and complexity make this an impractical choice.

“A simpler, cost-effective alternative is liquid cooling – the go-to solution for individual rack power densities of 20 – 30kW, or a complete installation of up to 200 – 300kW. These are densities that could far exceed any existing climate control capacity, and with power densities varying from rack to rack (both common in edge deployments). There are several liquid cooling methods out there, and the most effective for the edge are closed loop systems. These systems feature racks and heat exchangers that work exclusively with one another, sitting closer to the equipment than most other methods (except for direct-to-chip) to minimise air flow and improve efficiency.”
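The thresholds quoted above can be read as a simple per-rack sizing check. The sketch below is illustrative only: the 20kW cut-off is taken from the densities Villa mentions, while the function names and structure are our own assumptions, not any vendor tool.

```python
# Illustrative sizing check: flag racks whose power density sits in the
# range where closed-loop liquid cooling becomes the practical choice.
# The 20 kW threshold is drawn from the 20-30 kW figure quoted in the
# article; everything else here is a hypothetical example.

AIR_COOLING_LIMIT_KW = 20.0  # per-rack density beyond which air cooling strains


def racks_needing_liquid(rack_loads_kw):
    """Return indices of racks whose IT load exceeds the air-cooling limit."""
    return [i for i, load in enumerate(rack_loads_kw) if load > AIR_COOLING_LIMIT_KW]


# A small edge deployment with uneven rack densities (common at the edge)
loads = [8.0, 25.0, 12.0, 30.0]
print("Total installation load:", sum(loads), "kW")
print("Racks needing liquid cooling:", racks_needing_liquid(loads))
```

Because edge sites mix light and dense racks, a per-rack check like this matters more than the site total: a 75kW installation may still contain individual racks that no air-based system can handle.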

Adopting a scalable and modular solution

Another option is an integrated chassis-level immersion cooled solution, and this is the path being followed by Iceotope, Avnet and Schneider Electric, who are jointly developing chassis-level immersion cooled data centre solutions. The partnership has recently been joined by Lenovo, which will deploy its ThinkSystem SR670 servers in a highly scalable, GPU-rich, liquid-cooled micro data centre solution.

Sealed at chassis level, the new solution enables artificial intelligence (AI), machine learning (ML) and high performance computing (HPC) workloads to be deployed in close proximity to the location of data generation and use regardless of how harsh or hostile the environment. Integrating the Lenovo ThinkSystem SR670 2U rack server with Iceotope’s Ku:l Chassis eliminates the requirement for any air cooling, delivering improved efficiency in energy consumption. Figures from Iceotope show that 95 per cent of the heat is captured and rejected via an in-rack heat rejection unit (HRU) (5kW solution) or dedicated external HRU (46kW+ scalable solution).
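As a rough illustration of those figures: the 95 per cent capture rate and the 5kW/46kW HRU capacities come from Iceotope, but the calculation below is a sketch of our own, not a vendor sizing tool.

```python
# Rough sketch: split a rack's IT load into liquid-captured heat (rejected
# via the HRU loop) and residual heat to ambient air, using the 95 per cent
# capture figure quoted by Iceotope. Function name and structure are
# illustrative assumptions, not a vendor API.

def heat_split(it_load_kw, capture_fraction=0.95):
    """Return (heat to HRU loop, residual heat to air) in kW."""
    to_liquid = it_load_kw * capture_fraction
    to_air = it_load_kw - to_liquid
    return to_liquid, to_air


# 5 kW configuration with in-rack HRU
liquid, air = heat_split(5.0)
print(f"5 kW rack:  {liquid:.2f} kW to liquid, {air:.2f} kW to air")

# 46 kW scalable configuration with dedicated external HRU
liquid, air = heat_split(46.0)
print(f"46 kW rack: {liquid:.2f} kW to liquid, {air:.2f} kW to air")
```

Even at the larger 46kW scale, only a few kilowatts reach the surrounding air, which is why the sealed chassis removes the need for room-level air cooling.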

“This integrated, immersion cooled solution brings highly intensive and efficient compute capability that can ‘drop into’ applications from the edge to large-scale HPC,” Steven Carlini, VP Innovation and Data Centre, Schneider Electric, said. “The ability to bring these ready-to-deploy liquid cooled solutions to market proves the strength of the partnership.”

According to David Craig, CEO of Iceotope, the infrastructure to provision edge expansion will be installed, where space allows, outside traditional data centres. “We call this the fluid edge,” he adds. “Iceotope is dedicated to ensuring the durability, reliability, efficiency and long-term viability of fluid edge facilities, where air cooled approaches have a limited future. Partnering with Lenovo to bring the Ku:l Micro DC to life has accelerated our capability to provide a proven and warranty-backed, chassis-level immersion cooled HPC design solution to this expanding market.”


