As Data Storage Needs Increase, How Do We Reduce Our Carbon Footprint?

PeaSoup

In 2014, the Carbon Trust, working with BT and The Guardian, noted that 92% of the emissions produced by BT’s big data were ‘outside of its control’. By 2020, Nature Communications had published a report on the decline of natural environments and its eerie correlation with the exponential growth of data: from the floppy disk to 40 zettabytes in 30 years.

Whilst we can’t attribute all environmental decline to big data, the association is clear: as data centre numbers increase, so does the energy needed to power them. That power most often comes directly from the grid, and the majority source is still fossil fuels (around 40% in 2022). The impact of data centres on the grid has also become more apparent; in 2022, several housing schemes had to be shelved because the number of data centres nearby left no spare grid capacity for further housing.

Why not build data centres in cold places if they need continual cooling? Because they also need a large power supply and fast connections to move data in and out of the servers that make up the cloud. If the Scottish Highlands are out of the question, sites along motorway networks close to high data-consumption industries (fintech, for example) are the most practical build locations.

Latency increases with distance. Before the first 14 KB of a page even arrives, a request incurs a DNS lookup, a TCP handshake and a secure TLS negotiation, and every additional website asset adds to the delay. Because latency is the most end-user-visible statistic, it is the most important one when deciding where to build a data centre. A user typically evaluates a website’s value proposition within ten seconds, and on mobile devices 53% of visitors abandon a page that takes more than three seconds to load; the value of low latency is measured in billions, not millions.
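As a rough illustration of where those milliseconds go, the minimal sketch below times the DNS lookup, TCP handshake and TLS negotiation for a single HTTPS connection. The hostname example.com is only a placeholder, and this is a quick measurement, not a production benchmark.

```python
import socket
import ssl
import time

host, port = "example.com", 443  # placeholder endpoint, not from the article

t0 = time.perf_counter()
addr = socket.getaddrinfo(host, port)[0][4][0]      # DNS lookup
t1 = time.perf_counter()

sock = socket.create_connection((addr, port))       # TCP handshake
t2 = time.perf_counter()

ctx = ssl.create_default_context()
tls = ctx.wrap_socket(sock, server_hostname=host)   # TLS negotiation
t3 = time.perf_counter()
tls.close()

print(f"DNS lookup : {(t1 - t0) * 1000:.1f} ms")
print(f"TCP connect: {(t2 - t1) * 1000:.1f} ms")
print(f"TLS setup  : {(t3 - t2) * 1000:.1f} ms")
```

Each of these phases takes at least one round trip to the server, which is why physical distance to the data centre shows up directly in the numbers.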

4.9 billion people used the internet in 2021. The power consumption of a single data centre is equivalent to that of thousands of homes, but without question data centres are a necessity. Increasing numbers of devices with more processing power and data storage, plus the migration from 4G to 5G, mean the internet would not function without them. But what is the solution for reducing their impact on the environment?

A data centre could potentially have a 25-30 MW capacity and consume around 30 GWh of energy per year; that is roughly the annual electricity consumption of 8,000-10,000 homes. It’s not an exact science of course, the amount of energy consumed depends on many variables, but you get the picture.
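As a back-of-envelope check, assuming a typical household uses roughly 3.5 MWh of electricity per year (an assumed average, not a figure from the article):

```python
annual_dc_energy_gwh = 30   # assumed data centre consumption from above
home_usage_mwh = 3.5        # assumed average household consumption per year

homes_equivalent = annual_dc_energy_gwh * 1000 / home_usage_mwh
print(f"~{homes_equivalent:,.0f} homes")   # roughly 8,500 homes
```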

Unless controls or regulations are implemented, this exponential growth will continue. You will not be surprised to learn that there are solutions out there. Heat can be captured and re-used, diverted into projects such as greenhouses, leisure centres and housing developments. A real-world example is Amazon’s Seattle headquarters, which reuses waste heat from the neighbouring Westin Building Exchange.

But this means further investment, collaboration and partnership. Such projects can bring rewards to local communities, as well as the planet, as the example above highlights. Perhaps this will become the rule rather than the exception?

Liquid cooling of microchips has existed for many years on a smaller scale, often co-located within air-cooled data centres. Servers are stripped back and immersed in a dielectric fluid inside a sealed unit, where they are held at a constant temperature. The fluid can be cycled through a heat exchanger and used for external projects, or simply recirculated back into the units.

PeaSoup.cloud, an expert in liquid cooling, offers a range of data storage options such as full back-up and recovery and cloud-based infrastructure, and is 100% sustainable. Martin Bradburn, PeaSoup CEO, explained: “Liquid immersion cooling allows data to be stored with greater stability in a smaller space with greater efficiency when pitted against air-cooled solutions. The efficiencies are beneficial for all stakeholders – cheaper and cleaner for users, companies, and the planet. Not a traditional stakeholder but liquid immersion cooling is a huge benefit to mother nature, and by extension everyone else.”

Technological advancements have made liquid cooling a wiser investment: it is easily scalable, affordable once the start-up costs are met, and it reduces carbon emissions by up to 45% compared with air-cooled data centres. Power usage effectiveness (PUE), the ratio of a facility’s total energy use to the energy consumed by its IT equipment, is a critical metric for any data centre. Liquid immersion cooling achieves a PUE of around 1.10 to 1.15; air cooling achieves 1.5 at best.
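To show what that PUE gap means in practice, here is a small sketch using the 25 MW IT load assumed earlier in the article; the exact figures are illustrative only.

```python
# PUE = total facility energy / IT equipment energy,
# so cooling and power overhead is (PUE - 1) times the IT load.
it_load_mw = 25  # assumed IT load from the article's example data centre

for label, pue in [("air-cooled", 1.5), ("liquid immersion", 1.12)]:
    overhead_mw = it_load_mw * (pue - 1)
    print(f"{label:>16}: PUE {pue:.2f} -> {overhead_mw:.1f} MW of overhead")
```

On those assumptions the air-cooled facility burns around 12.5 MW just on overhead, while the immersion-cooled one needs about 3 MW.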

From a pure physics point of view, liquid is far more efficient at removing heat than air. Direct contact with the dielectric liquid allows higher wattage for CPUs and GPUs per stack, cabinet or row. For a sustainable alternative to air-cooled data centres, and a greener data centre future, liquid immersion cooling is a huge step in the right direction for the environment.
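To put that physics point in rough perspective, the comparison below uses textbook ballpark values for volumetric heat capacity; the dielectric figure is an assumed typical value, not a PeaSoup specification.

```python
# Approximate volumetric heat capacities in kJ per cubic metre per kelvin.
air = 1.2
dielectric_fluid = 1_700   # assumed ballpark for common immersion coolants
water = 4_180

print(f"dielectric vs air: ~{dielectric_fluid / air:,.0f}x more heat per unit volume")
print(f"water vs air     : ~{water / air:,.0f}x more heat per unit volume")
```

In other words, a litre of coolant can carry away on the order of a thousand times more heat than a litre of air, which is why far less fluid movement is needed to keep the same hardware cool.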
