Covid-19 has changed the way the world works. To begin with, this was through necessity: offices and enclosed workspaces of all kinds were potential hotbeds for the virus, which was spreading across the globe with terrifying tenacity. But as we move into the final quarter of 2021, with various vaccines rolling out at various speeds internationally and public spaces tentatively reopening under a variety of restrictions and regulations, countless offices and workspaces remain closed for the foreseeable future. Many more are operating at massively reduced hours, with staff coming in at staged intervals on different days to avoid unnecessary contact.
All these changes have driven massive growth in consumer use of data-intensive cloud services since the start of 2020. Video conferencing tools like Microsoft Teams, Zoom, and Google Meet, along with video streaming services and online gaming platforms, have all seen enormous user growth over the course of the pandemic and its ensuing lockdowns as vast numbers of people moved their personal and professional lives into their homes full time.
All of which is to say that the global demand for computing power offered by data centres has reached soaring heights and is continuing to grow rapidly.
And data centres need water. A lot of water.
Weighing data needs against sustainability costs
In the United States, a typical data centre uses around three to five million gallons of water every single day to keep internal temperatures within operational limits for the massive computational loads it handles. The higher the rate of data-intensive activity a facility is required to process at any given time, the more water is used to maintain those standards. Annually, this can add up to almost two billion gallons of water for an individual data centre, equivalent to the amount used by a city of 30,000–50,000 people.
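A quick back-of-the-envelope check shows how those figures hang together. The per-capita figure below is an assumption based on common US estimates of roughly 100 gallons per person per day for residential use; it is not a number taken from any one source.

```python
# Back-of-the-envelope check of the water figures cited above.
gallons_per_day_low = 3_000_000   # lower bound for a typical US data centre
gallons_per_day_high = 5_000_000  # upper bound

annual_low = gallons_per_day_low * 365    # just over a billion gallons/year
annual_high = gallons_per_day_high * 365  # just under two billion gallons/year

# Assumed per-capita residential use: ~100 gallons/person/day
# (a common US estimate; an assumption, not a figure from this article).
people_equivalent = annual_high / (100 * 365)

print(f"Annual use: {annual_low / 1e9:.1f}-{annual_high / 1e9:.1f} billion gallons")
print(f"Roughly a city of {people_equivalent:,.0f} people")
```

At the upper bound, the arithmetic lands almost exactly on the 50,000-person city the industry comparison describes.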
Water footprints of this size would clearly have an impact wherever they were generated, but with companies like Google, Facebook, Amazon and Microsoft all either currently operating or planning data centres in water-scarce regions like Texas, Arizona and California in the US, and with data centres already operating in similarly dry regions internationally, there is a real and growing concern that competition for water supply with local communities in these areas will grow and calcify over the coming decades.
“The concern about large volumes of water usage in data centre cooling comes largely from areas where water is in short supply and natural water resources such as aquifers are being depleted by a number of requirements,” Mark Acton, critical support director at Future-tech, a specialist in data centre engineering, investment, design, project management and operations services, explains.
“This is particularly in some US states where there is increasing concern over aquifer depletion which may take thousands of years to recover. The fact that cooling using evaporative cooling towers, which consume large volumes of water, has historically been probably the most prevalent type of cooling in US data centres exacerbates this problem significantly. The use of cleaned and filtered water fit for human consumption is also wasteful due to the energy and materials spent on producing water of this quality.”
On top of this, a general lack of understanding about the methods and scale of water resource use in data centres is threatening to create a culture of unfamiliarity around the exact nature of the issues at hand.
As Acton puts it, “Governments, NGOs and the media in particular need to better understand what truly constitutes data centre energy and resource efficiency. If the industry is to become truly sustainable this understanding must improve, and consideration must be given to the provision of digital services generally rather than merely the overheads that data centres may impose.”
The complex draw of water-scarce regions
What is it that draws big data and processing facilities to areas of water-scarcity in the first place? Why, when regions rich in the kind of clean water so valuable to data centre processes exist in abundance, are we seeing ever more ambitious plans being debated in regional governments for data centre developments in some of the world’s most arid regions?
“The real draw is the fact that drier air, such as that found in deserts, is easier to cool than humid air,” Ernest Sampera, co-founder and CMO of data centre solutions firm vXchnge, says. “Data centres are drawn to locations that offer reliable power sources – preferably renewable such as solar that would be abundant in the arid regions – and stable climates.”
This presents the industry with something of a double-edged sword. Facilities are able to make use of the climate and corresponding weather conditions in dry, hot areas to maintain high efficiency in water-cooling their resources, but these regions offer strictly limited usable water reserves to begin with, and often already overcrowded local demands on those reserves. So how does the industry move forward, knowing that future demand for data centre processing is only going to continue growing rapidly across the globe?
“Data centre cooling not only needs to be reliable but effective, and for extremely long periods of time lest businesses descend into chaos. To avoid that potential outcome, data centre cooling solutions need to get bigger, more effective, and much cheaper,” Sampera suggests.
But in a world already feeling the urgent strain of climate change, and where there is a growing awareness of the impact water access and distribution will have on human lives as temperatures begin to fluctuate ever more dramatically and consistently, is it enough to rely on hopes of more for less?
If so, where does the responsibility for this theoretical development lie?
Regulation, standardisation, and oversight
In January of this year, the Copenhagen Centre on Energy Efficiency released a brief outlining two new liquid-cooling techniques being employed in China by regional data and tech powerhouses Tencent and Alibaba.
The brief notes that ‘liquid cooling still faces many challenges in the development process. There is an urgent need to promote the development of technology and industry by strengthening industry guidance, standardising the evaluation system, and improving the industrial ecosystem, among other measures.’
The two liquid-cooling techniques are then outlined – these being refined versions of immersion liquid cooling and cold-plate liquid cooling – with particular attention being drawn to both systems’ flexibility and speed of implementation, as well as the significantly reduced overall levels of energy consumption each method affords over similar processes utilised around the rest of the world. All of this, the brief notes, comes as a direct consequence of a strict and far-reaching Chinese governmental policy to regulate energy consumption as overseen by the Ministry of Industry and Information Technology (MIIT).
Alongside this, a system of data centre sustainability ratings was introduced by a number of industry standard organisations, such as the Open Data Centre Committee (ODCC) and the Green Grid Committee (TGGC), to regulate the accepted requirements for present and future operational sustainability across the country.
The combined impact of these industry oversights, which have now been in effect for almost a decade, is that PUE (Power Usage Effectiveness) standards have risen sharply.
The brief outlines a number of the ongoing improvements at major data centres that have come about as a direct consequence of the industry standards, including the Tencent Qingpu Trigeneration Data Centre, which adopted technologies such as natural-gas trigeneration, centrifugal frequency conversion chillers, and magnetic levitation chillers to achieve an annual PUE of 1.31; and the Alibaba Winter Olympics Cloud Data Centre, which employed non-elevated floor-diffused air supply and fully automated building automation systems to maximise the use of natural cooling sources, achieving an annual PUE of 1.23.
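PUE itself is a simple ratio: total facility energy divided by the energy actually delivered to IT equipment, with 1.0 as the theoretical ideal. A minimal sketch in Python, using hypothetical kWh figures chosen to reproduce a PUE of 1.31 like the Tencent facility's:

```python
def pue(total_facility_energy_kwh: float, it_equipment_energy_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy over IT equipment energy.

    A PUE of 1.0 would mean every kilowatt-hour goes to computing;
    everything above 1.0 is overhead (cooling, power distribution, etc.).
    """
    if it_equipment_energy_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_energy_kwh / it_equipment_energy_kwh

# Hypothetical figures: 13,100 kWh drawn in total, with 10,000 kWh of it
# reaching the servers, gives a PUE of 1.31 -- i.e. 31% overhead.
print(pue(13_100, 10_000))
```

Seen this way, the drop from typical legacy figures towards 1.2-1.3 represents a substantial cut in the cooling and distribution overhead per unit of useful computing.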
Finite capacity vs. exponential need
So, is government and NGO oversight alone enough to guarantee the long-term sustainability of the data centre industry worldwide? And can we be sure that broad adoption of metrics like PUE can occur without them becoming little more than tools for greenwashing data centre water usage while we wait for gradual innovation to pick up the slack?
“There are lots of different technologies that allow us to recapture water and reuse it. But fundamentally they add expense, and that water is consumed and becomes tied up,” David Craig, CEO of cooling solutions firm Iceotope, says. “The industry has focused on this and is developing alternatives, but they only go so far, large amounts of water are still consumed. Just because your technology has recapture and reuse capacity does not mean you are consuming no water.”
The complicated nature of the situation is one that Anna Merloni, EMEA communications manager at digital infrastructure provider Vertiv, is keen to pick up on. “With an eye on the UN 2030 Agenda, water usage is one of myriad points to work on,” she explains. “It is a complex balance of energy efficiency, economic sustainability, water saving, social responsibility and environmental sustainability. Everything is connected.
“Each data centre should be clearly classified in terms of sustainability. Proper metrics must be defined as well as all corrective actions to make plants more and more efficient, sustainable, and carbon neutral. A clear and objective classification is the starting point for all data centre operators to take their own responsibility and help drive a green digital economy.”
On paper it makes sense to place the responsibility for adapting to changing global water demands in the hands of the companies responsible for data centres, and there can be no doubt that meeting these challenges represents good business sense. There is nothing to be gained from annihilating the water tables on which they and so many others rely. So the question becomes one of capacity rather than intent: with existing infrastructure, can we ensure that data centre water consumption is and remains ecologically future-proof?
Are we ready for the requirements of tomorrow?
The view from the other side
In the concluding analysis of the Copenhagen Centre on Energy Efficiency’s brief on data centre development and oversight in China, it is noted that the issue of water consumption and liquid cooling will need to be confronted in a two-stage process if it is to be done with any assurance of maintaining true water neutrality: adaptation of existing infrastructure in the short term, coupled with a complete overhaul of standards on any and all future technologies in the field.
It states that there is an urgent need for relevant industry-standard organisations to manage the standardisation of liquid cooling, establishing clear technical requirements for all aspects of liquids, power supply and distribution, control, security, and operations and maintenance to promote the positive development of liquid-cooling technology.
This is a belief echoed across most of the marketplace.
“Water is a limited resource for the planet. We can’t just use salt water, so what are the things we can do?” Craig adds. “In the short and medium-term, there is going to have to be work done to convert existing data centres for better capture and better minimisation of the inefficiency that sits within them. But as we move forward into the medium and long term, we are going to have to build data centres from the ground up, which are water efficient.”
It would seem, when everything is weighed up, that the way ahead for the industry to ensure it meets its rightly lofty goals of true water neutrality is clear. What remains to be determined is whether or not the will and organisational capacity exist to match the rapidly developing technical improvements that are redefining the limits of what water cooling is capable of.
As Sampera puts it, “Workloads in data centres are only going to get more intensive. Cryptocurrency and blockchain workloads, for instance, have already proven that and as server form factors become denser and able to do more within smaller designs, it’ll be important to continue innovating.”
After all, only between one and two per cent of the water on Earth is usable, by humans and by data centres. One way or another, we will need to learn to share.