As Europe lurches towards another lockdown, the vital nature of the digital infrastructure backbone is becoming ever more apparent. With working from home becoming the new norm, the advent of greater mobile streaming with 5G, the continued evolution of artificial intelligence (AI) and big data in industry, and the ongoing growth of social media, the use of data will only continue to grow.
Several research and innovation challenges are still to be solved to deliver the potential of digital technologies in achieving a net zero economy and society. These include better integrating energy and digital systems, prototyping data infrastructure, developing trustworthy digital systems, green computing, enabling nature-based and engineering-based mitigation approaches, understanding the drivers influencing societal transformation, and fairly distributing the costs and benefits of the transition to net zero.
Data, in conjunction with technologies such as artificial intelligence and digital twins, should be at the heart of the net zero transition. Achieving the promise of this data-led net zero transition will require an ambitious, collaborative and challenge-led research and innovation effort.
As we enter a new year it is customary to look back on the achievements of the past 12 months, and here we highlight three exemplars, amongst many, within the data centre sector: climate pledges, data centre cooling, and high-performance computing.
Microsoft and the carbon pledge
One of the most ambitious climate pledges in the technology sector so far comes from Microsoft, which announced that by 2030 it would be carbon negative, and that by 2050 it would remove from the environment all the carbon the company has emitted, either directly or through electricity consumption, since it was founded in 1975. Importantly, the pledge to become carbon negative by 2030 includes supply chain emissions (Scope 3), which represent a large share of the company’s emissions. For 2020, Microsoft expects to emit 100,000 metric tons of Scope 1 carbon, four million tons of Scope 2 carbon, and 12 million tons of Scope 3 carbon.
To become carbon negative by 2030, Microsoft pledges to reduce emissions across its business by more than half and plans to remove more carbon than it emits annually as a company. In support of this, Microsoft announced a $1 billion Climate Innovation Fund to accelerate the global development of carbon reduction, capture, and removal technologies. To incentivise change across all parts of its business, Microsoft introduced an internal carbon tax in 2012, increased it to $15 per metric ton of carbon in 2019, and expanded it from Scope 1 to Scopes 2 and 3 in 2020. From 2021, the company also plans to make carbon reduction an explicit aspect of its supply chain procurement processes.
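The scale of that internal carbon fee can be sketched with some back-of-the-envelope arithmetic, applying the $15-per-ton rate across the 2020 emissions estimates quoted above. This is purely illustrative: how Microsoft actually levies the fee across business units is not described here.

```python
# Illustrative arithmetic only: applies the quoted $15/ton internal fee
# to the 2020 emissions estimates from the text. Not Microsoft's actual
# internal accounting methodology.

FEE_PER_TON_USD = 15  # internal carbon tax rate from 2019 onwards

emissions_tons = {
    "scope_1": 100_000,      # direct emissions
    "scope_2": 4_000_000,    # purchased electricity
    "scope_3": 12_000_000,   # supply chain
}

total_tons = sum(emissions_tons.values())
total_fee = total_tons * FEE_PER_TON_USD

print(f"Total emissions: {total_tons:,} t")   # 16,100,000 t
print(f"Implied fee:     ${total_fee:,}")     # $241,500,000
```

The point of such a fee is that each business unit carries the cost of its own emissions, making carbon reduction a budget line rather than an abstraction.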
DeepMind and the cooling of data centres
Data centres, facilities comprising rows upon rows of servers, generate a lot of heat. Over the last decade there have been several improvements in making the cooling of data centres more energy efficient. In 2016, DeepMind achieved a step change by using machine learning and AI systems to manage the cooling of Google’s data centres, helping save up to 40 per cent of the energy needed for cooling. To achieve this, DeepMind researchers used historical data from a data centre – such as temperature, power, and pump speeds – to train an ensemble of deep neural networks. In doing so they trained the networks to optimise the ratio of total building energy usage to IT energy usage (also known as Power Usage Effectiveness, or PUE).
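The PUE metric described above is a simple ratio, which makes the effect of a cooling-energy saving easy to see. The sketch below uses made-up illustrative figures, not real data-centre readings, and assumes for simplicity that cooling is the only non-IT load.

```python
# Minimal sketch of Power Usage Effectiveness (PUE): total facility
# energy divided by IT equipment energy. A perfect facility scores 1.0.
# Figures below are illustrative, not real measurements.

def pue(total_facility_energy_kwh: float, it_energy_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy."""
    return total_facility_energy_kwh / it_energy_kwh

# 1,000 kWh of IT load plus 500 kWh of cooling/overhead.
before = pue(total_facility_energy_kwh=1_500.0, it_energy_kwh=1_000.0)

# A 40% cut in cooling energy reduces the 500 kWh overhead to 300 kWh.
after = pue(total_facility_energy_kwh=1_300.0, it_energy_kwh=1_000.0)

print(f"PUE before: {before:.2f}")  # 1.50
print(f"PUE after:  {after:.2f}")   # 1.30
```

Because IT load stays constant, any reduction in cooling energy translates directly into a lower PUE, which is why the metric is a natural optimisation target for a control system.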
Google then successfully implemented AI control systems that can operate with minimum supervision to regulate the heating and cooling of its data centres. It achieved this level of autonomy by constraining the system’s optimisation boundaries to a narrower operating regime, thus prioritising safety and reliability. These autonomous AI control systems can learn from data and improve over time. They came up with unexpected solutions, such as taking advantage of winter conditions to produce colder than usual water.
The world’s most powerful and efficient supercomputer
The world’s most powerful supercomputer is also world-leading in terms of energy efficiency, and it was set up to deliver societal value. Based at the RIKEN Center for Computational Science in Kobe, Japan, the Arm-based Fujitsu Fugaku was developed to address high-priority social and scientific issues as part of Japan’s Society 5.0 plan. The computer is due to begin full operation this year, with application areas including weather and climate forecasting; energy creation, storage and use; the development of clean energy; and new materials development.
The Arm-powered supercomputer ranked as the most efficient in the world in November 2019, achieving 16.9 GFlops/Watt power efficiency. In June 2020, it also became the world’s most powerful supercomputer, delivering 415.5 petaflops – 2.8 times the performance of the second-ranked system, IBM’s Summit, on the same benchmark. Fugaku’s reported peak performance is over 1,000 petaflops (1 exaflops).
To create a future in which the power of digital technologies is harnessed to support global wellbeing and human flourishing, action is needed to support the development and deployment of digitally enabled solutions to urgently reduce emissions at scale. This will require integrating data and digital technologies into decision-making processes to better manage systems, as well as developing trustworthy technologies to support decision-making and providing guidance on their implementation.
To develop the core digital capabilities for net zero and new applications of digital technologies to a range of sectors, it will be necessary to develop stakeholder-led solutions to key challenges, highlight areas of opportunity and direct funding to them.