The Royal Society report ‘Digital Technology and the Planet: Harnessing computing to achieve net zero’ sets out a roadmap for maximising the role of data and digital technologies in building a low-carbon economy and a green recovery from COVID-19. According to the report, a key part of that roadmap is green computing, achieved through what it terms proportionate computing.
Digital technologies rely on a complex infrastructure of cables, fibres, computers, data centres, routers, servers, repeaters, satellites and radio masts. Building and operating these systems requires energy, and if computing systems are to be widely deployed as digital infrastructure for managing activities across sectors, their energy demands and emissions will need to be understood and managed.
“The digital infrastructure is growing, as is demand for computing,” says Professor Andy Hopper, chair of the Digital Technology and the Planet working group and vice-president of the Royal Society. “While recent attempts to estimate the carbon footprint of the internet have prompted headlines about the emissions generated by each daily digital interaction, there is a wide range of such estimates, and it can be challenging to calculate the extent to which emissions from digital technologies present a challenge to overall efforts to achieve net zero.
“The development of data-driven systems that truly contribute to net zero will require better evidence and guidance about their own footprints, including all emissions scopes, and their energy proportionality, as well as new approaches to lower their energy consumption and integrate renewable energy sources more fully.”
Understanding the impact of digital technology
Recent years have seen growing interest in the amount of carbon generated by digital technologies and the extent to which this poses a threat to sustainability efforts. While attracting significant media attention, studies in this area have presented a range of different estimates of the contributions to global emissions made by digital technologies, with these varying from 1.4 to 5.9 per cent of global greenhouse gas emissions. As a comparison, maritime transport is responsible for about 2.5 per cent of global emissions.
Studies estimate that emissions from digital technologies come largely from the electricity they consume while in use, and in smaller but still substantial part from ‘embodied’ emissions incurred across their life cycle, from raw material extraction through manufacturing and distribution to end-of-use recycling and disposal. When considering user devices only, such as smartphones and laptops, the share of embodied emissions reaches approximately 50 per cent. This is because, compared with networks and servers, user devices consume relatively little power in their use phase, being switched on for only part of the day, yet they are replaced frequently, smartphones especially.
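As a rough illustration of why the embodied share is so high for frequently replaced handsets, consider a back-of-envelope sketch; every figure below is an assumption chosen for illustration, not a number from the report:

```python
# Back-of-envelope: embodied vs use-phase emissions for one smartphone.
# Every figure here is an illustrative assumption, not a number from the report.

embodied_kg_co2e = 60.0       # assumed embodied emissions: raw materials,
                              # manufacture, distribution, end-of-life
avg_power_w = 1.0             # assumed average power draw while in use
hours_per_day = 4.0           # assumed daily active use
lifetime_years = 2.5          # assumed replacement cycle
grid_kg_co2e_per_kwh = 0.3    # assumed grid carbon intensity

use_kwh = avg_power_w / 1000 * hours_per_day * 365 * lifetime_years
use_kg_co2e = use_kwh * grid_kg_co2e_per_kwh
total = embodied_kg_co2e + use_kg_co2e

print(f"use phase: {use_kg_co2e:.1f} kgCO2e ({use_kg_co2e / total:.0%})")
print(f"embodied:  {embodied_kg_co2e:.1f} kgCO2e ({embodied_kg_co2e / total:.0%})")
```

Under assumptions like these, the embodied share dominates for a low-power, short-lived smartphone; averaged across all user devices, including higher-power laptops and monitors, it falls towards the roughly 50 per cent cited above.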
Several tech companies have recognised the scale of the issue and taken steps towards a more circular tech industry, pledging to reuse equipment and generate less waste. Google, for example, has partnered with the Ellen MacArthur Foundation to circularise its business, while Fairphone produces modular mobile phones with recyclable parts. One of the largest technology renewal centres in the world is run by HPE in Erskine, Scotland.
“All tech companies and manufacturers need to further promote and implement best practice towards reducing the emissions associated with the manufacture of digital technology,” Hopper adds. Individuals also have a part to play, as does policy: the European Commission, for example, facilitated an agreement among major mobile phone manufacturers to adopt a common charger based on the micro-USB connector, thus reducing e-waste.
Energy used during computing
Energy demands from the operation of digital technology come both from the power consumption of local devices and from the electricity needs of the infrastructures underpinning the internet – networks and data centres. According to a report published earlier this year by the International Energy Agency, ‘Data centres and data transmission networks’, data transmission networks represented about one per cent of global electricity use in 2019 (around 250 TWh), with mobile networks accounting for two-thirds of this.
Data centres, where most of the planet’s servers are concentrated, host internet platforms and process and store much of the data generated in everyday activities. These centres account for a large proportion of the electricity consumed by computing. According to a report in the journal Science, global data centre electricity demand in 2019 was estimated at around 200 TWh, or around 0.8 per cent of global electricity demand.
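A quick consistency check of the figures quoted above (the Science estimate for data centres and the IEA estimate for networks) takes only simple arithmetic; the sketch below uses no numbers beyond those already cited:

```python
# Consistency check on the cited figures: if data centres drew ~200 TWh in 2019
# and that was ~0.8 per cent of global electricity demand, global demand was:
data_centre_twh = 200.0
data_centre_share = 0.008
global_twh = data_centre_twh / data_centre_share    # = 25,000 TWh

# The IEA's ~250 TWh for data transmission networks is then about one per cent
# of that total, matching the figure quoted above.
network_twh = 250.0
print(f"implied global demand: {global_twh:,.0f} TWh")
print(f"network share: {network_twh / global_twh:.1%}")   # 1.0%
```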
“To achieve net zero, the tech sector, alongside most other sectors of the economy, will need to achieve actual zero emissions, rather than resort to offsetting schemes,” Hopper continues. “There are already examples of companies and collaborative initiatives that are setting voluntary efficiency and emissions targets. In February 2020, the tech and telecommunications industry agreed a target to reduce greenhouse gas emissions by 45 per cent between 2020 and 2030, while tech giants have recently made bold pledges about their respective carbon footprints.”
However, assessing the progress of the sector towards these targets will require access to good quality, reliable data about its emissions and energy use. The relatively few studies available about the energy consumption of digital systems reveal variations in the sources of data used, with some studies relying on private, unpublished data.
A range of technology advances has enabled data services to continue growing while the energy consumption of data centres has remained roughly constant. Promising new research directions point to ways in which the energy demands of digital systems could be reduced further as demand for their use continues to rise. Changes in personal behaviour are among the market forces that can shape this demand.
Energy-efficient computing
Digital technology equipment has become increasingly energy efficient. Progress in hardware has meant that the number of transistors in a dense integrated circuit has doubled about every two years, an empirical observation known as Moore’s law. This has allowed the telecommunications and tech industries to increase chips’ performance and speed exponentially without corresponding increases in their power consumption.
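Moore’s law can be written as a simple growth rule, N(t) = N0 × 2^((t − t0)/2). The sketch below is illustrative only; the 1971 starting point (Intel’s 4004, about 2,300 transistors) is an assumption used to show that the rule reproduces today’s order of magnitude:

```python
# Moore's law as an empirical growth rule: transistor counts double roughly
# every two years, i.e. N(t) = N0 * 2 ** ((t - t0) / 2).
def transistors(n0: float, t0: int, t: int, doubling_years: float = 2.0) -> float:
    """Projected transistor count in year t, given n0 transistors in year t0."""
    return n0 * 2 ** ((t - t0) / doubling_years)

# Illustrative check: starting from the ~2,300 transistors of Intel's 4004 (1971),
# the rule projects into the tens of billions by 2020 -- the right order of
# magnitude for today's largest chips.
print(f"{transistors(2300, 1971, 2020):,.0f}")
```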
Research into new types of low-power processor is needed to continue pushing the limits of hardware energy efficiency. Recent developments include brain-inspired neuromorphic hardware and new AI accelerator chips, such as Apple’s A14 chip, which is claimed to perform machine learning calculations up to ten times faster.
“Organisations have achieved further energy efficiencies in data infrastructures by moving their data storage and processing from local servers to the public cloud,” Hopper explains. “Moving computing to the cloud has allowed more efficient patterns of server use. Centralisation of these servers allows more effective management, with the servers’ load being optimised so that they do not consume energy while idle. Illustrating the scale of the issue, a 2017 survey of 16,000 enterprise servers revealed that a quarter of them were entirely idle, consuming energy but performing no useful computing operation.
“However, the best on-premises data centres can currently be as good as the average public cloud provider, with utilisation levels on the cloud reaching only 40 per cent on average versus 30 per cent on-premises – suggesting there is significant room for improvement.”
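The waste from idle servers follows from the fact that a server draws a large fraction of its peak power even at zero load. A minimal sketch using a common linear power model makes the point; the idle and peak wattages below are assumptions for illustration, not measurements from the survey:

```python
# A simple linear power model for a server: it draws a large fraction of peak
# power even when idle, so energy per unit of useful work falls as utilisation
# rises. Wattages are illustrative assumptions, not measurements.
P_IDLE_W = 200.0    # assumed draw at 0% utilisation
P_PEAK_W = 400.0    # assumed draw at 100% utilisation

def power_w(utilisation: float) -> float:
    """Power draw under a linear utilisation model."""
    return P_IDLE_W + (P_PEAK_W - P_IDLE_W) * utilisation

for u in (0.30, 0.40, 1.00):
    print(f"utilisation {u:.0%}: {power_w(u):.0f} W, "
          f"{power_w(u) / u:.0f} W per unit of work")
# An idle server (u = 0) draws 200 W while doing no useful work at all --
# the 'quarter of servers entirely idle' case from the 2017 survey.
```

Under this model, raising utilisation from 30 to 40 per cent cuts the energy spent per unit of useful work by roughly a fifth, which is why consolidating load onto fewer, busier servers pays off.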
Bitcoin provides an example of a technology that is not energy proportionate, and a cautionary case to bear in mind when considering other potential technology applications and their associated externalities. It illustrates how rebound effects can occur: the incentives driving Bitcoin mining have led to the manufacture, purchase and disposal of specialist hardware and to an escalation in energy use.
The next generation of supercomputer systems, referred to as exascale computing systems, is also anticipated to enable advances in many application domains, such as climate mitigation and adaptation. Based on technology available ten years ago, scaling systems to the exaflop level would have consumed more than a gigawatt of power, roughly the output of 400 onshore wind turbines. But supercomputers have developed to consume far less energy. The world’s most powerful supercomputer in June 2020, the Arm-based Fujitsu Fugaku, had a reported peak power consumption of 30 MW, roughly the output of 12 onshore wind turbines.
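Putting the two power figures quoted above side by side shows the scale of the improvement; the only added assumption is a nominal 2.5 MW per onshore wind turbine, the figure implied by the article’s own comparisons:

```python
# Comparing the two power figures in the text: an exaflop-scale design on
# decade-old technology (> 1 GW) versus Fugaku's reported peak draw (~30 MW).
OLD_DESIGN_W = 1e9       # more than a gigawatt, per the estimate above
FUGAKU_W = 30e6          # ~30 MW reported peak power
TURBINE_W = 2.5e6        # assumed output of one onshore wind turbine

print(f"power reduction: ~{OLD_DESIGN_W / FUGAKU_W:.0f}x")           # ~33x
print(f"turbine equivalents: {OLD_DESIGN_W / TURBINE_W:.0f} vs "
      f"{FUGAKU_W / TURBINE_W:.0f}")                                  # 400 vs 12
```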
Quantum computing, an emerging form of computing that harnesses quantum mechanics, is often said to promise a step-change in the complexity and speed of calculations, well beyond those of conventional computers. New developments could mean that the technology becomes more energy efficient than conventional supercomputers in the long run. However, quantum computers are still in their infancy and highly unlikely to become widely used in the decades leading to 2050. So far, their operation has been limited and impractical, requiring the cooling of their circuits to ultra-low temperatures at an extremely high energy cost.
While computing infrastructure can make use of renewables, the continued expansion of renewable power assets will take time and capital. This reinforces the need for energy proportionate computing.