The buzz from the COP26 climate conference, held in Glasgow in November, can still be heard, and it’s little wonder that IT infrastructure - in particular the cloud and data centre sector - featured prominently in the conversation.
Let’s take a look at some of the steps being taken by cloud providers and data centre operators to build towards a more sustainable future. We’d love to hear which of these steps matter most to you in meeting the challenge.
A CHALLENGE OF HYPERSCALE PROPORTIONS
The explosive adoption of public cloud and data centre infrastructure shows no signs of slowing down, with facilities reaching occupancy saturation almost as fast as they can be built. The world of business is hungry for services, storage, and computing power, and in turn those sheds full of servers are hungry for energy.
It should come as no surprise that the data centre is one of the largest contributors to enterprise carbon emissions today.
In fact, for many organisations it is the largest, and it will continue to be so. As a result, understanding the environmental impact of IT is becoming a critical C-suite activity, and managing that energy consumption typically falls within the CIO’s remit.
An oft-referenced white paper published earlier this year by IT services firm DXC suggests that data centres are responsible for 2% of the world’s CO2 emissions - roughly equivalent to the total emissions of the global airline industry.
That figure is forecast to climb to 3.5% by 2025, by which time those same data centres could also be consuming up to one-fifth of all power produced globally.
These big barns full of servers are problematic for several reasons. Whether they’re greenfield (a new build) or brownfield (a redevelopment of an existing site), their large footprint and highly specific requirements see them pushed out to rural locations and make them disruptive to the local environment.
The size of so-called hyperscale data centres is such that engineers travel their endless corridors by scooter, while the plumbing-in of power, water, and network connectivity adds further environmental impact through the need for yet more fixed infrastructure.
Once populated and switched on, racks upon racks of servers take an inordinate amount of electricity to run - electricity that is still largely generated from fossil fuels. A study in Nature estimated that, globally, data centres use between 200 terawatt hours (TWh) and 500 TWh annually. This is more than the entire energy consumption of some countries.
Then, of course, they generate heat. Heat is bad for servers, so they need to be cooled. Alongside air conditioning, water cooling is one of the most popular methods for dissipating that heat, and traditionally this has been done wastefully, by evaporating the water as it is used.
This in turn requires huge amounts of fresh water to be pumped in - something that is not always easy, or even ethical.
THE PATH TO SUSTAINABILITY
Unpredictable global events like Covid have demonstrated just how reliant the world economy is on cloud and data centre infrastructure and, if anything, have further accelerated adoption.
Fortunately, data centre operators are wise to the fact that this path is unsustainable, and understand that real change is required to tackle the climate emergency.
After all, the problem is circular: the warmer the globe gets, the more effort is required to cool data centres, which in turn produces more emissions.
This has prompted all of the big cloud providers - Amazon, Microsoft, and Google among them - to commit to reducing the carbon footprints of their IT infrastructure.
Amazon Web Services (AWS), for example, is set to power its operations with 100% renewable energy by 2025. Google, meanwhile, is carbon neutral today, but aims to run all of its data centres on carbon-free energy by 2030.
Such examples are found industry-wide, where there is a concerted push towards carbon neutrality.
Following an EU announcement in February 2020 that data centres can and should be carbon neutral by 2030, and that regulation could be in the offing, the industry - cloud and data centre operators alike - set up the Climate Neutral Data Centre Pact (CNDCP) in January 2021. It is a self-regulatory initiative that reports to the European Commission on a regular basis.
But beyond heading off anticipated regulation, reducing the carbon footprint of data centres serves another purpose - securing a continued presence in key markets.
In late November, Ireland’s Commission for Regulation of Utilities (CRU) said that new measures would be put in place to curb the energy impact of data centres on the national grid. There had been some suggestion of an outright ban on data centres in the country, so the news was seen as a win for the industry.
Over the last four years, Irish grid operator EirGrid says it has seen demand from data centres alone rise by around 600 GWh a year - equivalent to adding 140,000 new households to the grid annually.
The numbers are intimidating, but there is some light amid the doom and gloom. Another study published in Nature, and updated earlier this year, found that while data centre workloads increased six-fold between 2010 and 2018, their collective energy consumption did not rise to anything like the same degree - in fact, it grew by only 6%.
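As a quick back-of-the-envelope check on what those figures imply, here is a minimal, purely illustrative calculation using only the numbers quoted above:

```python
# Back-of-the-envelope check on the figures quoted above (illustrative only).
workload_growth = 6.0   # 2018 workloads relative to 2010 (a six-fold increase)
energy_growth = 1.06    # 2018 energy consumption relative to 2010 (+6%)

# Energy used per unit of compute in 2018, relative to 2010
energy_per_workload = energy_growth / workload_growth
print(f"Energy per unit of compute: {energy_per_workload:.2f}x the 2010 level")
print(f"Implied per-workload efficiency gain: {1 - energy_per_workload:.0%}")
# => roughly 0.18x, i.e. an efficiency improvement of over 80% per workload
```

In other words, the energy consumed per unit of compute fell by more than 80% over the period.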
This phenomenon was credited to technological advances that improve the efficiency of data centre operations, as well as advances in green initiatives, such as energy conservation.
According to the International Energy Agency (IEA), compared to other industries, data centres have generally become more energy-efficient.
"Decarbonising aviation and heavy industry is challenging because many of the technologies we need are not yet commercially available. Decarbonising data centres is easier: power them with clean electricity and continue to push energy efficiency through policy, investment and innovation," the IEA said.
OPTIMISING AND REDUCING ENERGY USE IN DATA CENTRES
Technological and economic advancements, from economies of hyperscale to new methods of cooling, are all helping to reduce the impact of these IT infrastructure giants.
Some of the main innovations include:
- Location: As well as efficiency advancements in the way data centres are designed, cooler locations can play a part in reducing the resources needed to run them. It’s well documented that several of the largest data centre operators, including Google, Facebook and Amazon, have purchased land in places like Iceland, Norway, and Sweden on the basis that building in a colder environment helps keep cooling costs down. Countries like Iceland also generate virtually all of their electricity from renewable sources such as hydro and geothermal power.
- Colocation and public cloud: As organisations move from maintaining their own private data centres to outsourcing operations to a third party, whether through migration to the public cloud or colocation, economies of scale become available. Data centre operators can take advantage of these to maximise processing while minimising waste, and pass the benefits on to their customers. IDC predicts that cloud migrations between 2021 and 2024 will prevent 1,014 million metric tons of CO2 from being released - the equivalent of taking over 220 million vehicles off the road for a year.
- Liquid cooling: Air cooling is very inefficient and consumes a huge amount of resources, while water cooling has historically also been wasteful. But improvements are being made so that recycled or non-fresh water, or even seawater, can be used. Facebook has been using a proprietary liquid cooling system called StatePoint since 2018, described as “an advanced evaporative cooling technology that produces cold water instead of cold air.” The company claims it can reduce cooling costs by 20% in hot climates and by up to 90% in colder climates. Another innovation is liquid immersion cooling, in which a server’s internal hardware is submerged in a bath of non-conductive fluid, typically mineral oil, so the liquid removes heat from the components directly rather than from the air. This approach has the added benefit of being fan-less, which all but eliminates noise pollution from the server room.
- Backup power: The majority of data centres rely on diesel generators as backup in the event of a power failure, adding to fossil fuel usage. However, operators like Google are turning to lithium-ion batteries instead, both for backup power and to put that otherwise idle capacity back to work. “Whereas diesel generators sit idle most of the year, batteries are multi-talented team players: when we’re not using them, they’ll be available as an asset that strengthens the broader electric grid,” Google said in its blog.
- Shut down "zombie servers": There are a lot of legacy servers still running that are not contributing any useful function. According to IBM, some estimates put their numbers as high as 30% of all virtual servers and 25% of physical servers. The problem is, they can be difficult to find. One audit discovered more than 1,000 servers that hadn't even been configured with DNS software and were, therefore, invisible.
- Optimise where and when workloads run: Renewable energy isn’t available all the time or in all places. According to IBM, some workloads only need to run periodically or on an ad-hoc basis, so it makes sense to match them with available renewable energy by finding the locations and times of day when they can make use of these lower-emissions sources (see the sketch after this list).
- Computation efficiency and AI: Improvements in computation mean servers can do the same work with less energy by embedding optimisation at each layer of the cloud stack. This could mean running a workload across a smaller set of servers and having them all work harder; by some estimates, idle power can account for as much as 50% of a server’s total power draw, so power management can have a big impact. In another example, Google’s DeepMind subsidiary has reduced the energy used for cooling its data centres by 40% using artificial intelligence. By analysing workflows in real time, AI can reduce the power usage of idle infrastructure, leading to cost savings of up to 9%.
- Make use of empty offices: UK open source advocacy organisation OpenUK pitched a Net Zero Data Centre Blueprint during COP26. The proposal looks at reducing reliance on ever-larger hyperscale data centres and making more use of retail and office spaces vacated during Covid as smaller data centres.
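To make the “optimise where and when workloads run” idea above a little more concrete, here is a minimal carbon-aware scheduling sketch in Python. It is illustrative only: the forecast numbers are made up, and a hypothetical grid carbon-intensity feed is assumed rather than any specific API. The scheduler simply picks the region and hour with the lowest forecast carbon intensity for a deferrable batch job.

```python
from dataclasses import dataclass

# Hypothetical forecast: grams of CO2 per kWh for each region and hour (UTC).
# In practice this would come from a grid carbon-intensity feed; values here are invented.
FORECAST = {
    "eu-north": {0: 35, 6: 40, 12: 55, 18: 60},
    "eu-west":  {0: 210, 6: 180, 12: 140, 18: 230},
    "us-east":  {0: 390, 6: 350, 12: 300, 18: 410},
}

@dataclass
class Slot:
    region: str
    hour: int
    grams_co2_per_kwh: float

def best_slot(forecast: dict) -> Slot:
    """Pick the region/hour combination with the lowest forecast carbon intensity."""
    candidates = (
        Slot(region, hour, intensity)
        for region, hours in forecast.items()
        for hour, intensity in hours.items()
    )
    return min(candidates, key=lambda slot: slot.grams_co2_per_kwh)

if __name__ == "__main__":
    slot = best_slot(FORECAST)
    print(f"Schedule the deferrable batch job in {slot.region} at {slot.hour:02d}:00 "
          f"(~{slot.grams_co2_per_kwh:.0f} gCO2/kWh)")
```

A real scheduler would also need to weigh data-transfer emissions, latency, and data-residency constraints, but the core idea is the same: shift flexible work to the cleanest grid available at the time.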
As environmental regulations evolve and businesses demand greener solutions, data centre, cloud and network operators must build sustainability into their strategies.
Technological advancements, together with careful consideration of the environmental impact of operations, will shape the future of data centres - and quite possibly the planet.
OVER TO YOU
Which key practices and changes matter most to you in making sure infrastructure and technology can continue to give users the value they need, while staying sustainable?