AWS Says It’s 100% Renewable - But Should Software Engineers Be Doing More?
In July 2024, Amazon announced it had met its goal of 100% renewable energy seven years early. An incredible achievement by one of the world’s largest conglomerates - so much so that, reading that headline, you might think choosing AWS for your cloud computing ticks the sustainability box and you don’t need to give it another thought.
Because software is intangible, it’s easy to turn a blind eye to its unrestrained power demands and the greenhouse gas emissions that follow. Data centres alone account for 1% of global greenhouse gas emissions.
At the moment, generating just two images with an AI model uses as much energy as a smartphone charge. And it’s predicted that electricity consumption from data centres, AI and cryptocurrency could double by 2026.
Software is quietly ballooning our power demands and driving the continual manufacture of huge numbers of servers.
For many software companies, cloud computing is the norm. And while Amazon announcing that 100% of its energy is renewable might lead you to think you don’t need to do anything to combat the massive power draw of software, I want to look behind the headlines, compare the big three cloud providers on their sustainability efforts, and show why software engineers have the opportunity to make an impact themselves.
What do ‘100% renewable energy’ headlines mean?
We hear a lot about cloud providers running on renewable energy, but in reality these claims usually rest on offsets or other accounting methods, which can make the statement misleading.
AWS shared this article in July 2024 announcing that it has met its goal for 100% renewable energy seven years early. Seeing this, it can be easy for software engineers to think we don’t have to do anything to our code, because we can leave it up to the data centre.
However, that isn’t the full picture. If you scroll down, it says AWS is actually just matching 100% of its energy usage with renewable energy. While still a great achievement, it doesn’t mean no emissions are being produced at all.
What’s happening behind the scenes?
To offset their huge energy usage, data centre providers are signing Power Purchase Agreements (PPAs). PPAs help bring new renewable energy projects online, and help organisations claim their operations are sustainable.
What is a PPA?
A company forms an agreement with an energy provider to invest in a renewable energy project such as a wind or solar farm. Amazon, Microsoft and Google are some of the biggest corporate buyers of renewable energy in the world, utilising PPAs.
Some of these projects are built close to the data centre and feed energy into the same grid it draws from. Often, though, the project is in a completely different location to the data centre, so the electricity it generates goes into a different local grid entirely.
PPAs are getting a lot more renewable energy built, but they do not fully decarbonise the grid, because the carbon accounting behind them is not granular enough. When electricity supply and demand are matched between the operator and the buyer on an annual basis, the buyer can claim it has matched 100% of its usage with renewable energy. In reality, only 40-70% of emissions are matched, because renewable generation doesn’t line up with when the buyer actually consumes electricity.
Part of the problem is that wind projects only produce energy 60-70% of the time, and solar only about 50% of the time.
One option for more accurate matching is 24/7 PPAs, where the agreement also includes energy storage so renewable supply can be matched to demand hour by hour. This makes the investment much more expensive. Google has committed to all its PPAs being 24/7 by 2030.
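To make the annual-versus-hourly gap concrete, here’s a minimal sketch with made-up figures (the demand and solar profiles below are illustrative, not real data). Matched over the whole period, the buyer looks 100% renewable; matched hour by hour, half the demand still falls on hours with no renewable supply.

```python
# Illustrative only: toy hourly profiles for a single day (kWh).
# Period matching compares totals; hourly matching compares each hour.

demand = [100] * 24                     # flat data centre demand: 2,400 kWh/day
solar = [0] * 6 + [200] * 12 + [0] * 6  # solar PPA output: 2,400 kWh/day

total_demand = sum(demand)

# Annual/period matching: the totals line up, so the buyer claims 100% renewable.
period_matched = min(sum(solar), total_demand) / total_demand

# Hourly matching: surplus solar at midday can't cover demand at midnight.
hourly_matched = sum(min(d, s) for d, s in zip(demand, solar)) / total_demand

print(f"Period-matched: {period_matched:.0%}")  # 100%
print(f"Hourly-matched: {hourly_matched:.0%}")  # 50% in this toy example
```

The toy numbers land at 50%, in the same ballpark as the 40-70% quoted above; storage in a 24/7 PPA closes the gap by shifting the midday surplus into the unmatched hours.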
As a software engineer with products in the cloud, where do you fit in?
While data centres are putting a lot of work into becoming more sustainable and moving towards fully renewable energy, we can’t just leave it up to them.
Data centres, hardware and software all make up the digital footprint, and we need to make changes holistically.
Even with the sustainability initiatives from hyperscalers, their emissions are still increasing at an alarming rate. Google’s emissions have climbed 48% in the last five years, with AI demand a major contributor. If we keep increasing our demand for cloud computing capacity, switching to renewable energy sources gets a lot harder, not to mention the extra embodied carbon from manufacturing more servers. It’s important that the way we approach building software becomes more sustainable.
Carbon Reporting Tools
To change how we build software, the first step is being able to measure its carbon emissions.
Can you actually find out how much carbon emissions your software is creating? It depends.
Cloud providers are responding to growing customer demand for emissions data about their cloud computing, but the current offerings vary.
AWS
AWS’s carbon footprint measuring tool is called the ‘Customer Carbon Footprint Tool’. You can view emissions by service and AWS region, and it shows how your emissions should reduce over time as AWS works to use more renewable energy. This level of detail is an improvement on when the tool was first released in 2022, when the data was grouped into high-level regions such as Europe and North America. However, the tool does not show emissions per resource, such as emissions from a particular EC2 deployment, which makes it difficult to understand which of your products are causing inefficiencies.
AWS does not provide any carbon-specific recommendations as part of this tool, but you can use cost recommendations in AWS Cost Explorer and optimisation recommendations in AWS Trusted Advisor to reduce spend. (Cost is related to carbon emissions, but it’s not a 1:1 proxy - see https://asim.dev/articles/cost-carbon-paradox/)
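Because the tool doesn’t break emissions down per resource, per-service cost is the closest readily available proxy for where your footprint sits (with the caveat above firmly in mind). Here’s a minimal sketch using the Cost Explorer API, assuming your credentials are configured and you have the ce:GetCostAndUsage permission:

```python
# Sketch: monthly AWS spend per service via the Cost Explorer API.
# Cost is only a loose proxy for carbon, but unlike the Customer Carbon
# Footprint Tool it gives you per-service numbers today.
import boto3

ce = boto3.client("ce")  # assumes credentials and ce:GetCostAndUsage permission

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2024-06-01", "End": "2024-07-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for group in response["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    cost = float(group["Metrics"]["UnblendedCost"]["Amount"])
    if cost > 0:
        print(f"{service}: ${cost:,.2f}")
```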
Azure
Azure provides customers with access to the Emissions Impact Dashboard. It offers more granular data than AWS’s tool, along with actionable recommendations on how to optimise your resources.
Azure also offers API access to its carbon emissions data, so you can export it for other uses.
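As a rough illustration of what pulling that data from code could look like - the endpoint path, API version and request body below are assumptions to verify against the current Azure documentation, not a confirmed contract - you can authenticate with azure-identity and call the Azure Resource Manager carbon emissions endpoint:

```python
# Hedged sketch: requesting emissions data from Azure programmatically.
# The endpoint, api-version and payload are ASSUMPTIONS based on Azure's
# carbon emissions reporting APIs - check them against the current docs.
import requests
from azure.identity import DefaultAzureCredential

token = DefaultAzureCredential().get_token("https://management.azure.com/.default")
headers = {"Authorization": f"Bearer {token.token}", "Content-Type": "application/json"}

# Placeholder endpoint and body - adjust to the documented report types
# and substitute your own subscription IDs.
url = ("https://management.azure.com/providers/Microsoft.Carbon/"
       "carbonEmissionReports?api-version=2025-04-01")
body = {
    "reportType": "OverallSummaryReport",
    "subscriptionList": ["<your-subscription-id>"],
    "carbonScopeList": ["Scope1", "Scope2", "Scope3"],
    "dateRange": {"start": "2024-06-01", "end": "2024-06-30"},
}

response = requests.post(url, headers=headers, json=body, timeout=30)
response.raise_for_status()
print(response.json())
```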
Google Cloud
Google Cloud’s Carbon Sense Suite is made up of three components.
The Carbon Footprint Dashboard allows you to view granular emissions data by project, product and region.
The Region Picker enables users to include emissions as a factor when deciding which region to use for their resources. Users can compare carbon footprint, cost, and latency against each other to find the optimal region.
The third component is Active Assist. This tool scans your instances for idle workloads and provides recommendations for removing unused instances to save power.
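Carbon Footprint data can also be exported to BigQuery, which makes it queryable from code. Below is a minimal sketch assuming the export is already configured; the dataset, table and field names are placeholders to check against your own export’s schema:

```python
# Sketch: querying an exported Google Cloud Carbon Footprint table in BigQuery.
# Assumes the Carbon Footprint -> BigQuery export is set up; the table and
# field names are placeholders to verify against your own export schema.
from google.cloud import bigquery

client = bigquery.Client()

query = """
    SELECT
      service.description AS service,
      SUM(carbon_footprint_total_kgCO2e.location_based) AS kg_co2e
    FROM `my-project.my_dataset.carbon_footprint`  -- placeholder table
    WHERE usage_month = DATE '2024-06-01'
    GROUP BY service
    ORDER BY kg_co2e DESC
"""

for row in client.query(query).result():
    print(f"{row.service}: {row.kg_co2e:.1f} kgCO2e")
```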
Transparency on emissions scopes
Scope 3 emissions include the emissions produced from manufacturing and disposing of hardware. When running software in the cloud, this is a massive component of your emissions, and probably something you want included when trying to visualise your impact.
Azure and Google Cloud both make it possible for cloud computing customers to track their scope 3 emissions, but AWS does not. AWS announced in 2023 that scope 3 emissions would be available through the Customer Carbon Footprint Tool in early 2024, but nothing has materialised yet.
The Struggle With Emissions Data
So you’re trying to collect data on your emissions, but the data you can get only goes so far. How can we actually make changes if we don’t know what we’re dealing with?
While I believe we should continue to push for more detailed and granular carbon reporting, there are a few factors that make this difficult.
Shared infrastructure and networking
While cloud computing is much more efficient than running on-premise hardware, sharing resources between users makes it difficult to apportion usage and accurately attribute the carbon emissions of every aspect of the software back to each user.
Different calculation methods
There are two emissions calculation methods used worldwide: market-based and location-based. Both are allowed under the Greenhouse Gas Protocol, which provides the standards used to track progress towards climate goals.
The market-based method takes into account the renewable energy the organisation has purchased through offsets or other investments, so the organisation can report emissions lower than what was physically emitted.
The location-based method calculates emissions from the energy physically supplied to the company, based on the makeup of the local energy grid.
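A toy calculation makes the difference concrete (the figures below are illustrative, not real numbers from any provider, and the market-based formula is simplified - real accounting uses contractual and residual-mix emission factors):

```python
# Illustrative only: the same workload reported under both methods.
energy_used_kwh = 1_000_000          # electricity actually consumed
grid_intensity = 0.35                # kgCO2e per kWh of the local grid (illustrative)
renewables_purchased_kwh = 900_000   # covered by PPAs / certificates (illustrative)

# Location-based: what the local grid emitted to supply you.
location_based = energy_used_kwh * grid_intensity

# Market-based (simplified): purchased renewables are subtracted first.
market_based = max(energy_used_kwh - renewables_purchased_kwh, 0) * grid_intensity

print(f"Location-based: {location_based / 1000:.0f} tCO2e")  # 350 tCO2e
print(f"Market-based:   {market_based / 1000:.0f} tCO2e")    # 35 tCO2e
```

Same workload, same electricity, a tenfold difference in the reported figure - which is why it matters which method a provider uses.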
Google Cloud provides both market-based and location-based calculations to users, while AWS only provides market-based figures. It’s not clear which options Azure provides.
So, it’s difficult to compare properly between providers.
Delayed reporting
AWS emissions data is delayed by three months, and Azure’s by one month. I couldn’t find figures for how delayed Google Cloud’s data is, but thanks to its more in-depth tooling, including some hourly calculations, it is able to provide a lot more on-demand data.
The delays come from needing to gather data from different sources: it’s easier to publish a delayed calculation than to revise it once new data arrives. That accuracy is good for reporting a company’s carbon footprint, but not helpful for fine-tuning, where customers need timely feedback on the small changes that all add up to the efficiency gains we need.
What can you do?
Once we bring the climate impact of the technology industry to light, it’s clear we all need to transform how we build and maintain software, so we can continue to benefit from technological advances without the environmental cost.
The best place to start is by getting as much data as you can. Check out third-party tools like Cloud Carbon Footprint and Climatiq. Share this data with your team. And then start making changes to reduce your energy use and emissions.
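If your provider’s tooling doesn’t give you the granularity you need, you can also make a rough first estimate yourself. The sketch below loosely follows the approach of the open-source Cloud Carbon Footprint project (usage hours × power draw × PUE × grid carbon intensity); every coefficient in it is an illustrative placeholder, not an official figure:

```python
# Rough, illustrative estimate of operational emissions for a cloud VM,
# loosely following the Cloud Carbon Footprint methodology.
# Every coefficient is a placeholder - substitute published figures for
# your instance type, the provider's PUE and your region's grid intensity.

def estimate_kg_co2e(vcpu_hours: float, avg_cpu_util: float) -> float:
    min_watts_per_vcpu = 0.7     # idle power draw per vCPU (placeholder)
    max_watts_per_vcpu = 3.5     # full-load power draw per vCPU (placeholder)
    pue = 1.2                    # data centre power usage effectiveness (placeholder)
    grid_kg_co2e_per_kwh = 0.4   # regional grid carbon intensity (placeholder)

    watts = min_watts_per_vcpu + avg_cpu_util * (max_watts_per_vcpu - min_watts_per_vcpu)
    kwh = vcpu_hours * watts / 1000 * pue
    return kwh * grid_kg_co2e_per_kwh

# e.g. a 4-vCPU instance running all month at 50% average utilisation
print(f"{estimate_kg_co2e(vcpu_hours=4 * 24 * 30, avg_cpu_util=0.5):.2f} kgCO2e")
```

Even a back-of-the-envelope number like this is enough to compare regions, instance sizes and idle workloads, and to start the conversation with your team.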
Header photo by Anders J on Unsplash