
Carbon footprint: The important role DevOps plays in sustainable computing


Presented by Dynatrace


It’s no secret that cloud computing has grown rapidly over the past decade. Organizations have steadily moved from single cloud solutions to multiple clouds, from a few applications to thousands of applications consisting of millions of microservices. Several surveys show growing awareness of IT carbon footprints among the general public, but a gap remains between awareness and action when it comes to IT sustainability.

A survey from Accenture found that while companies see clear value in having an integrated strategy, a lack of solutions and standards (40%), complexity (33%) and awareness of unintended consequences of technology (20%) are barriers to reaching their ESG goals. This gap between intent and action leads companies to make trade-offs between business and sustainability goals. With the increasing magnitude and volume of data and workloads teams now manage in the cloud, the cost calculations for the carbon footprint of computing are shifting.    

Carbon footprint is a measure of the amount of carbon dioxide and other carbon compounds emitted due to the consumption of fossil fuels by a particular person or group. Increasingly, organizations recognize the importance of measuring carbon footprint to become more efficient and cost-effective.

In an essay published by the MIT Press Reader, anthropologist Steven Gonzalez Monserrate noted the cloud computing industry accounts for 2% of global greenhouse gas emissions. From always-on data centers to cooling systems and redundancy regimes, he notes that “The cloud now has a greater carbon footprint than the airline industry.” It’s hard to see the problem improving as advancements in AI technology such as generative AI and large language models (LLMs) only increase the need for computing resources.

Awareness of the IT industry's carbon footprint problem is growing, and many organizations are taking significant steps to address it. Semiconductor companies are investing heavily in chip designs that improve energy efficiency for processors and memory modules. In addition, hyperscalers such as Amazon Web Services, Azure, Google Cloud and other cloud providers are continually finding new ways to maximize energy efficiency through operational optimization. However, as demand for computing resources continues to grow and becomes more distributed across multiple cloud environments, it’s evident that organizations need a more unified and comprehensive approach to monitoring, measuring and optimizing the carbon impact of their computing consumption. And, perhaps unexpectedly, the key to truly minimizing compute-related carbon emissions might lie in a software organization’s ability to organize and manage its cloud environments.

Using software to get a handle on managing compute carbon footprints

The intersection of software development and data management is an important opportunity to gain insight into an organization’s rising carbon emissions. DevOps teams have always been concerned with running efficient workloads and optimizing data to enhance application performance. But optimizing resource usage is also a concrete strategy for cutting compute consumption and, with it, an organization’s environmental footprint.

Software and data management involve several practices that can help optimize the resource usage and efficiency of software systems, including:

  • Rightsizing and consolidating virtual machines, containers and server infrastructure to avoid overprovisioning and wasted resources (a rough sketch of this kind of check follows the list)
  • Optimizing containerization and orchestration to streamline resource usage and reduce the number of underutilized virtual machines
  • Load balancing and auto-scaling to ensure resources are used efficiently based on traffic and demand
  • Optimizing and automating continuous integration/continuous deployment (CI/CD) to minimize unnecessary builds and deployments
  • Writing optimized code to minimize database queries and reduce CPU and memory usage
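
To make the first of these practices concrete, here is a minimal sketch of an automated rightsizing check: given observed CPU and memory utilization for a set of workloads, it flags those whose requested capacity far exceeds what they actually use. The workload data, thresholds and field names are illustrative assumptions, not the output of any particular tool.

```python
# Hypothetical rightsizing check: flag workloads whose requested resources
# far exceed observed utilization. Data, thresholds and field names are
# illustrative only.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    cpu_requested: float      # vCPUs requested
    cpu_used_p95: float       # 95th-percentile vCPUs actually used
    mem_requested_gib: float  # memory requested (GiB)
    mem_used_p95_gib: float   # 95th-percentile memory used (GiB)

def rightsizing_candidates(workloads, headroom=1.3, waste_threshold=0.5):
    """Return workloads whose requested capacity exceeds observed usage
    (plus headroom) by more than the waste threshold."""
    candidates = []
    for w in workloads:
        cpu_needed = w.cpu_used_p95 * headroom
        mem_needed = w.mem_used_p95_gib * headroom
        cpu_waste = 1 - cpu_needed / w.cpu_requested
        mem_waste = 1 - mem_needed / w.mem_requested_gib
        if cpu_waste > waste_threshold or mem_waste > waste_threshold:
            candidates.append((w.name, round(cpu_needed, 2), round(mem_needed, 2)))
    return candidates

workloads = [
    Workload("checkout-api", cpu_requested=4.0, cpu_used_p95=0.6,
             mem_requested_gib=8.0, mem_used_p95_gib=1.9),
    Workload("search-index", cpu_requested=2.0, cpu_used_p95=1.7,
             mem_requested_gib=4.0, mem_used_p95_gib=3.4),
]

for name, cpu, mem in rightsizing_candidates(workloads):
    print(f"{name}: could likely run with ~{cpu} vCPU and ~{mem} GiB")
```

The same measure-then-reclaim idea carries over to container resource requests and autoscaling targets: observe real utilization, add headroom, and release the rest.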

These practices can help reduce the carbon footprint of software systems by lowering the energy consumption and emissions of hardware and infrastructure. Moreover, they can also enhance the performance, reliability and security of software systems for a boost in sustainability and quality.

However, software and data management also face some limitations and challenges that require a more advanced and comprehensive approach, including:

  • The need for real-time data and granular asset mapping rather than reliance on historical data alone. Real-time data can provide more accurate and timely feedback on the current state and behavior of software systems, while granular mapping can help identify the specific components of software systems that contribute to carbon emissions, such as servers, databases and networks.
  • Regulatory pressure from government organizations and agencies. The EU has recently introduced the Digital Green Deal, a set of policies and initiatives that aim to make the digital sector more sustainable and environmentally friendly with measures that set carbon emission targets, impose carbon taxes, require carbon reporting and promote green digital solutions.
  • Lack of unified observability across all channels. Software systems often involve multiple channels and layers, such as cloud services, web applications, mobile applications, IoT devices and more. Each channel may have its own data sources, metrics, tools and formats, making it difficult to form a unified view and analysis of carbon emissions across them (a sketch of this normalization problem follows the list).
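
As a rough illustration of that last challenge, the sketch below normalizes utilization readings from two hypothetical sources, a cloud usage export and an on-prem monitoring agent, into one common record shape so that carbon calculations can run over a single schema. Every field name and source format here is invented for illustration.

```python
# Hypothetical normalization of utilization data from two differently shaped
# sources into one common record. Field names and formats are invented.

from typing import Dict, List

def from_cloud_export(row: Dict) -> Dict:
    """Normalize a (made-up) cloud usage export row."""
    return {
        "source": "cloud",
        "resource_id": row["resourceId"],
        "region": row["region"],
        "cpu_hours": float(row["vcpuHours"]),
        "timestamp": row["usageStart"],
    }

def from_onprem_agent(sample: Dict) -> Dict:
    """Normalize a (made-up) on-prem monitoring agent sample."""
    return {
        "source": "on-prem",
        "resource_id": sample["host"],
        "region": sample.get("site", "unknown"),
        # Convert average utilization of N cores over an hour into CPU-hours.
        "cpu_hours": sample["avg_cpu_util"] * sample["cores"],
        "timestamp": sample["hour"],
    }

def unify(cloud_rows: List[Dict], agent_samples: List[Dict]) -> List[Dict]:
    """Merge both sources into one list of uniform records."""
    return ([from_cloud_export(r) for r in cloud_rows]
            + [from_onprem_agent(s) for s in agent_samples])

records = unify(
    [{"resourceId": "i-0abc", "region": "eu-west-1",
      "vcpuHours": "12.5", "usageStart": "2024-05-01T00:00Z"}],
    [{"host": "db-01", "site": "vienna-dc", "avg_cpu_util": 0.35,
      "cores": 16, "hour": "2024-05-01T00:00Z"}],
)
print(records)
```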

With the distributed and dynamic nature of modern IT ecosystems, manually aggregating carbon usage data and calculating optimizations becomes not only time-consuming but also susceptible to human error. The sheer volume of data, the redundancy of interconnected components and the ever-evolving nature of applications make manual compilation and optimization an increasingly complex task. As a result, precious time that could be spent on innovation and actionable tasks is often consumed by repetitive and sometimes inefficient workload management.

This underscores the pressing need for advanced tools that can automate much of this process. By leveraging automation and intelligent analytics, IT teams now have the tools to track their carbon footprint and draw conclusions from their carbon usage. These tools not only enhance efficiency but also ensure that the optimization process is both consistent and accurate, paving the way for a more sustainable and efficient digital infrastructure.

By using such tools, software developers, operators, managers and stakeholders can leverage the power of software and data management to achieve their sustainability goals and reduce their environmental impact. The meticulous development and management of software and data stand as unsung heroes in the endeavor to make our digital world more eco-friendly.

Introducing a unified observability approach to managing IT carbon footprint

The best way to tackle these problems is a unified observability approach to managing IT carbon footprint, one that integrates monitoring data across hybrid, multicloud environments. This approach frees teams to be more creative, using AI to provide accurate analysis so they can automate energy-saving measures as part of their DevOps practices.

An example of this unified observability approach is Dynatrace Carbon Impact. Developed in collaboration with the Sustainable Digital Infrastructure Alliance (SDIA) and building on formulas from the open-source initiative Cloud Carbon Footprint, Dynatrace Carbon Impact provides a seamless approach to aligning key consumption metrics with their carbon footprint contributions. Specifically designed for hybrid and multicloud environments, it offers real-time analytics and recommendations that support carbon-reduction initiatives. It accurately translates various utilization metrics like CPU, memory, disk and network I/O into their CO2 equivalent (CO2e).
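
The article does not spell out the underlying formulas, but the open-source Cloud Carbon Footprint methodology it builds on broadly converts utilization into energy and energy into emissions. The sketch below shows that general shape with placeholder coefficients; none of the numbers are Dynatrace’s, SDIA’s or any provider’s actual figures.

```python
# Illustrative CO2e estimate in the general spirit of the Cloud Carbon
# Footprint approach: utilization -> energy (kWh) -> CO2e (kg).
# All coefficients are placeholder assumptions, not real figures from
# Dynatrace, SDIA or any cloud provider.

def compute_energy_kwh(cpu_utilization: float, hours: float,
                       min_watts: float = 1.0, max_watts: float = 4.0,
                       pue: float = 1.5) -> float:
    """Interpolate average power draw from CPU utilization, scale by
    runtime, and apply the data center's PUE overhead."""
    avg_watts = min_watts + cpu_utilization * (max_watts - min_watts)
    return (avg_watts * hours / 1000.0) * pue

def co2e_kg(energy_kwh: float, grid_intensity_kg_per_kwh: float = 0.4) -> float:
    """Convert energy to CO2-equivalent using a grid emissions factor."""
    return energy_kwh * grid_intensity_kg_per_kwh

# Example: one vCPU-equivalent workload at 60% utilization for 24 hours.
energy = compute_energy_kwh(cpu_utilization=0.6, hours=24)
print(f"~{energy:.3f} kWh, ~{co2e_kg(energy):.3f} kg CO2e")
```

Swapping in per-instance power curves, real PUE values and regional grid-intensity data is what turns a sketch like this into a usable estimate.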

But what makes a carbon impact solution actionable and automatable is its ability to pinpoint areas where systems can reduce carbon emissions and provide all the necessary details to make informed and impactful decisions. With these detailed, high-fidelity insights, teams can automate workload optimization and optimize carbon consumption by business objective. This paves the way for organizations to make better power-consumption decisions, making for a more sustainable digital future.

In an era where the carbon footprint of cloud computing is increasingly pertinent, solutions like Dynatrace Carbon Impact provide crucial real-time insight into carbon efficiency and help organizations gain a centralized view of the carbon impact of all their computing environments. Access to AI-driven analysis, plus the ability to track the carbon footprint of custom business cases, gives teams the tools they need to get creative about solving their carbon consumption issues even as their businesses grow. This observability-based approach not only presents actionable and automatable insights through granular analytics but also translates these into tangible carbon-reduction initiatives.

Addressing the carbon emissions of cloud computing is not the job of an individual company, cloud provider or software developer; it’s everyone’s responsibility. While developers are pioneering the path toward sustainable computing practices, achieving meaningful progress requires collective action and robust industry-wide collaboration.

Gregor Rechberger is Senior Principal, Business Analytics at Dynatrace.


Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. For more information, contact sales@venturebeat.com.