How Green Is Your Software?


Without doubt, software is the backbone of virtually all the intelligent solutions designed to support the environment. It’s critical, for example, in efforts to tackle deforestation and reduce emissions. In many instances, however, software is also part and parcel of a rapidly growing carbon footprint. In fact, as digital technologies proliferate, they have begun to worsen many of the environmental problems they are meant to solve. But companies can make software an integral part of their sustainability efforts by taking its carbon footprint into account in the way it is designed, developed, and deployed, and by rethinking some aspects of how the data centers that provide cloud-based services operate.

Let’s be clear: On its own, software doesn’t consume energy or emit any harmful discharge. The problem lies in the way software is developed and then in the way it is used. Software runs on hardware, and as the volume of software grows, so does the demand for the machines required to run it.

For example, blockchain drives some of the most advanced green solutions available, such as microgrids that allow residents to trade environmentally friendly energy. But the same innovation is also behind cryptocurrency. In 2019, researchers at the University of Cambridge estimated that the energy needed to maintain the Bitcoin network surpassed that of the entire nation of Switzerland.

Then there’s the information and communications technology sector as a whole. By 2040, it is expected to account for 14% of the world’s carbon footprint, up from about 1.5% in 2007.

The very development of software can be energy intensive. Consider, for example, what we learned when we trained an artificial intelligence (AI) model on a small, publicly available dataset of iris flowers. The model achieved 96.17% accuracy in classifying the flowers’ species while consuming only 964 joules of energy. The next 1.74-percentage-point increase in accuracy required 2,815 joules. The final 0.08-percentage-point gain took nearly 400% more energy than the first stage.
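For readers who want to reproduce the spirit of that experiment, here is a minimal sketch using scikit-learn’s bundled iris dataset. The model architecture and training settings are illustrative assumptions, not the exact configuration used in the study described above, and the energy itself would be measured with an external power meter or the hardware counters discussed later in this article.

```python
# A minimal sketch of the kind of experiment described above, assuming
# scikit-learn and its bundled iris dataset. The model and training settings
# are illustrative, not the article's exact configuration.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42, stratify=y
)

# Train progressively longer and compare the accuracy gained against the
# energy each run consumes (measured externally, e.g. with a power meter
# or the RAPL counters sketched later in this article).
for max_iter in (50, 500, 5000):
    model = MLPClassifier(hidden_layer_sizes=(16,), max_iter=max_iter,
                          random_state=0)
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"max_iter={max_iter}: accuracy={acc:.4f}")
```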

Now consider that same example in the context of the bigger picture of AI overall. Training a single neural network model today can emit as much carbon as five cars in their lifetimes. And the amount of computational power required to run large AI training models has been increasing exponentially, with a 3.4-month doubling time.

All that said, it wouldn’t make sense to limit reliance on software as a means of enabling work, especially in the post-Covid world, where working from home or other remote locations could become the norm for many. Nor would limiting software-driven innovation be a viable response.

However, companies can make software an integral part of their sustainability efforts by judging its performance on its energy efficiency as much as on traditional parameters (e.g., functionality, security, scalability, and accessibility) and by including green practices and targets as criteria for CIO performance reviews.

Ultimately, the rewards would outweigh the challenges. The early, increased scrutiny that building green software requires translates into a higher-quality product that is leaner, cleaner, and more fit for its purpose; these qualities also offset the additional upfront costs. Green software will help large companies meet their ESG targets, an increasingly important performance measure for stakeholders. Finally, our research (soon to be published) has shown that newly minted computer engineers increasingly weigh a company’s focus on sustainability when choosing an employer; a commitment to green software can be a persuasive draw.

So how can companies go green with their software? It’s a three-part process: articulate a strategy that sets some boundaries, rethink the software development life cycle, and make the cloud green as well. No single company that we know of is engaged fully in this process as we describe it and reaping the full benefits of purposefully green software. However, a growing number of businesses, including Google, Volkswagen, and Rainforest (itself a software testing company), are deploying a variety of the following approaches and techniques.

Articulate a strategy that guides trade-offs and allows for flexibility. Doing this will get IT teams thinking about what the right level of tolerance should be for their software’s environmental effects. There are almost always trade-offs between business and environmental goals, and software engineers need to be able to determine where the go/no-go line is. Think back to the AI model we trained on the iris flower dataset. Whether that last step to increase the accuracy is worth the energy it consumes is a business decision that requires clear guidance from the top.

Equally important is that the strategy call for flexibility, giving engineers room to improvise and to learn through trial and error. Green software is still an emerging field, largely limited to academia, and there are no guidebooks for engineers in this area.

Finally, this broad strategy should suggest the metrics needed to measure progress. For software updates, these would not be difficult to set (for example, by determining how much more energy a new version consumes than the previous version). For new software, however, useful measures would be more difficult to define. Initially they could include such measures as memory-use efficiencies, the amount of data used, and floating-point (mathematical) operations per second.
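As a starting point, teams can instrument individual workloads with off-the-shelf tooling. The sketch below tracks one candidate metric, peak memory use, with Python’s standard tracemalloc module; the workload itself is only a placeholder.

```python
# A simple way to start tracking one proposed metric -- peak memory use --
# using Python's standard tracemalloc module. The workload below is a
# stand-in for the real application code being profiled.
import tracemalloc

def workload():
    # Placeholder computation; replace with the code path under review.
    return sum(i * i for i in range(1_000_000))

tracemalloc.start()
result = workload()
current, peak = tracemalloc.get_traced_memory()  # bytes
tracemalloc.stop()

print(f"result={result}, peak memory={peak / 1024:.1f} KiB")
```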

Review and refine the software development life cycle. Start by asking: What is the smallest possible environmental footprint we could make with this application? Use that expectation to guide the first stages of the software development cycle. This expectation may shift as you gain knowledge, but it can be a great help in informing the feasibility study and any assessment of trade-offs between alternate approaches.

Then develop recommendations on, for example, the algorithms, programming languages, APIs, and libraries you can draw on to minimize carbon emissions. And require constant assessment of alternatives that might be more efficient. These assessments would also test the software’s compatibility across various energy-constrained hardware designs, such as mobile devices, cars, and home controls.
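What might such an assessment look like in practice? Here is a minimal, illustrative comparison of two implementations of the same lookup task, using wall-clock time as a rough proxy for energy when no direct power measurement is available.

```python
# Comparing two alternative implementations of the same task, using
# wall-clock time as a rough proxy for energy when no power meter is
# available. The lookup task itself is only illustrative.
import timeit

data = list(range(20_000))
targets = list(range(0, 20_000, 7))

def lookup_in_list():
    return sum(1 for t in targets if t in data)        # linear scan per lookup

def lookup_in_set():
    data_set = set(data)
    return sum(1 for t in targets if t in data_set)    # hash lookup per target

for fn in (lookup_in_list, lookup_in_set):
    seconds = timeit.timeit(fn, number=3)
    print(f"{fn.__name__}: {seconds:.3f} s")
```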

At the deployment stage, monitor real-time power consumption through techniques such as dynamic code analysis. The data you gather will be critical for understanding the gaps between the design choices and actual energy profiles.
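One practical way to gather such data, on Linux servers whose Intel processors expose RAPL energy counters through sysfs, is sketched below. The counter path is an assumption: it varies by machine, may require elevated privileges, and is absent on many systems, so treat this as one measurement option rather than the authors’ specific technique.

```python
# A sketch of run-time energy measurement on Linux machines that expose
# Intel RAPL counters via sysfs. The counter path below is an assumption:
# it varies by machine, may require root, and is missing on many systems.
RAPL_ENERGY_FILE = "/sys/class/powercap/intel-rapl:0/energy_uj"

def read_energy_uj() -> int:
    with open(RAPL_ENERGY_FILE) as f:
        return int(f.read().strip())

def measure(workload):
    """Return (result, joules consumed) for a callable workload."""
    before = read_energy_uj()
    result = workload()
    after = read_energy_uj()
    # The counter reports microjoules and eventually wraps around;
    # wraparound is ignored in this sketch.
    return result, (after - before) / 1_000_000

if __name__ == "__main__":
    result, joules = measure(lambda: sum(i * i for i in range(5_000_000)))
    print(f"workload consumed roughly {joules:.3f} J")
```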

Some companies are offering tools to help develop power-aware and increasingly efficient systems. For example, Intel offers developers tools and resources for managing energy consumption. The company’s Software Development Assistant allows engineers to take energy measurements from the system as it executes specific workloads within their application and determine its efficiency.

However, these sorts of tools are in short supply. Assessing key trade-offs between carbon emissions and business objectives such as flexibility is still an uphill climb.

Make the cloud green. Modern applications are almost always deployed over the cloud. But the exponential growth in cloud-based services has resulted in the rapid expansion of power-intensive data centers. Data centers consume about 2% of global electricity today; by 2030, they could consume as much as 8%.

To date, most efforts to make data centers green have focused on optimizing hardware (by reducing the incidence of overheated servers) and reducing carbon emissions (by increasing the mix of renewable energy that powers them). These techniques are helping to address the problem; however, including sustainable software interventions opens new opportunities to save energy.

For example, eliminating duplicate copies of data or compressing data into smaller chunks would save energy. So would deploying graphics-processing units to manage workloads at the edge (near the device or the end user), which creates efficiencies by breaking up large tasks into smaller ones and divvying them up among many processors.
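To make the first two ideas concrete, the sketch below stores each blob of data only once, keyed by a content hash, and compresses it before writing. Real data centers implement deduplication and compression in the storage layer; this simply illustrates the principle.

```python
# An illustration of the two storage-side ideas above: skip duplicate
# blobs by content hash, and compress what remains before storing it.
import hashlib
import zlib

store: dict[str, bytes] = {}   # content hash -> compressed blob

def put(blob: bytes) -> str:
    key = hashlib.sha256(blob).hexdigest()
    if key not in store:                  # deduplication: store each blob once
        store[key] = zlib.compress(blob)  # compression: fewer bytes at rest
    return key

def get(key: str) -> bytes:
    return zlib.decompress(store[key])

if __name__ == "__main__":
    a = put(b"sensor reading 42" * 1000)
    b = put(b"sensor reading 42" * 1000)  # duplicate -> no new storage used
    print(len(store), "blob(s) stored,", len(store[a]), "bytes at rest")
    assert get(a) == get(b)
```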

Adopting greener server architectures will also likely prove crucial for reducing energy consumption. Using virtual servers, for example, would help companies scale their servers up or down on demand, conserving energy in enterprise data centers. Virtualization essentially enables the creation of multiple simulated environments (or dedicated resources) from a single physical hardware system. Containerization, essentially an improvement over virtual machines, is another option. Where virtualization separates applications at the hardware level, containerization separates them at the operating-system level.

Newer application architectures, such as serverless computing or functions-as-a-service (FaaS), enable even more control over capacity and, by extension, energy consumption. Serverless computing, for example, shares infrastructure resources efficiently by executing functions only on demand. And because it bills by execution time, it compels programmers to improve their code’s efficiency. Major serverless platforms such as AWS Lambda and Microsoft Azure Functions, for example, provide continuous scaling with a pay-as-you-use cost model.
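To show what “executing only on demand” looks like in code, here is a minimal AWS Lambda-style handler in Python. The event fields and the computation are illustrative assumptions; the broader point is that leaner handler code directly shortens billed execution time and, with it, energy use.

```python
# A minimal AWS Lambda-style handler in Python. The event fields and the
# work done here are illustrative; the point is that the function runs
# (and is billed) only while handling a request, so leaner code means
# less execution time, lower cost, and less energy.
import json

def lambda_handler(event, context):
    # Do only the work this request needs -- no always-on server idling.
    readings = event.get("readings", [])
    average = sum(readings) / len(readings) if readings else None
    return {
        "statusCode": 200,
        "body": json.dumps({"average": average}),
    }

if __name__ == "__main__":
    # Local smoke test; in production the cloud platform invokes the handler.
    print(lambda_handler({"readings": [3, 5, 7]}, context=None))
```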

Whether it is a mobile phone that must use resources and computing power more efficiently to save energy or a cloud data center whose servers need to be optimized for energy consumption, the need for green software will continue to grow. By including software in your sustainability efforts now, your company will have a head start in this important area.

The authors thank Vikrant Kaulgud and Vibhu Saujanya Sharma from Accenture Labs and Shruti Shalini and Dave Light from Accenture Research for their contributions to this article.
