As we discussed in our introductory post, ESG isn’t just for publicly traded companies; companies both public and private across all industries have felt the impact of ESG on their customer relationships and financing outcomes. In this post, we’ll focus on the first letter in the ESG triad – E, for Environment, which addresses a company’s impact on the environment.
Common examples of environmental impact fall into both positive and negative categories. Historically, most investors have focused on minimizing negative impacts such as water pollution, air pollution, or habitat destruction for threatened or endangered species. Some companies are also able to present directly positive impacts, such as through “green” energy technologies or recycled material manufacturing. However, the majority of these companies have been in industries like natural resource extraction, manufacturing, or construction.
Technology + Professional Services
So how does the E in ESG apply to your organization if it’s in the technology or professional services industry? Your primary business activities likely involve developing software, executing that software on devices, and analyzing, storing, or transmitting the related data – all tasks that create demand for energy and natural resources. Notably, the electricity that powers a server and the raw materials that go into devices are also key drivers of a company’s expenses and profitability.
Saving the environment might save you money, but where do you begin? For many organizations, there is low-hanging fruit to be found in two surprisingly simple concepts: clear requirements and software quality control.
Let’s start with clear requirements for data. Most organizations don’t have a purposeful plan driving what data they collect, how they store the data, and when and how they dispose of the data. By collecting the wrong data or keeping it too long, organizations create unnecessary demand for electricity or raw materials. Interestingly, many of the solutions to these mistakes also align with trends in data privacy and compliance, and are best addressed as part of a holistic data privacy program. By better targeting customer data collection and optimizing data retention policies, organizations can cut both their costs and risks at the same time.
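To make the retention idea concrete, here is a minimal sketch of the kind of check a retention policy might automate. The record structure, the `created_at` field, and the 365-day window are all hypothetical assumptions for illustration, not a prescribed implementation:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention policy: records older than the retention
# window are flagged for disposal rather than stored indefinitely.
RETENTION_DAYS = 365

def expired(records, now, retention_days=RETENTION_DAYS):
    """Return the records whose age exceeds the retention window."""
    cutoff = now - timedelta(days=retention_days)
    return [r for r in records if r["created_at"] < cutoff]

# Example: one record well past retention, one recent record.
now = datetime(2024, 6, 1, tzinfo=timezone.utc)
records = [
    {"id": 1, "created_at": datetime(2022, 1, 1, tzinfo=timezone.utc)},
    {"id": 2, "created_at": datetime(2024, 5, 1, tzinfo=timezone.utc)},
]
stale = expired(records, now)  # only record 1 exceeds the window
```

Running a check like this on a schedule turns the retention policy from a document into an enforced limit on stored data – and on the storage that data consumes.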
Software Quality Control
In other cases, these inefficiencies stem from software quality control. Many organizations that develop software or data products do not consider the efficiency of their data representations, transmission, or storage. As machine learning and data science activities have grown exponentially over the last decade, this issue has become even more problematic. Many developers and data scientists have made choices that seem small but whose impact is great; JSON might be convenient, but it can require dramatically more electricity to process and store than other formats like HDF5 or Apache Parquet.
Data Science + ML Goals
Similarly, many machine learning and data science teams do not operate within the context of clear requirements for their work product. When most research teams begin the process of developing a model, there is no clear goal or metric to guide their effort. How does each percentage point of accuracy, AUC, MAE, or false positive rate correspond to a business objective? Is the cost to train and deploy a deep learning model on GPUs warranted – or would a simple random forest have met the requirements with a much lower environmental and economic impact? For example, researchers at Harvard found that their machine learning model took 400% more energy to achieve a 0.1% increase in accuracy beyond the initial model. At the organization level, it’s best to address these opportunities through a data science maturity assessment and comprehensive data strategy, but in some cases, a specific machine learning model assessment can also provide immediate value.
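One way to frame that tradeoff is to price the marginal metric gain in energy terms. The sketch below uses entirely hypothetical accuracy and energy figures for two candidate models; the point is the comparison, not the numbers:

```python
# Hypothetical candidate models: a simple baseline versus a GPU-trained
# deep network. Figures are illustrative, not measured.
candidates = {
    "random_forest": {"accuracy": 0.94, "energy_kwh": 5.0},
    "deep_net":      {"accuracy": 0.95, "energy_kwh": 400.0},
}

simple = candidates["random_forest"]
complex_ = candidates["deep_net"]

# How much extra energy does each additional unit of accuracy cost?
extra_energy = complex_["energy_kwh"] - simple["energy_kwh"]
extra_accuracy = complex_["accuracy"] - simple["accuracy"]
marginal_kwh = extra_energy / extra_accuracy  # kWh per unit of accuracy

# If the business requirement is "at least 94% accuracy", the simple
# model meets it at a tiny fraction of the energy budget.
meets_requirement = simple["accuracy"] >= 0.94
```

Putting a number like `marginal_kwh` in front of stakeholders forces the right question: is the last fraction of a metric worth its energy and dollar cost against the stated business objective?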
Lastly, many traditional software quality management techniques can have a material impact, especially for software that operates at scale. Open source projects or organizations that release software can make a difference by carefully selecting programming languages or by using static analysis to detect unnecessary dependencies or code paths; many applications or libraries carry a large overhead of waste, and these extra bytes distributed over the network or loaded into memory add up at scale. Down the supply chain, organizations that “consume” software should also be conscious of their choices; for example, dynamic analysis like application or network profiling can help identify software or hardware that might be inefficient or misconfigured. In general, these efficiencies are best captured when addressed as part of a top-down software development maturity assessment or technology maturity assessment.
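As a small illustration of the dynamic-analysis idea, the sketch below uses Python’s standard `tracemalloc` module to compare the peak memory of two ways of doing the same computation – the kind of measurement a profiler surfaces when hunting for inefficient code paths:

```python
import tracemalloc

def peak_memory(fn):
    """Run fn and return its peak traced memory allocation in bytes."""
    tracemalloc.start()
    fn()
    _, peak = tracemalloc.get_traced_memory()
    tracemalloc.stop()
    return peak

# Eager version: materializes all 100,000 squares in memory at once.
eager_peak = peak_memory(lambda: sum([i * i for i in range(100_000)]))

# Streaming version: a generator processes one value at a time.
lazy_peak = peak_memory(lambda: sum(i * i for i in range(100_000)))
# Same result, far lower peak memory -- the kind of finding that,
# repeated across a codebase running at scale, adds up.
```

The same principle applies at the network and hardware layers: measure first, then fix the hotspots that the measurements reveal.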
As environmental considerations become more important for customers, employees, and investors, organizations will increasingly be asked to explain and measure their approach. This trend is best exemplified by the SEC’s recent communications regarding climate change, including a recent sample disclosure letter that clarifies regulatory expectations. While this currently applies only to SEC-registered companies, it is a strong indication of coming capital market requirements. Those who act on these concerns early – especially where the effort aligns with other key opportunities like data privacy and data science – will find themselves ahead of the pack.
Continue reading this series as we move on to cover the S in ESG, Social – or skip ahead to G for Governance.