New Experimental Tracker for Carbon Footprint of AI Projects: Study
Wed, April 14, 2021

Usually, the discussion on carbon footprint centers on factories and vehicles. However, a recent study highlights the growing problem of carbon emissions in artificial intelligence (AI), which prompted researchers to develop an easy-to-use tool for tracking the footprint of AI workloads.

The novel tool for measuring the carbon footprint of AI-based systems was developed by researchers at Stanford University, a private research university in the US. The tool can estimate how much electricity an AI project uses and how much carbon that consumption emits. It could greatly help developers conserve power and tweak their projects to be friendlier to the environment. The researchers posted their findings on arXiv, the preprint repository hosted by Cornell University.

The Carbon Footprint of Artificial Intelligence

According to Britannica, an English-language, general-knowledge encyclopedia, carbon footprint refers to the amount of carbon dioxide emissions linked to the activities of a person or an entity, such as a business, country, or region. The term includes both direct and indirect carbon emissions from an activity. Direct emissions come from activities such as burning fossil fuels, while indirect emissions stem from electricity consumption. This means that the majority of human activities have a carbon footprint.

Despite the large carbon footprint of the commercial and industrial sectors, the residential and transportation sectors are notable contributors as well. In developed countries, the latter two can make up the biggest share of the footprint. In the US, for example, residential and transportation sources accounted for about 40% of total emissions in the first decade of the 21st century.

The Digital Age – the shift from traditional industry to information technology – paved the way for advanced technologies such as AI, big data, and robotics. But little is known about the amount of electricity these technologies consume. Previous studies have identified AI, particularly projects focused on machine learning, as one of the big power consumers.

A study published in 2019 revealed how significant AI projects can be as sources of carbon dioxide emissions. The authors compared the estimated emissions of training an AI system with those of familiar human activities: 1,984 pounds of carbon dioxide equivalent for one passenger's air travel from New York to San Francisco, 11,023 pounds for an average person over a year, 36,156 pounds for an average American over a year, and 126,000 pounds for an average car over its lifetime.

On the other hand, training a natural language processing (NLP) model on a graphics processing unit (GPU) was estimated at 39 pounds of carbon dioxide equivalent, surging to 78,468 pounds once tuning and experimentation were included. For a big transformer model, training alone was estimated at 192 pounds; if the project involved neural architecture search, emissions ballooned to 626,155 pounds – the largest figure in the comparison.

A Tool for Measuring Carbon Footprint of AI Projects

At Stanford, a research team developed an easy-to-use tool to determine how much electricity AI projects consume. The same tool tells developers the equivalent of that consumption in carbon dioxide emissions, creating an opportunity to make changes that reduce emissions. It might even give them a selling point if they successfully make their projects friendlier to the environment than competing ones.

"As machine learning systems become more ubiquitous and more resource-intensive, they have the potential to significantly contribute to carbon emissions. But you can't solve a problem if you can't measure it. Our system can help researchers and industry engineers understand how carbon-efficient their work is, and perhaps prompt ideas about how to reduce their carbon footprint," said Peter Henderson, the lead author of the study and Ph.D. student in computer science at Stanford.

AI projects consume lots of power mainly because of the experiments run during training. With enormous amounts of data, training can take weeks or months to complete. Throughout that period, the hardware – especially the main processor and the GPU – consistently draws electricity to process the data. After training, the model is ready for pilot tests, and its power consumption in applications is far lower than during training. Even so, the cost of training can be a heavy burden for corporations and organizations investing in AI.
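
The scale of training-time consumption can be sketched with back-of-the-envelope arithmetic. The hardware figures below are illustrative assumptions, not measurements from the Stanford tool:

```python
# Back-of-the-envelope sketch with assumed hardware figures:
# energy = average power draw x wall-clock time.
GPU_WATTS = 300          # assumed average draw per GPU
CPU_WATTS = 100          # assumed draw of the host processor
NUM_GPUS = 8             # hypothetical training cluster
HOURS = 24 * 14          # a two-week training run

# Total energy in kilowatt-hours
energy_kwh = (NUM_GPUS * GPU_WATTS + CPU_WATTS) * HOURS / 1000
print(f"Estimated energy: {energy_kwh:,.0f} kWh")  # about 840 kWh
```

Even this rough estimate shows why a multi-week, multi-GPU run adds up: the bill scales linearly with both hardware count and wall-clock time.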

To measure the carbon emissions of AI projects precisely, the researchers started by measuring the consumption of a single AI model. The task was more complicated than imagined, because a model is typically trained across multiple sessions, and a machine running one model may be running several others at once. The individual sessions had to be separated from one another, and even then, the sessions shared overhead functions whose energy had to be properly allocated among them.
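
One way to handle that allocation – a minimal sketch, not the actual implementation from the study – is to split a machine's measured energy among concurrent sessions in proportion to each session's share of resource utilization:

```python
# Illustrative sketch: divide a machine's measured energy among
# concurrent training sessions in proportion to each session's
# share of total utilization (e.g., fraction of GPU time used).
def allocate_energy(total_kwh, utilization_by_session):
    total_util = sum(utilization_by_session.values())
    return {name: total_kwh * util / total_util
            for name, util in utilization_by_session.items()}

# Hypothetical machine: 10 kWh measured, three sessions sharing it
shares = allocate_energy(10.0, {"model_a": 0.6,
                                "model_b": 0.3,
                                "model_c": 0.1})
```

Under this scheme, shared overhead (cooling, idle draw, the operating system) is absorbed proportionally rather than attributed to any single model.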

After measuring power consumption, the researchers focused on translating that information into a carbon footprint. To do so, they searched for and examined data sources on the energy mix around the globe. Beyond gathering real-time data, they also discovered how strongly location influences the footprint of an AI training run. For example, a training session in Estonia would generate about 30 times the carbon emissions of the same session in Quebec, because Estonia relies on shale oil while Quebec relies on hydroelectricity.
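
The conversion itself is a simple multiplication: energy consumed times the carbon intensity of the local grid. The intensity figures below are illustrative assumptions chosen to reproduce the 30x gap described above, not values from the study's data sources:

```python
# Sketch with assumed carbon-intensity figures (grams CO2 per kWh);
# real values vary over time and by data source.
CARBON_INTENSITY = {
    "estonia": 900,   # shale-oil-heavy grid (assumed figure)
    "quebec": 30,     # hydroelectricity-dominated grid (assumed figure)
}

def emissions_kg(energy_kwh, region):
    # kg of CO2 = kWh * (gCO2 per kWh) / 1000
    return energy_kwh * CARBON_INTENSITY[region] / 1000

# Identical 100 kWh workload in the two regions
ratio = emissions_kg(100, "estonia") / emissions_kg(100, "quebec")
print(f"Same workload emits {ratio:.0f}x more CO2 in Estonia")
```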

Furthermore, the tool identified the biggest electricity consumers among AI models, since certain machine learning algorithms can hog more power than others. At Stanford, over 200 students in a reinforcement learning class were instructed to implement common algorithms for an assignment. Of the three algorithms implemented, two performed equally well, yet one of them consumed far more power than the other. Thus, the efficiency of an algorithm directly affects its power consumption.

During their research, the team identified the biggest factor that can increase or decrease the carbon footprint of an AI model: the source of its electricity. To reduce electricity-related emissions, AI models should be trained in regions powered by renewable energy sources. Since these sources do not burn fossil fuels, training sessions there produce little to no carbon dioxide. Shifting location can be done easily because the datasets used for training can be stored on cloud servers in different regions.
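
In practice, that choice amounts to comparing candidate regions by grid carbon intensity before launching a job. A minimal sketch, with hypothetical region names and assumed intensity figures:

```python
# Hypothetical sketch: pick the available cloud region with the
# lowest grid carbon intensity (gCO2/kWh) before training.
REGIONS = {
    "us-east": 420,   # assumed mixed fossil/renewable grid
    "estonia": 900,   # assumed shale-oil-heavy grid
    "quebec": 30,     # assumed hydro-dominated grid
}

greenest = min(REGIONS, key=REGIONS.get)
print(f"Train in: {greenest}")  # prints "Train in: quebec"
```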

The experimental tool for tracking the carbon footprint of AI projects is still in its infancy, but AI developers and researchers around the world can now access it online to try its capabilities.