Training AI in a Way that Will Not Poison the Planet
Tue, April 20, 2021


The goal of natural language processing is to make sense of, decipher, and read human languages so that machines can understand them. / Photo by: arrow via 123rf


Natural language processing, or NLP, is a branch of artificial intelligence that deals with the interaction between humans and computers using natural language. Its ultimate goal is to make sense of, decipher, and read human languages so that machines can understand them. But training these language models requires enormous amounts of computation and electricity. That demand can do real damage to the environment, though this can change if academia and the industry embrace greener practices.

Why Training AI Is Environmentally Costly

In a study titled "Energy and Policy Considerations for Deep Learning in NLP," published on the electronic preprint repository arXiv, Emma Strubell and her colleagues from the University of Massachusetts Amherst's College of Information and Computer Sciences showed that training neural networks is costly, both financially and environmentally.

One of the highlights of their study is the BERT model, which stands for Bidirectional Encoder Representations from Transformers (models that process each word in relation to all the other words in a sentence). BERT has become a go-to language model for NLP tasks. Google, for instance, uses a BERT-based model in its search algorithm to better understand natural-language queries and return more relevant results to users of the search engine.
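In practice, most teams reuse a pretrained BERT rather than repeating the expensive pretraining themselves. As a rough illustration only (the article does not describe Google's tooling; the open-source Hugging Face transformers library is an assumption here), loading a pretrained checkpoint takes a few lines:

```python
# A minimal sketch of loading a pretrained BERT with the Hugging Face
# "transformers" library; shown only to illustrate that reuse is cheap
# compared with the original pretraining.
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

# Each token's output vector is computed in relation to every other
# token in the sentence (bidirectional self-attention).
inputs = tokenizer("How do I renew a passport?", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, tokens, hidden_size)
```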

The Cost of Training

BERT was trained on 16 Tensor Processing Unit (TPU) chips for 96 hours, or four days, and has also been trained on graphics processing units (GPUs). In terms of carbon emissions, that would be "roughly equivalent to a trans-American flight," the team wrote. The environmental cost was estimated from the energy required to power the hardware for weeks or even months at a time.
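That estimation method comes down to simple arithmetic: multiply the hardware's average power draw by the training time, apply a data-center overhead factor, and convert the resulting energy into emissions using a grid average. A minimal sketch of that calculation, using the overhead (a PUE of 1.58) and US grid constants reported in the paper, with placeholder power-draw figures:

```python
# Rough sketch of the kind of estimate Strubell et al. describe.
# The constants follow the paper (PUE of 1.58; EPA average of
# 0.954 lbs CO2 per kWh for the US grid); the power-draw numbers
# below are placeholders, not figures from the study.

PUE = 1.58               # power usage effectiveness (datacenter overhead)
LBS_CO2_PER_KWH = 0.954  # average for the US power grid

def co2e_lbs(hours, cpu_watts, dram_watts, n_gpus, gpu_watts):
    """Estimated CO2 emissions (lbs) for one training run."""
    total_watts = cpu_watts + dram_watts + n_gpus * gpu_watts
    kwh = PUE * hours * total_watts / 1000.0
    return LBS_CO2_PER_KWH * kwh

# Placeholder example: 96 hours on hardware drawing ~1.5 kW in total.
print(round(co2e_lbs(96, 100, 50, 8, 170)))  # ~218 lbs of CO2
```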

They also found that training one artificial intelligence model can emit as much carbon dioxide as five American cars over their entire lifetimes.

Coauthor Ananya Ganesh explained via New York-based media company The Outline that the team hopes to start a conversation about how the community weighs efficiency against the "cost of everything else."

Researchers found that training one AI model can emit as much carbon dioxide as five American cars over their lifetimes. / Photo by: Konrad Bak via 123rf

Artificial Intelligence and Energy Efficiency

Sasha Luccioni, a postdoctoral researcher at the Quebec-based Mila AI Institute who works on a project that uses AI to visualize the consequences of climate change, believes the University of Massachusetts Amherst team's research can serve as a gateway to a crucial conversation about artificial intelligence and energy efficiency. She added that much NLP training is now done on cloud services from companies like Microsoft, Google, and Amazon, and that few practitioners still train their AI models from scratch.

If machine learning was once confined to the laboratory, revolving around reaching a benchmark or solving a dataset, training these models now affects the outside world too. Behind the laboratory walls are topics that many may not have deeply considered. Before, the debate was only about fairness, ethics, and bias; now there is an energy debate as well. The two are related, because both concern the impact of the process used to create a flawed product.

Daniel Larremore, an assistant professor in the University of Colorado Boulder's Department of Computer Science, for instance, asked: "By doing so many computations, are we putting a lot of carbon into the atmosphere that wouldn't otherwise be there?"

Luccioni is also one of the authors of a study titled "Quantifying the Carbon Emissions of Machine Learning." The authors write that, from an environmental standpoint, three factors are crucial when training a neural network (a sketch of how they feed into an emissions estimate follows this list):

- the energy grid used to power the training
- the location of the server
- the make and model of the hardware on which training occurs
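These are the same inputs that open-source emission trackers try to measure automatically. As an illustration only (the paper describes an online calculator, not this package), here is a minimal sketch using the codecarbon library, which estimates emissions from the local grid's carbon intensity and the hardware running a job:

```python
# Minimal sketch using the open-source "codecarbon" package, one tool
# in the same spirit as the paper's calculator (not the authors' own
# code). It infers the local grid's carbon intensity from the
# machine's location and meters the hardware while a job runs.
from codecarbon import EmissionsTracker

tracker = EmissionsTracker()
tracker.start()

# Placeholder workload standing in for an actual training loop.
_ = sum(i * i for i in range(10_000_000))

emissions_kg = tracker.stop()  # estimated kg of CO2-equivalent
print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```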

E-Waste Is Also Piling Up Worldwide

As AI evolves rapidly, it is now seen both as a tool that can help protect the environment and as a risk of damaging it. Even conservationists have warned the public that producing electronic products is resource-intensive, starting with the raw materials. Graphite, cobalt, and nickel have to be extracted to create the lithium-ion batteries used in smartphones and electric cars, and that extraction has already damaged the environment in Canada, India, and China, where much of these materials comes from. Plastics are also produced worldwide to package these products.

McMaster University in Canada found that 85 percent of a smartphone's emissions impact is attributable to production. Next year, the energy consumption of smartphones will exceed that of laptops and PCs, the study reads. This is because every phone call, text message, and video download or upload is handled by a data center. Data centers in the US alone use over 90 billion kilowatt-hours of electricity every year, equivalent to the output of roughly 34 giant 500-megawatt coal-fired power plants. Global data centers consume about 416 terawatt-hours, roughly 2 percent of the electricity worldwide, and that share could rise to 8 percent by 2030.
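The coal-plant comparison can be checked with back-of-the-envelope arithmetic. The sketch below uses an assumed 60 percent capacity factor, typical for coal plants but not a figure from the cited study:

```python
# Back-of-the-envelope check of the coal-plant comparison (own
# arithmetic; the 60% capacity factor is an assumption).
US_DATACENTER_KWH = 90e9   # ~90 billion kWh per year
PLANT_MW = 500             # one "giant" coal-fired plant
CAPACITY_FACTOR = 0.6      # assumed fraction of time at full output
HOURS_PER_YEAR = 8760

kwh_per_plant = PLANT_MW * 1000 * HOURS_PER_YEAR * CAPACITY_FACTOR
print(round(US_DATACENTER_KWH / kwh_per_plant))  # ~34 plants
```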