Training a modest machine-learning model uses more carbon than the manufacturing and lifetime use of five automobiles

Originally published at: https://boingboing.net/2019/06/07/extinction-by-nlp.html

3 Likes

Conclusions:

- Authors should report training time and sensitivity to hyperparameters.
- Academic researchers need equitable access to computation resources.
- Researchers should prioritize computationally efficient hardware and algorithms.

The point of studies like these is to show that better care should be taken with the consumption of resources, even computational ones, and that large sources of waste can be hidden in places you don’t often think about.

And now I await shitty Facebook memes from my uncle about how AOC will have to ban computers.

17 Likes

Brain 1 - Computer 0.

10 Likes

Roger That!

4 Likes

I would think carbon pricing or similar would enable the market to make any necessary corrections. If the ML model in question has more social utility than five cars, train away. If it doesn’t, it’ll be priced out of business.

Which sounds both glib and naïve, and I’m sure it’s both, but I do think markets are good for some things. The devil is in the details, of course, and the important detail here is getting the carbon pricing right, and not having it totally screwed up by corporate capture of government. So maybe this is a case of “if we had ham, we could have ham and eggs, if we had eggs”.

5 Likes

How does that compare to the carbon cost of making and training a human being to, say, the age of 12?

15 Likes

Even worse: for many of the applications they’re being used for in industry, a linear regression would work just as well.

7 Likes

Ah, there’s the rub. Even the most aggressive carbon pricing policies in place right now are an order of magnitude too small.

11 Likes

I have a deep learning algorithm that will help us price carbon correctly…

19 Likes

… computer science researchers investigate the carbon budget of training machine learning models for natural language processing, and come back with the eye-popping headline figure of 78,468 lbs to do a basic training-and-refinement operation.

They found that the process of building and testing a final paper-worthy model required training 4,789 models over a six-month period. Converted to CO2 equivalent, it emitted more than 78,000 pounds and is likely representative of typical work in the field.

One of these is not like the other.
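For what it’s worth, simple arithmetic on the two figures quoted above shows how far apart those statements are: the 78,468 lbs covers the whole six-month R&D sweep, not a single training run. A quick sketch, using only the numbers from the quotes:

```python
# Figures quoted above: total CO2-equivalent emissions across every
# training run needed to produce one final, paper-worthy model.
total_lbs = 78_468   # full six-month R&D process
num_runs = 4_789     # individual models trained along the way

avg_per_run = total_lbs / num_runs
print(f"~{avg_per_run:.1f} lbs CO2 per training run")  # ~16.4
```

So “a basic training-and-refinement operation” is doing a lot of work in that first sentence.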

8 Likes

Now I want a study on how much carbon a human generates by the time they learn to be a productive member of society. I know that by the time I was college age, my parents had gone through about four cars driving me around.

2 Likes

Yes, exactly, a “modest” machine learning model is one you can build on a single commodity desktop, usually in a matter of minutes or hours, not six months.

It’s definitely not a cutting-edge, 5,000-trainings, research-paper-worthy model (which, by the way, has nigh-infinite reuse) like BERT or GPT-2.

This is FUD, pure and simple.

9 Likes

Potentially infinite reuse, maybe. But when was the last time you saw a model described in a NeurIPS paper actually re-used for anything at all?

3 Likes

How much carbon was required to produce this study, I wonder?

2 Likes

Well, that makes sense, because a Transformer is both an AI and an automobile.


14 Likes

Cory’s story says that 78,000 pounds of CO2 is the equivalent of the lifetime of five cars, but the story he links to says that 626,000 pounds is the equivalent of five cars.

Even the bigger number seems low, though. A quick Google search says a car emits about 12,000 pounds a year, which means five cars would hit that number in just over 10 years of emissions, not including manufacturing.
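Checking that with the commenter’s own rough numbers (the 12,000 lbs/year figure is a ballpark estimate, not from the study):

```python
# Back-of-the-envelope check: how many years of driving would it take
# five cars to reach the linked story's 626,000-lb figure?
study_total_lbs = 626_000      # "five cars, lifetime incl. manufacturing"
lbs_per_car_per_year = 12_000  # the rough per-car estimate above
num_cars = 5

years = study_total_lbs / (num_cars * lbs_per_car_per_year)
print(f"~{years:.1f} years of tailpipe emissions alone")  # ~10.4
```

Cars typically stay on the road well past 10 years, which is why even the bigger number looks conservative.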

4 Likes

Seriously… At first I thought, wait, I’m pretty sure we got 95% of the way there with transformer design decades ago or longer. What kind of esoteric materials and shapes are they computing together? Then I went to, “More than meets the eye, huh?”

Explaining things to me in terms of pounds of carbon is pointless. You may as well be talking about a conversion into gold-pressed latinum. How about kilowatt-hours of energy? They are dealing with computers; I have a computer, and it uses electricity too. Then we can go down the rabbit hole of pounds of carbon per kilowatt-hour for each energy source.
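That rabbit hole is at least a shallow one. Here’s a minimal sketch of the conversion, with per-source emission factors that are rough illustrative values I’m assuming, not figures from the study:

```python
# Rough emission factors in lbs CO2 per kWh generated. These are
# illustrative ballpark values, not figures from the study.
EMISSION_FACTORS = {
    "coal": 2.2,
    "natural_gas": 0.9,
    "nuclear": 0.0,      # ignoring construction and fuel-cycle emissions
    "geothermal": 0.0,   # likewise
}

def lbs_co2(kwh: float, source: str) -> float:
    """Pounds of CO2 emitted to supply `kwh` from a given source."""
    return kwh * EMISSION_FACTORS[source]

# The same 1,000 kWh training job on two different supplies:
print(lbs_co2(1_000, "coal"))         # 2200.0
print(lbs_co2(1_000, "natural_gas"))  # 900.0
```

Same computation, same electricity bill, very different carbon bill.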

1 Like

Even after all the nuclear accidents and failures, I realize that nuclear power needs to come back. We need to implement safer designs and socialize its production and waste disposal. This will be a stopgap measure until we get fusion online. Energy consumption is not likely to come down. Until then, we need to turn the pollution from energy consumption from an externality into an internality with a prohibitively costly carbon tax. We have a meter. We need the tax.

2 Likes

The carbon per kWh varies massively depending on where your servers are, too. Iceland has more geothermal energy than it can use and no really good way of exporting it, so it’s becoming something of a hotspot for server farms. (There was something of an explosion during the Bitcoin madness, but I think it’s becoming more sane now.)

Elsewhere, it depends on the local mix of fossils and nukes. Annoyingly, some European countries have gone nuclearphobic but import electricity from neighbours who have surplus energy because they have nuke plants.
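To make that concrete, here’s a sketch of how the same job’s footprint changes with the grid mix. The mixes and factors below are made-up illustrations (continuing the assumed per-source values from the earlier sketch), not real national data:

```python
# Same ballpark per-source factors as before (lbs CO2 per kWh).
FACTORS = {"coal": 2.2, "gas": 0.9, "nuclear": 0.0,
           "geothermal": 0.0, "hydro": 0.0}

# Made-up generation mixes (fractions of supply by source).
GRIDS = {
    "iceland_like": {"geothermal": 0.7, "hydro": 0.3},
    "coal_heavy":   {"coal": 0.7, "gas": 0.3},
}

def grid_intensity(mix: dict) -> float:
    """Mix-weighted lbs CO2 per kWh for a given grid."""
    return sum(frac * FACTORS[src] for src, frac in mix.items())

job_kwh = 10_000  # a hypothetical training job's energy draw
for name, mix in GRIDS.items():
    print(f"{name}: {job_kwh * grid_intensity(mix):,.0f} lbs CO2")
```

On those made-up mixes the identical job goes from effectively zero to about 18,000 lbs of CO2, which is the whole argument for siting server farms somewhere like Iceland.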

3 Likes