Training a modest machine-learning model emits more carbon than the manufacturing and lifetime use of five automobiles

Allow me to assure you that I am completely kidding when I say they could fix this problem if they found a funding source for their research other than mining bitcoin.

5 Likes

In my state, hydroelectric power accounts for over two-thirds of electricity production.

Another 15+% is other renewables and nuclear.

So I feel pretty okay letting my servers run 24/7. At least from an emissions point of view.

Hydroelectric has some pretty bad ecological impacts though, and we’ve been spending a lot of time and money trying to make dams and salmon compatible. Without a lot of success.

3 Likes

Very true. However, the ecological impact of hydroelectric on a specific waterway and its basin, while bad, is incomparable to the systemic global ecological impact of fossil fuels. There is no way fossil fuels should ever be preferred over other power sources.

4 Likes

There should be no way burning petroleum could take priority over its nearly miraculous uses as chemical feedstock.

5 Likes

This is a (the) critical point. All these estimates have to be based on some CO2/kWh figure, which can vary from almost zero for wind and solar to quite a lot for a coal-based power source. I don’t know what value they are using, probably an average. My point is that compute can be clean, so the conclusions only apply if no effort is made to clean up data center energy sources.
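To make that concrete, here’s a minimal back-of-envelope sketch. The intensity figures and the 10,000 kWh training run are rough illustrative assumptions, not values from the study:

```python
# Rough emissions estimate: energy used x carbon intensity of the grid.
# Intensity values (kg CO2 per kWh) are ballpark assumptions for illustration.
GRID_INTENSITY = {
    "coal-heavy":       0.9,
    "grid average":     0.4,
    "hydro/wind/solar": 0.02,
}

def training_emissions_kg(energy_kwh: float, grid: str) -> float:
    """kg of CO2 emitted to supply `energy_kwh` on the given grid."""
    return energy_kwh * GRID_INTENSITY[grid]

# The same hypothetical 10,000 kWh training run looks very different
# depending on where it runs.
for grid in GRID_INTENSITY:
    print(f"{grid:>16}: {training_emissions_kg(10_000, grid):>8,.0f} kg CO2")
```

The study’s headline number scales almost linearly with whichever intensity figure you pick.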
Another point I will make is that much of commercial AI is moving toward transfer learning and AutoML NAS (Neural Architecture Search) approaches, which are much more efficient overall: data scientists don’t have to start from scratch, and users don’t have to be data scientists or personally try various models to find the best one.
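For what “not starting from scratch” looks like in practice, here’s a minimal fine-tuning sketch, assuming PyTorch/torchvision; the choice of ResNet-18 and the 10-class task are arbitrary examples, not anything from the study:

```python
import torch
import torch.nn as nn
from torchvision import models

# Reuse weights someone already paid the training carbon for.
# (`pretrained=True` is the older torchvision API; newer versions use `weights=...`.)
model = models.resnet18(pretrained=True)

# Freeze every pretrained parameter: no gradients, far less compute.
for param in model.parameters():
    param.requires_grad = False

# Swap in a new classification head for a hypothetical 10-class task;
# only these few thousand parameters actually train.
model.fc = nn.Linear(model.fc.in_features, 10)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```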
So overall, the only point I feel is valid from this summary of a study I haven’t read is that inefficient modelling on a dirty power grid is indeed bad. IMHO there’s not much critical academic thinking or journalism (more FUD hype) in highlighting (“shocking!”) roughly estimated and poorly qualified huge CO2 emissions.

4 Likes

From the paper itself (figures in lbs of CO2e):

Car, avg incl. fuel, 1 lifetime: 126,000
Training one model (GPU), w/ tuning & experimentation: 78,468

This is a bit over half a car (78,468 / 126,000 ≈ 0.62), not five.

4 Likes

How many times can that trained model be replicated once it works? Each car creates a fifth as much carbon, but there are millions of mechanical duplicates running around loose, each with its own footprint, while the model’s training cost is paid only once. Replicating software is pretty easy; they teach it in grade school. The next step is to put the software in cars and tell it to cut emissions.
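A quick sketch of that amortization, using the figures quoted from the paper upthread; the copy counts are hypothetical:

```python
# Amortize one training run's emissions over deployed copies of the model,
# using the lbs CO2e figures quoted from the paper; copy counts are made up.
TRAINING_LBS = 78_468        # one model (GPU), w/ tuning & experimentation
CAR_LIFETIME_LBS = 126_000   # avg car, incl. fuel, one lifetime

for copies in (1, 1_000, 1_000_000):
    per_copy = TRAINING_LBS / copies
    print(f"{copies:>9,} copies -> {per_copy:>12,.4f} lbs CO2e each "
          f"({per_copy / CAR_LIFETIME_LBS:.2e} cars)")
```

Each additional car adds a whole car’s footprint; each additional copy of the model adds essentially nothing.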

1 Like

Google Cloud runs on 100% renewable energy, I am told. If so, then there is no carbon cost to incremental use of the installed plant.

1 Like

Oh, man. Unbelievable cluelessness coming from the MIT Tech Review, of all places.

It takes all that carbon to research and develop a breakthrough advance like LSTM.

Once the breakthrough is made, everyone can use it in a thousand different ways.
Merely training and tuning that model uses less energy than your air conditioner does in 10 minutes.

And for those unfortunates who can’t afford a GPU, places like Kaggle offer them endless GPU time for zero dollars and nothing cents.

Compared to other industries, AI research and development has one of the smallest carbon footprints. Any other field, say biotech, uses more carbon just during the morning commute of its clerical staff.

5 Likes

That reminds me of this:

(Also, holy crap, I just noticed the image on that article is of the comic artist Kaz! Haha!)

Right. Deep learning is used to avoid having to understand the model properly. It’s really interesting to see how deep learning models often end up implementing, as one of their layers, some simple transform that a bit of thought could have suggested.

This reminds me of some research years ago that was estimating some effect. The researchers had an uncertain scaling factor in the estimation equation, which they called alpha. The alpha parameter was assumed to be “of order unity”, i.e. approximately equal to 1, likely some value between 0 and 1.

Well, hundreds of research papers were written based on this estimated model, with all sorts of consequences. But after many years, someone completed a full study of the fundamental equations and found that alpha was indeed between 0 and 1; however, it was not nearly equal to 1, but more like 10^-9 (0.000000001). That “correction” to the original estimate essentially invalidated the hundreds of follow-on research papers, because their estimated impact of the studied effect was billions(!) of times too large.

I have a sense that these authors may have inadvertently introduced a similarly mistaken estimator in their calculations, thus vastly over-estimating the real carbon footprint of a few billion calculations that my laptop can complete in a few hours, hardly consuming more than a few kWh of energy.

2 Likes

Kinda like Volkswagen did, only in reverse…

Yeah… but there are over a billion cars on the streets today. Doesn’t quite sound so terrifying then, does it? But I suppose if it didn’t sound terrifying then it wouldn’t attract clicks, hence the omission of context.

This all assumes what, burning coal to get that electricity to train the models?!? What an absolutely stupid assumption. Throw a bunch of solar panels on the roof and use those to train. Zero carbon. Done. Stupid article. Wind, solar, geothermal, whatever.

1 Like

I’ll make sure to indicate in the Google News app “don’t show any more Boing Boing articles”. If this is journalism, then Trump is a real president.

1 Like


1 Like

This topic was automatically closed after 5 days. New replies are no longer allowed.