"Training a modest machine-learning model emits more carbon than the manufacturing and lifetime use of five automobiles"

Oh, man. Unbelievable cluelessness coming from the MIT Tech Review, of all places.

It takes all that carbon to research and develop a breakthrough like the LSTM.

Once the breakthrough is made, everyone can use it in a thousand different ways.
Merely training and fine-tuning one of those models for your own task uses less power than your air conditioner does in 10 minutes.
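
If you want to sanity-check that comparison, here is a minimal back-of-envelope sketch in Python. The power figures are round numbers I am assuming (a central air conditioner drawing on the order of 3 kW, a single GPU around 300 W, a fine-tuning run of about an hour and a half), not measurements from the article:

```python
# Back-of-envelope energy comparison: 10 minutes of air conditioning
# vs. a single-GPU fine-tuning run. All numbers are rough assumptions.

AC_POWER_KW = 3.0    # assumed draw of a central air conditioner, in kW
AC_MINUTES = 10      # the "10 minutes" from the post

GPU_POWER_KW = 0.3   # assumed draw of one consumer GPU under load, in kW
TUNE_HOURS = 1.5     # assumed length of a small fine-tuning run, in hours

ac_kwh = AC_POWER_KW * (AC_MINUTES / 60)   # energy used by the AC
gpu_kwh = GPU_POWER_KW * TUNE_HOURS        # energy used by the GPU run

print(f"Air conditioner, {AC_MINUTES} min: {ac_kwh:.2f} kWh")
print(f"GPU fine-tuning, {TUNE_HOURS} h:   {gpu_kwh:.2f} kWh")
```

On those assumed numbers the two land in the same ballpark (about half a kilowatt-hour each), which is the household-appliance scale the post is arguing about.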

And for those unfortunates who can't afford a GPU, places like Kaggle offer hours of GPU time every week for zero dollars and nothing cents.

Compared to other industries, AI research and development has one of the smallest carbon footprints. Any other field, say biotech, emits more carbon just during the morning commute of its clerical staff.
