Navigating the Environmental Impact of LLMs

Explore the environmental impact of large language models, addressing their energy-intensive training, the challenges of measuring their carbon footprint, and green strategies for reducing it.

Deep learning’s environmental toll

Practical large-scale pre-training requires vast amounts of computation, which is energy-intensive. The demand for deep learning has grown rapidly, and the computational resources needed have grown with it. This carries significant environmental costs in the form of unsustainable energy use and carbon emissions. In a 2019 study (Strubell, Emma, Ananya Ganesh, and Andrew McCallum. "Energy and Policy Considerations for Deep Learning in NLP." In Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, 2019), researchers at the University of Massachusetts estimated that training a large deep-learning model produces around 626,000 pounds of planet-warming carbon dioxide, roughly five times the lifetime emissions of an average car. As models grow bigger, their computing needs are outpacing improvements in hardware efficiency. Chips specialized for neural-network processing, such as GPUs (graphics processing units) and TPUs (tensor processing units), have somewhat offset the demand for more computing power, but not by enough.
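
To make the link between computation and emissions concrete, estimates like the one above typically multiply hardware power draw by training time and datacenter overhead to get energy, then multiply by the local grid's carbon intensity to get CO2-equivalent emissions. The sketch below walks through that back-of-envelope calculation; every number in it (GPU count, power draw, training duration, PUE, grid intensity) is an illustrative assumption, not a measured value from the study.

```python
# Back-of-envelope estimate of training emissions:
#   energy (kWh)   = per-GPU power draw * number of GPUs * hours * PUE
#   emissions (kg) = energy * grid carbon intensity
# All constants below are illustrative assumptions for a hypothetical run.

GPU_POWER_KW = 0.3        # assumed average draw per GPU (300 W)
NUM_GPUS = 512            # assumed cluster size
TRAINING_HOURS = 24 * 14  # assumed two-week training run
PUE = 1.5                 # assumed datacenter power usage effectiveness
CARBON_INTENSITY = 0.4    # assumed grid intensity, kg CO2e per kWh

energy_kwh = GPU_POWER_KW * NUM_GPUS * TRAINING_HOURS * PUE
emissions_kg = energy_kwh * CARBON_INTENSITY

print(f"Energy consumed: {energy_kwh:,.0f} kWh")
print(f"Estimated emissions: {emissions_kg:,.0f} kg CO2e "
      f"(~{emissions_kg * 2.20462:,.0f} lb)")
```

Even with these modest assumed values, the run lands in the tens of thousands of kilograms of CO2e, which is why the carbon intensity of the electricity grid and datacenter efficiency matter as much as raw compute when assessing a model's footprint.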
