• Media type: E-article
  • Title: Timeshifting strategies for carbon-efficient long-running large language model training
  • Contributors: Jagannadharao, Akshaya; Beckage, Nicole; Nafus, Dawn; Chamberlin, Scott
  • Published: Springer Science and Business Media LLC, 2023
  • Published in: Innovations in Systems and Software Engineering
  • Language: English
  • DOI: 10.1007/s11334-023-00546-x
  • ISSN: 1614-5054; 1614-5046
  • Keywords: Software
  • Description (Abstract): Language models play a vital role in various natural language processing tasks, but their training can be computationally intensive and lead to significant carbon emissions. In this study, we explore the effectiveness of timeshifting strategies to mitigate the environmental impact of long-running large language models (LLMs). We develop a simulation tool that estimates carbon emissions for LLMs, enabling developers to make informed decisions prior to running their workloads. By leveraging historical carbon intensity data from WattTime, we investigate the potential benefits and limitations of timeshifting in different locations, considering diverse energy profiles. Our findings demonstrate that timeshifting can substantially reduce emissions, but its effectiveness is highly dependent on the region's carbon intensity and energy mix. We present insights into the trade-offs between emissions reduction and workload runtime, acknowledging the need for further advancements in carbon-aware computing practices. Our research contributes to the growing field of sustainable computing and encourages developers to adopt environmentally conscious strategies in language model training.
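For a concrete picture of the timeshifting idea the abstract describes, the sketch below is a minimal, hypothetical illustration rather than the authors' simulation tool: it scans an hourly carbon-intensity series for the start time that minimizes estimated emissions of a fixed-duration run. The `best_start_hour` function, the power draw, and the intensity figures are all assumed for illustration; a real workflow would use WattTime's historical or forecast intensity data instead of hard-coded values.

```python
# Hypothetical sketch of timeshifting: choose the start hour that minimizes
# estimated emissions for a fixed-duration training run, given an hourly
# carbon-intensity series. The numbers below are illustrative, not WattTime data.

def best_start_hour(intensity_g_per_kwh, duration_h, power_kw):
    """Scan all feasible start hours and return (start_hour, emissions_kg),
    where emissions = power draw * summed carbon intensity over the run window."""
    best = None
    for start in range(len(intensity_g_per_kwh) - duration_h + 1):
        window = intensity_g_per_kwh[start:start + duration_h]
        # Each hour consumes power_kw kWh; grams CO2 -> kilograms via / 1000
        emissions_kg = power_kw * sum(window) / 1000.0
        if best is None or emissions_kg < best[1]:
            best = (start, emissions_kg)
    return best

if __name__ == "__main__":
    # 48 hours of made-up intensity (gCO2/kWh): cleaner overnight, dirtier midday
    forecast = [400 + 150 * ((h % 24) in range(9, 18)) for h in range(48)]
    start, kg = best_start_hour(forecast, duration_h=8, power_kw=300)
    print(f"Start at hour {start}, estimated {kg:.1f} kg CO2")
```

Constraining the scan to a deadline window (rather than the full series) is one way to model the emissions-versus-runtime trade-off the abstract mentions: a tighter deadline leaves fewer candidate start times and typically smaller savings.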