• Media type: Electronic conference paper; e-article; other publication
  • Title: Evaluation of Transformer Architectures for Electrical Load Time-Series Forecasting
  • Contributors: Hertel, Matthias [author]; Ott, Simon [author]; Schäfer, Benjamin [author]; Mikut, Ralf [author]; Hagenmeyer, Veit [author]; Neumann, Oliver [author]
  • Published: KIT Scientific Publishing, 2022-12-23
  • Language: English
  • DOI: https://doi.org/10.5445/IR/1000154155
  • ISBN: 978-3-7315-1239-4
  • Keywords: DATA processing & computer science
  • Notes: This data source also contains holdings records that do not lead to a full text.
  • Description: Accurate forecasts of the electrical load are needed to stabilize the electrical grid and maximize the use of renewable energies. Many good forecasting methods exist, including neural networks, and we compare them to the recently developed Transformers, which are the state-of-the-art machine learning technique for many sequence-related tasks. We apply different types of Transformers, namely the Time-Series Transformer, the Convolutional Self-Attention Transformer and the Informer, to electrical load data from Baden-Württemberg. Our results show that the Transformers give up to 11% better forecasts than multi-layer perceptrons for long prediction horizons. Furthermore, we analyze the Transformers’ attention scores to gain insights into the models.
  • Access status: Open access
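The attention-score analysis mentioned in the description rests on the scaled dot-product self-attention mechanism common to all Transformer variants. The following is a minimal illustrative sketch, not the authors' implementation: it uses random (untrained) projection weights and a made-up sequence length purely to show how an attention weight matrix over a load time series is computed and could be inspected.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 24 hourly load values, each embedded into 8 dimensions.
seq_len, d_model = 24, 8
x = rng.normal(size=(seq_len, d_model))  # embedded load sequence (assumed)

# Random projection matrices stand in for the learned Q/K/V projections.
W_q = rng.normal(size=(d_model, d_model))
W_k = rng.normal(size=(d_model, d_model))
W_v = rng.normal(size=(d_model, d_model))

Q, K, V = x @ W_q, x @ W_k, x @ W_v

# Scaled dot-product attention scores between every pair of time steps.
scores = Q @ K.T / np.sqrt(d_model)

# Row-wise softmax turns scores into attention weights that sum to 1.
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)

# Each output step is a weighted mixture of all input steps.
out = weights @ V

# weights[i, j] quantifies how much time step i attends to time step j;
# matrices like this are what an attention-score analysis inspects.
```

In a trained model the projections are learned, attention is multi-headed, and variants such as the Informer sparsify the score matrix; the mechanics of the weight matrix are the same.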