Media type:
E-article
Title:
Developing Feedforward Neural Networks as Benchmark for Load Forecasting: Methodology Presentation and Application to Hospital Heat Load Forecasting
Contributors:
Stienecker, Malte;
Hagemeier, Anne
Published:
MDPI AG, 2023
Published in:
Energies, 16 (2023) 4, page 2026
Language:
English
DOI:
10.3390/en16042026
ISSN:
1996-1073
Description:
For load forecasting, numerous machine learning (ML) approaches have been published. Besides fully connected feedforward neural networks (FFNNs), also called multilayer perceptrons, more advanced ML approaches such as deep, recurrent or convolutional neural networks and ensemble methods have been applied. However, evaluating the added benefit of novel approaches is difficult: statistical or rule-based methods set too low a benchmark, while FFNNs require extensive tuning because of their many design choices. To address this issue, a structured, comprehensible five-step FFNN model creation methodology is presented, consisting of initial model creation, internal parameter selection, feature engineering, architecture tuning and final model creation. The methodology is then applied to forecast real-world heat load data of a hospital in Germany. The forecast comprises 192 values (the upcoming 48 h in 15-min resolution) and follows a multi-model univariate forecasting strategy, with three test models developed first. The resulting test models turn out to be very similar, which simplifies the creation of the remaining models. A performance increase of up to 18% between the initial and final models underlines the importance of model tuning. In conclusion, comprehensible model tuning is vital if FFNN models are to serve as a benchmark, and the effort required can be reduced by the experience gained through repeated application of the presented methodology.
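As a rough illustration of what such an FFNN benchmark looks like in practice, the sketch below trains one fully connected network to map a lagged load window onto the 192-value forecast horizon described in the abstract. It is a minimal sketch only: it uses scikit-learn's MLPRegressor on synthetic data, the window length, architecture and hyperparameters are illustrative assumptions rather than the authors' tuned choices, and the paper's multi-model strategy is collapsed into a single multi-output model for brevity.

```python
# Minimal sketch of a multi-output FFNN benchmark for 15-min heat-load
# forecasting. All design choices (lag window, hidden layer size,
# synthetic data) are assumptions for illustration, not the paper's setup.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

HORIZON = 192   # 48 h ahead at 15-min resolution, as in the abstract
N_LAGS = 672    # assumed input window: previous 7 days of 15-min values


def make_windows(load, n_lags=N_LAGS, horizon=HORIZON):
    """Slice a load series into (lag-window, forecast-window) pairs."""
    X, y = [], []
    for t in range(n_lags, len(load) - horizon + 1):
        X.append(load[t - n_lags:t])
        y.append(load[t:t + horizon])
    return np.asarray(X), np.asarray(y)


# Synthetic stand-in for the hospital heat-load series (60 days, 15-min).
rng = np.random.default_rng(0)
t = np.arange(96 * 60)
load = 100 + 20 * np.sin(2 * np.pi * t / 96) + rng.normal(0, 2, t.size)

X, y = make_windows(load)
split = int(0.8 * len(X))

# One hidden layer; MLPRegressor handles the 192-dimensional output directly.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(128,), max_iter=500, random_state=0),
)
model.fit(X[:split], y[:split])
print("test MAE:", np.abs(model.predict(X[split:]) - y[split:]).mean())
```

In the paper's five-step methodology, a network like this would only be the initial model; internal parameter selection, feature engineering and architecture tuning would then refine it before the final benchmark model is fixed.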