Zero Initialization of modified Gated Recurrent Encoder-Decoder Network for Short Term Load Forecasting
Author: Vedanshu; Tripathi, M M
Year of publication: 2018
Subject:
Document type: Working Paper
Description: A single-layer feedforward neural network (FNN) is often used as the last layer in models such as seq2seq, or even as a simple RNN network. The role of such a layer is to transform the output to the required dimensions. For initializing its weights and biases, there is no specific technique known to speed up learning. One can rely on deep-network initialization schemes such as Xavier or He initialization, but these show little improvement in learning speed or accuracy. In this paper we propose Zero Initialization (ZI) for the weights of a single-layer network. We first test this technique on a simple RNN network and compare the results against Xavier, He and Identity initialization; as a final test we implement it on a seq2seq network. ZI was found to considerably reduce the number of epochs used and to improve accuracy. The developed model has been applied to short-term load forecasting using load data of the Australian Energy Market, and it forecasts the day-ahead load accurately with an error of 0.94%. (An illustrative sketch of the initialization follows this record.)
Database: arXiv
External link:
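
The description above proposes initializing the weights of the final single-layer FNN to zero. Below is a minimal sketch, not the authors' code, of what that could look like in PyTorch for a GRU-based forecaster; the module name, layer sizes, and the 48-step day-ahead horizon are illustrative assumptions.

```python
# Minimal sketch (assumed implementation) of Zero Initialization (ZI) of the
# final single-layer FNN that projects recurrent hidden states to the output
# dimension, compared with Xavier and He initialization of the same layer.
import torch
import torch.nn as nn

class GRUForecaster(nn.Module):
    def __init__(self, n_features, hidden_size, horizon, init="zero"):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden_size, batch_first=True)
        # Single-layer FNN used as the output (projection) layer.
        self.out = nn.Linear(hidden_size, horizon)
        if init == "zero":
            # Zero Initialization: start the output layer's weights and bias at zero.
            nn.init.zeros_(self.out.weight)
            nn.init.zeros_(self.out.bias)
        elif init == "xavier":
            nn.init.xavier_uniform_(self.out.weight)
            nn.init.zeros_(self.out.bias)
        elif init == "he":
            nn.init.kaiming_uniform_(self.out.weight, nonlinearity="relu")
            nn.init.zeros_(self.out.bias)

    def forward(self, x):
        # x: (batch, seq_len, n_features); use the last hidden state of the GRU.
        _, h_n = self.gru(x)
        return self.out(h_n[-1])

# Example usage with dummy load data: 48 input steps -> 48-step day-ahead forecast.
model = GRUForecaster(n_features=1, hidden_size=32, horizon=48, init="zero")
y_hat = model(torch.randn(4, 48, 1))
print(y_hat.shape)  # torch.Size([4, 48])
```

With zero weights and bias, the output layer starts from an all-zero prediction and is shaped entirely by gradients flowing through the recurrent encoder; the paper reports that this reduces the number of training epochs and improves accuracy relative to Xavier, He and Identity initialization.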