Hyperparameter Optimization of a Parallelized LSTM for Time Series Prediction

Author: Muhammed Maruf Öztürk
Language: English
Publication year: 2023
Subject:
Source: Vietnam Journal of Computer Science, Vol. 10, Iss. 03, pp. 303-328 (2023)
Document type: article
ISSN: 2196-8888, 2196-8896
DOI: 10.1142/S2196888823500033
Description: The Long Short-Term Memory (LSTM) neural network has great potential for predicting sequential data, and time series prediction is one of its most popular experimental subjects. To this end, various LSTM algorithms have been developed to predict time series data. However, few works consider hyperparameter optimization of LSTM together with parallelization approaches. To address this gap, a parallelized classic LSTM is proposed for time series prediction. In the preprocessing phase, it first replaces missing values with zero and then normalizes the time series matrix. The transposed matrix is divided into training and testing parts. A core-based parallelism is then established, using forking to split prediction into multiple processes. Derivative-free optimization techniques are also analyzed to determine which kind of hyperparameter optimization is most feasible for a parallelized LSTM, and a state-of-the-art comparison is included in the study. Experimental results show that training loss is optimal when using Nelder–Mead. The results also show that employing effort-intensive optimization methods such as genetic algorithms in parallelized designs yields a remarkable reduction in CPU time. Finally, the proposed algorithm outperforms the comparison methods with respect to prediction errors.
Database: Directory of Open Access Journals