Top-Oil Temperature Prediction of Power Transformer Based on Long Short-Term Memory Neural Network with Self-Attention Mechanism Optimized by Improved Whale Optimization Algorithm

Author: Dexu Zou, He Xu, Hao Quan, Jianhua Yin, Qingjun Peng, Shan Wang, Weiju Dai, Zhihu Hong
Language: English
Year of publication: 2024
Subject:
Source: Symmetry, Vol 16, Iss 10, p 1382 (2024)
Document type: article
ISSN: 2073-8994
DOI: 10.3390/sym16101382
Description: The operational stability of the power transformer is essential for maintaining the symmetry, balance, and security of power systems. A transformer failure leads to heightened instability in grid operations, so accurate prediction of oil temperature is crucial for efficient transformer operation. To address challenges such as the difficulty of selecting model hyperparameters and the incomplete consideration of temporal information in transformer oil temperature prediction, a novel model is constructed based on the improved whale optimization algorithm (IWOA) and a long short-term memory (LSTM) neural network with a self-attention (SA) mechanism. To incorporate both holistic and local information, the SA mechanism is integrated with the LSTM model, and the IWOA is employed to optimize the hyperparameters of the LSTM-SA model. The standard WOA is improved by incorporating adaptive parameters, thresholds, and a Latin hypercube sampling initialization strategy. The proposed method was applied and tested using real operational data from two transformers in a practical power grid. The results of the single-step prediction experiments demonstrate that the proposed method significantly improves the accuracy of oil temperature prediction for power transformers, with enhancements ranging from 1.06% to 18.85% compared to benchmark models. Additionally, the proposed model performs effectively across various prediction steps, consistently outperforming benchmark models.
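The record mentions a Latin hypercube sampling initialization strategy for the improved WOA but gives no implementation details. Below is a minimal, hedged sketch of how such an initialization is commonly done: each hyperparameter range is split into as many equal strata as there are whales, and one sample is drawn per stratum per dimension, giving more even coverage than uniform random initialization. The function name, the population size, and the example hyperparameter bounds (LSTM hidden units and learning rate) are illustrative assumptions, not taken from the paper.

```python
import random

def latin_hypercube_init(pop_size, bounds, rng=None):
    """Initialize a candidate population with Latin hypercube sampling.

    bounds: list of (low, high) tuples, one per hyperparameter dimension.
    Each dimension's range is divided into pop_size equal strata; exactly
    one sample is drawn from each stratum, and the stratum-to-individual
    assignment is shuffled independently per dimension.
    """
    rng = rng or random.Random()
    dim = len(bounds)
    population = [[0.0] * dim for _ in range(pop_size)]
    for d, (lo, hi) in enumerate(bounds):
        strata = list(range(pop_size))
        rng.shuffle(strata)          # random stratum assignment for this dim
        width = (hi - lo) / pop_size
        for i, s in enumerate(strata):
            # one uniform draw inside stratum s
            population[i][d] = lo + (s + rng.random()) * width
    return population

# Hypothetical usage: 8 whales over two LSTM-SA hyperparameters
# (hidden units in [16, 256], learning rate in [1e-4, 1e-2]).
pop = latin_hypercube_init(8, [(16, 256), (1e-4, 1e-2)], random.Random(0))
```

Because every stratum contributes exactly one individual, sorting the population along any single dimension places one value in each consecutive stratum, which is the coverage guarantee that motivates this initialization for the WOA search.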
Database: Directory of Open Access Journals
Full text is not displayed to unauthenticated users.