Fault diagnosis of mine-used transformer based on stacked sparse auto-encoder

Author: XU Qianwen, JI Xingquan, ZHANG Yuzhen, LI Jun, YU Yongjin
Language: Chinese
Publication year: 2018
Source: Gong-kuang zidonghua, Vol 44, Iss 10, Pp 33-37 (2018)
Document type: article
ISSN: 1671-251X
DOI: 10.13272/j.issn.1671-251x.2018040092
Description: Given that the application of deep learning to transformer fault diagnosis has achieved good diagnostic results, a fault diagnosis method for mine-used transformers based on a stacked sparse auto-encoder was proposed. A sparse auto-encoder is constructed by introducing a sparsity constraint into the hidden layer of an auto-encoder; multiple sparse auto-encoders are then stacked to form a stacked sparse auto-encoder, and a Softmax classifier is used as the output layer to establish the mine-used transformer fault diagnosis model (sketched after this record). A large number of unlabeled samples are used for unsupervised pre-training of the model, and the model parameters are then optimized by supervised fine-tuning. The example analysis results show that the stacked sparse auto-encoder is more accurate than the stacked auto-encoder when applied to fault diagnosis of mine-used transformers.
Database: Directory of Open Access Journals
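
The following is a minimal, illustrative sketch (not the authors' code) of the model described in the abstract above: sparse auto-encoders with a KL-divergence sparsity penalty on the hidden layer are pre-trained layer by layer on unlabeled data, their encoders are stacked, a Softmax output layer is added, and the whole network is fine-tuned on labeled samples. PyTorch, the layer sizes (9-16-8), the sparsity target rho = 0.05, the penalty weight beta = 3, the five fault classes, and the synthetic data are all assumptions made here for illustration, not values from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseAutoEncoder(nn.Module):
    """One auto-encoder whose hidden layer carries a KL-divergence sparsity
    penalty (the 'sparse item constraint' mentioned in the abstract)."""
    def __init__(self, n_in, n_hidden, rho=0.05, beta=3.0):
        super().__init__()
        self.encoder = nn.Linear(n_in, n_hidden)
        self.decoder = nn.Linear(n_hidden, n_in)
        self.rho, self.beta = rho, beta        # target sparsity, penalty weight (assumed values)

    def forward(self, x):
        h = torch.sigmoid(self.encoder(x))     # hidden-layer activations
        return h, torch.sigmoid(self.decoder(h))

    def loss(self, x):
        h, x_hat = self.forward(x)
        recon = F.mse_loss(x_hat, x)                        # reconstruction error
        rho_hat = h.mean(dim=0).clamp(1e-6, 1 - 1e-6)       # mean activation per hidden unit
        kl = (self.rho * torch.log(self.rho / rho_hat)
              + (1 - self.rho) * torch.log((1 - self.rho) / (1 - rho_hat))).sum()
        return recon + self.beta * kl

def pretrain(sae, data, epochs=50, lr=1e-2):
    """Unsupervised pre-training of one layer on unlabeled samples."""
    opt = torch.optim.Adam(sae.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        sae.loss(data).backward()
        opt.step()
    with torch.no_grad():
        h, _ = sae.forward(data)
    return h                                   # features fed to the next layer

# Illustrative usage with synthetic data (assumed: 9 input features, 5 fault classes).
torch.manual_seed(0)
unlabeled = torch.rand(200, 9)
labeled_x, labeled_y = torch.rand(60, 9), torch.randint(0, 5, (60,))

sae1, sae2 = SparseAutoEncoder(9, 16), SparseAutoEncoder(16, 8)
h1 = pretrain(sae1, unlabeled)                 # greedy layer-wise pre-training
h2 = pretrain(sae2, h1)

# Stack the trained encoders and add a Softmax (cross-entropy) output layer.
model = nn.Sequential(sae1.encoder, nn.Sigmoid(),
                      sae2.encoder, nn.Sigmoid(),
                      nn.Linear(8, 5))         # logits; softmax is applied inside the loss

# Supervised fine-tuning of the whole stack on the labeled samples.
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(100):
    opt.zero_grad()
    loss = F.cross_entropy(model(labeled_x), labeled_y)
    loss.backward()
    opt.step()
print("fine-tuned training loss:", float(loss))
```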