Manuscript details


Ensemble Learning Framework and Knowledge Distillation Technology and Its Application in Transformer Fault Identification

Release date: 2024-07-18   Views: 471   Downloads: 207   DOI: 10.19457/j.1001-2095.dqcd25035

      Abstract: Accurately and quickly identifying the fault types of traction transformers is a key technology for intelligent operation and maintenance. To address the single-model bias of traditional algorithms and the trade-off between the iteration speed of complex models and the computing resources available for deployment, a traction transformer fault diagnosis model based on the Stacking ensemble learning framework was proposed, with knowledge distillation incorporated to compress model iteration time and improve computational performance. First, an evaluation feature vector composed of gas indicators in transformer oil was constructed; then single Bagging- and Boosting-framework algorithms were combined under the Stacking ensemble learning framework, and knowledge distillation was applied to realize an effective mapping from feature vectors to fault types. Generalization results on DGA data samples show that the method mitigates the bias and variance problems of traditional ensemble models, accelerates the iteration speed of the ensemble model, and demonstrates its engineering application value.
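
      The following is a minimal sketch of the kind of pipeline the abstract describes, assuming scikit-learn is available in Python. The DGA feature matrix, the five fault classes, the base-learner choices and all hyperparameters are illustrative placeholders rather than values from the paper, and the distillation step approximates soft-target matching by fitting a small regression network to the teacher's class probabilities.

import numpy as np
from sklearn.ensemble import (BaggingClassifier, GradientBoostingClassifier,
                              RandomForestClassifier, StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

# Placeholder DGA data: 7 gas indicators per sample (e.g. H2, CH4, C2H6,
# C2H4, C2H2, CO, CO2) and 5 hypothetical fault classes.
rng = np.random.default_rng(0)
X = rng.random((500, 7))
y = rng.integers(0, 5, size=500)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Stacking teacher: Bagging- and Boosting-type base learners whose predicted
# class probabilities feed a simple meta-learner.
teacher = StackingClassifier(
    estimators=[
        ("bagging", BaggingClassifier(n_estimators=50, random_state=0)),
        ("rf", RandomForestClassifier(n_estimators=100, random_state=0)),
        ("gbdt", GradientBoostingClassifier(random_state=0)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
    stack_method="predict_proba",
)
teacher.fit(X_tr, y_tr)

# Knowledge distillation (approximation): a small student network is fitted
# to the teacher's soft class probabilities, yielding a much cheaper model
# at inference time while retaining most of the ensemble's behaviour.
soft_targets = teacher.predict_proba(X_tr)
student = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
student.fit(X_tr, soft_targets)

teacher_acc = teacher.score(X_te, y_te)
student_acc = (student.predict(X_te).argmax(axis=1) == y_te).mean()
print(f"teacher accuracy: {teacher_acc:.3f}  student accuracy: {student_acc:.3f}")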


      Key words: transformer fault diagnosis; Stacking framework; ensemble learning; knowledge distillation




