Yao Liu, Bojian Chen, Dong Wang, Lin Kong, Juanjuan Shi, Changqing Shen
Deep learning (DL)-based fault diagnosis models need sufficient fault information for each fault type to achieve high-precision diagnosis. In actual operating conditions, unexpected new fault types inevitably appear; these are called incremental fault types, or class increments. Traditional DL models require the costly collection of all known data for retraining, while retraining on new fault data alone leads to catastrophic forgetting of old tasks. To address bearing diagnosis with incremental fault types, a lifelong learning method based on generative feature replay (LLMGFR) is proposed in this study. A feature distillation method is introduced to prevent forgetting in the feature extractor, and a generator is trained to produce features of old tasks. The generated features are mixed with the real features of the current task, which effectively alleviates both the class-imbalance problem and catastrophic forgetting in the classifier. Incremental fault diagnosis cases show that LLMGFR can learn continually and adaptively in dynamic environments with incremental fault types.
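The two mechanisms described above — distilling features from a frozen copy of the old extractor, and mixing generator-produced old-task features with real current-task features before training the classifier — can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's implementation: the MSE form of the distillation loss, the function names, and the toy data are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def feature_distillation_loss(new_feats, old_feats):
    # Hypothetical distillation penalty: mean-squared error between features
    # from the updated extractor and a frozen copy of the old extractor.
    # Penalizing this drift is one common way to curb forgetting.
    return float(np.mean((new_feats - old_feats) ** 2))

def build_replay_batch(real_feats, real_labels, gen_feats, gen_labels):
    # Mix generated old-class features with real current-class features so
    # the classifier trains on a batch covering all known fault types.
    feats = np.concatenate([real_feats, gen_feats], axis=0)
    labels = np.concatenate([real_labels, gen_labels], axis=0)
    order = rng.permutation(len(feats))       # shuffle the mixed batch
    return feats[order], labels[order]

# Toy example: 4 real samples of a new fault class (label 1) and
# 4 generator-produced samples of an old fault class (label 0).
real = rng.normal(size=(4, 8))
gen = rng.normal(size=(4, 8))
feats, labels = build_replay_batch(real, np.full(4, 1), gen, np.full(4, 0))
```

In a full training loop, the classifier loss on the mixed batch would be summed with the distillation loss so the extractor and classifier are updated jointly without revisiting stored old-task raw data.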
Bojian Chen, Chuancang Ding, Xingxing Jiang, Juanjuan Shi, Weiguo Huang, Changqing Shen
Zhenzhong He, Changqing Shen, Bojian Chen, Juanjuan Shi, Weiguo Huang, Zhongkui Zhu, Dong Wang
Shijun Xie, Changqing Shen, Dong Wang, Juanjuan Shi, Weiguo Huang, Zhongkui Zhu
Bojian Chen, Changqing Shen, Dong Wang, Lin Kong, Liang Chen, Zhongkui Zhu
Changqing Shen, Shijun Xie, Juanjuan Shi, H. J. Yang, Weiguo Huang, Zhongkui Zhu