Yu He, Jianxin Li, Yangqiu Song, Mutian He, Hao Peng
Traditional text classification algorithms assume that data are independent and identically distributed. In most non-stationary scenarios, however, data may change smoothly due to long-term evolution and short-term fluctuation, which poses new challenges for traditional methods. In this paper, we present the first attempt to explore evolutionary neural network models for time-evolving text classification. We first introduce a simple way to extend arbitrary neural networks to evolutionary learning using a temporal smoothness framework, and then propose a diachronic propagation framework that incorporates historical impact into currently learned features through diachronic connections. Experiments on real-world news data demonstrate that our approaches greatly and consistently outperform traditional neural network models in both accuracy and stability.
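The two ideas in the abstract can be illustrated with a minimal sketch. The function names, shapes, and the concrete forms below are assumptions for illustration, not the paper's implementation: temporal smoothness is modeled as an L2 penalty tying the period-t parameters to the previous period's, and a diachronic connection is modeled as a learned linear map that mixes the previous period's features into the current ones.

```python
import numpy as np

def temporal_smoothness_penalty(theta_t, theta_prev, lam=0.1):
    """Hypothetical regularizer: penalize drift between the parameters
    learned at time t and those learned at time t-1 (temporal smoothness)."""
    return lam * float(np.sum((theta_t - theta_prev) ** 2))

def diachronic_propagate(h_t, h_prev, W_d):
    """Hypothetical diachronic connection: add the previous period's
    features, projected by W_d, to the currently learned features."""
    return h_t + h_prev @ W_d

# Toy usage: parameters barely moved, so the penalty is small.
penalty = temporal_smoothness_penalty(
    np.array([1.0, 2.0]), np.array([1.0, 1.9]), lam=0.5)

# Toy usage: previous features flow into the current representation.
h_now = diachronic_propagate(
    np.zeros(2), np.array([1.0, 0.0]), np.eye(2))
```

In training, the penalty would simply be added to the per-period classification loss, so each period's model stays close to its predecessor while still fitting the new data.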