Cuong V. Nguyen, Yingzhen Li, Thang D. Bui, Richard E. Turner
This paper develops variational continual learning (VCL), a simple but general framework for continual learning that fuses online variational inference (VI) and recent advances in Monte Carlo VI for neural networks. The framework can successfully train both deep discriminative models and deep generative models in complex continual learning settings where existing tasks evolve over time and entirely new tasks emerge. Experimental results show that VCL outperforms state-of-the-art continual learning methods on a variety of tasks, avoiding catastrophic forgetting in a fully automatic way.
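The core of online VI, as the abstract describes it, is a recursion in which the posterior learned after task t-1 becomes the prior for task t. A minimal sketch on a hypothetical toy conjugate model (a scalar Gaussian mean with known noise, not an example from the paper) makes the recursion concrete; in this conjugate case the KL-minimising variational posterior coincides with the exact Bayesian posterior, so the update is closed-form:

```python
import numpy as np

# Hypothetical toy illustration of the online-VI recursion behind VCL:
# infer a scalar mean theta from a stream of "tasks". The previous
# posterior q_{t-1}(theta) serves as the prior when task t arrives.

def vcl_update(mu_prev, var_prev, data, noise_var=1.0):
    """One recursion step: combine the previous Gaussian posterior
    (acting as the prior) with the new task's Gaussian likelihood."""
    n = len(data)
    prec = 1.0 / var_prev + n / noise_var            # precisions add
    mu = (mu_prev / var_prev + data.sum() / noise_var) / prec
    return mu, 1.0 / prec                            # new mean, variance

rng = np.random.default_rng(0)
mu, var = 0.0, 10.0                                  # broad initial prior q_0
for t in range(3):                                   # three sequential "tasks"
    data = rng.normal(2.0, 1.0, size=50)             # true theta = 2.0
    mu, var = vcl_update(mu, var, data)
print(f"posterior mean={mu:.2f}, var={var:.4f}")
```

Because each task's data sharpens the posterior that the next task inherits, the estimate converges on the true parameter without revisiting earlier data, which is the mechanism by which VCL avoids catastrophic forgetting. For neural networks the update is no longer closed-form and is approximated with Monte Carlo VI, as the abstract notes.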