JOURNAL ARTICLE

Continual Learning Using Bayesian Neural Networks

Honglin Li, Payam Barnaghi, Shirin Enshaeifar, Frieder Ganz

Year: 2020   Journal: IEEE Transactions on Neural Networks and Learning Systems   Vol: 32 (9)   Pages: 4243-4252   Publisher: Institute of Electrical and Electronics Engineers

Abstract

Continual learning enables models to learn and adapt to new changes and tasks over time. However, in continual and sequential learning scenarios, in which models are trained on data with varying distributions, neural networks (NNs) tend to forget previously learned knowledge, a phenomenon often referred to as catastrophic forgetting. Catastrophic forgetting is an inevitable problem for continual learning models in dynamic environments. To address this issue, we propose a method, called continual Bayesian learning networks (CBLNs), which enables a network to allocate additional resources to adapt to new tasks without forgetting the previously learned ones. Using a Bayesian NN, a CBLN maintains a mixture of Gaussian posterior distributions associated with different tasks. The proposed method optimizes the number of resources needed to learn each task and avoids an exponential increase in the resources involved in learning multiple tasks. It does not need access to the past training data and, at test time, automatically selects suitable weights to classify data points based on an uncertainty criterion. We have evaluated the method on the MNIST and UCR time-series data sets. The evaluation results show that the method addresses the catastrophic forgetting problem at a promising rate compared with state-of-the-art models.
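The abstract's test-time mechanism — keeping one Gaussian weight posterior per task and routing each input to the least-uncertain posterior — can be illustrated with a minimal sketch. This is not the paper's actual architecture; the diagonal-Gaussian linear classifier, the Monte Carlo sample count, and the predictive-entropy criterion are all illustrative assumptions.

```python
import numpy as np

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

class GaussianPosterior:
    """Diagonal Gaussian posterior over the weights of a linear classifier."""
    def __init__(self, mu, log_sigma):
        self.mu = mu                # (d, k) mean weights
        self.log_sigma = log_sigma  # (d, k) log standard deviations

    def sample(self, rng):
        # Draw one weight matrix from the posterior.
        return self.mu + np.exp(self.log_sigma) * rng.standard_normal(self.mu.shape)

def predictive_entropy(posterior, x, rng, n_samples=50):
    """Monte Carlo predictive entropy of input x under one task's posterior."""
    probs = np.mean(
        [softmax(x @ posterior.sample(rng)) for _ in range(n_samples)], axis=0
    )
    return -np.sum(probs * np.log(probs + 1e-12))

def select_task(posteriors, x, rng):
    """Route x to the task whose posterior is least uncertain about it."""
    entropies = [predictive_entropy(p, x, rng) for p in posteriors]
    return int(np.argmin(entropies))
```

A posterior trained on a task produces confident (low-entropy) predictions on inputs from that task, while posteriors for unrelated tasks yield diffuse predictions, so `argmin` over entropies acts as the task selector without access to past training data.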

Keywords:
Forgetting, Computer science, Artificial intelligence, Machine learning, MNIST database, Artificial neural network, Task (project management), Bayesian probability, Bayesian network, Engineering

Metrics

Cited by: 45
FWCI (Field-Weighted Citation Impact): 4.55
References: 54
Citation Normalized Percentile: 0.95 (in top 1%; in top 10%)

Topics

Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence
Multimodal Machine Learning Applications
Physical Sciences →  Computer Science →  Computer Vision and Pattern Recognition
Data Stream Mining Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

JOURNAL ARTICLE

Bayesian continual learning via spiking neural networks

Nicolas Skatchkovsky, Hyeryung Jang, Osvaldo Simeone

Journal: Frontiers in Computational Neuroscience   Year: 2022   Vol: 16   Pages: 1037976-1037976
DISSERTATION

Probabilistic Continual Learning using Neural Networks

Swaroop, Siddharth

University: Apollo (University of Cambridge)   Year: 2022
JOURNAL ARTICLE

Learning from the Past: Continual Meta-Learning with Bayesian Graph Neural Networks

Yadan Luo, Zi Huang, Zheng Zhang, Ziwei Wang, Mahsa Baktashmotlagh, Yang Yang

Journal: Proceedings of the AAAI Conference on Artificial Intelligence   Year: 2020   Vol: 34 (04)   Pages: 5021-5028