Deep neural networks perform nonlinear transformations at multiple levels by stacking many hidden layers. Although deep learning approaches achieve good results, they suffer from catastrophic forgetting: a drop in performance on previously learned classes when a new class is added. Incremental learning is a learning paradigm in which existing knowledge is retained even as new data is acquired. The model is trained on multiple batches of data, and later learning sessions do not require the data used in earlier iterations. The Bayesian approach to incremental learning represents the network weights as probability distributions; the key idea of Bayes' theorem is to compute an updated posterior distribution over the weights and biases as each new batch arrives, so beliefs can be refined iteratively in real time. The Bayesian model for incremental learning achieved an accuracy of 82%, and its execution time was lower on GPU (670 s) than on CPU (1165 s).
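The core update loop can be illustrated with a minimal sketch. The example below uses a Bayesian linear model with a Gaussian prior over the weights (an assumption for illustration; the paper's actual model and hyperparameters are not specified here): the posterior from one batch becomes the prior for the next, so earlier batches never need to be stored.

```python
import numpy as np

def bayes_update(mean, prec, X, y, noise_prec=1.0):
    """One incremental Bayesian update for a linear model w ~ N(mean, prec^-1).

    The posterior from the previous batch serves as the prior for this one,
    so the raw data from earlier batches can be discarded.
    """
    prec_new = prec + noise_prec * X.T @ X
    mean_new = np.linalg.solve(prec_new, prec @ mean + noise_prec * X.T @ y)
    return mean_new, prec_new

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])  # hypothetical ground-truth weights

# Start from a broad Gaussian prior over the weights.
mean, prec = np.zeros(2), np.eye(2) * 1e-2

# Stream three batches; each update uses only the current batch.
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    mean, prec = bayes_update(mean, prec, X, y, noise_prec=100.0)

print(mean)  # posterior mean converges toward the true weights
```

The same prior-becomes-posterior recursion is what allows a Bayesian neural network to absorb new classes without revisiting old training data, though for deep networks the posterior must be approximated rather than computed in closed form.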