Traditional deep convolutional neural networks achieve excellent performance on a wide range of machine learning tasks, yet they perform poorly in continuous data-stream environments: a model trained on new data often suffers a significant drop in performance on old data, a phenomenon known as "catastrophic forgetting." Incremental learning addresses catastrophic forgetting by learning new knowledge while retaining what has already been learned. In practice, however, incremental learning algorithms must usually be deployed on edge devices with limited memory and restricted access to past training data, and therefore face high model complexity and an imbalance between old and new categories of data. We propose an Adaptive Threshold Hierarchical Incremental Learning (ATHIL) method to address these problems. ATHIL requires no additional data or model storage during training; it combines local weight dispersion-coefficient thresholding with the mean nearest-neighbor principle and employs a sparse-matrix hierarchical masking network that flexibly adjusts the network structure to each task, enabling multiple image classification tasks to be learned within a single network. Experimental results show that the proposed method significantly outperforms existing methods on fine-grained classification datasets under three evaluation metrics.
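To make the masking idea concrete, below is a minimal PyTorch sketch of per-layer adaptive-threshold masking, in which a binary mask keeps only the weights whose magnitude stands out relative to the layer's own statistics, carving out a sparse sub-network for the current task. The function name `adaptive_threshold_mask` and the hyperparameter `alpha` are hypothetical illustrations; the paper's actual ATHIL procedure, which combines the local weight dispersion coefficient with the mean nearest-neighbor principle, is more involved.

```python
import torch


def adaptive_threshold_mask(weight: torch.Tensor, alpha: float = 0.5) -> torch.Tensor:
    """Binary mask keeping weights whose magnitude lies more than
    `alpha` dispersion units above the layer's mean magnitude.
    The threshold adapts to each layer's own weight statistics.
    (Illustrative sketch only, not the paper's exact criterion.)"""
    w = weight.abs()
    mean, std = w.mean(), w.std()
    threshold = mean + alpha * std  # per-layer adaptive threshold
    return (w >= threshold).to(weight.dtype)


# Example: carve out a sparse sub-network for the current task.
conv_weight = torch.randn(64, 3, 3, 3)  # stand-in for a conv layer's weights
mask = adaptive_threshold_mask(conv_weight)
task_weight = conv_weight * mask  # masked-out weights stay free for later tasks
print(f"kept {mask.mean().item():.1%} of this layer's weights")
```

Because the threshold is derived from each layer's own weight distribution rather than a single global constant, layers with very different weight scales can each retain a meaningful fraction of capacity per task, which is what allows several tasks to share one network.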