2021
DOI: 10.48550/arxiv.2105.12374
Preprint
Continual Learning for Real-World Autonomous Systems: Algorithms, Challenges and Frameworks

Abstract: Continual learning is essential for real-world applications, as frozen pre-trained models cannot effectively deal with non-stationary data distributions. The purpose of this study is to review the state-of-the-art methods that allow computational models to learn continuously over time. We primarily focus on learning algorithms that perform continual learning in an online fashion from considerably large (or infinite) sequential data and require substantially low computational and memory resources. We …

Cited by 3 publications
(7 citation statements)
References 62 publications
“…• Regularization-based Approach: This method consolidates past knowledge by incorporating additional loss terms that reduce the rate of learning for important weights used in previously learned tasks. By doing so, it minimizes the risk of new task information significantly altering the previously acquired weights (Shaheen et al, 2022). An example of this approach is Elastic Weight Consolidation (EWC), which penalizes weight changes based on task importance, regularizing model parameters and preventing catastrophic forgetting of previous experiences (Febrinanto et al, 2022).…”
Section: Lifelong Learning
confidence: 99%
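The EWC regularizer mentioned in the statement above can be sketched in a few lines. This is an illustrative toy version, not code from the cited works: parameters are flat lists of floats, and the per-parameter Fisher-information estimates and the strength `lam` are assumed to be given.

```python
def ewc_penalty(params, old_params, fisher, lam=1.0):
    # Elastic Weight Consolidation: a quadratic penalty anchoring each
    # parameter to its value after the previous task, weighted by its
    # (diagonal) Fisher information, i.e. its importance to old tasks.
    return 0.5 * lam * sum(
        f * (p - p_old) ** 2
        for p, p_old, f in zip(params, old_params, fisher)
    )

def total_loss(task_loss, params, old_params, fisher, lam=1.0):
    # Loss on the new task plus the penalty that slows learning on
    # weights important for previously learned tasks.
    return task_loss + ewc_penalty(params, old_params, fisher, lam)
```

Weights with high Fisher values are strongly pulled back toward their old values, while unimportant weights remain free to adapt to the new task — which is exactly the mechanism the quoted passage describes.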
“…Shin et al [35] proposed to train a generative model on the old data distribution and use it to generate fake samples that help in mitigating the forgetting of old classes. Although it has the downside that the model's performance is upper-bounded by joint training on all tasks [20], the replay family has been the most consistently used strategy in real-world applications of CL [6,36].…”
Section: Parameter Isolation Techniques
confidence: 99%
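Shin et al's approach trains a generator; a simpler member of the same replay family, kept here for illustration, is a fixed-size rehearsal buffer filled by reservoir sampling. This sketch is an assumption-laden toy (the class name, capacity, and seed are invented for the example), not an implementation from the cited papers:

```python
import random

class ReservoirBuffer:
    """Fixed-size rehearsal memory filled by reservoir sampling, so the
    stored examples remain an approximately uniform sample of the whole
    stream seen so far."""

    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.data = []
        self.seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        # Keep the first `capacity` examples; afterwards, replace a
        # stored example with probability capacity / seen.
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)
        else:
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = example

    def sample(self, k):
        # Mini-batch of old examples to interleave with new-task data.
        return self.rng.sample(self.data, min(k, len(self.data)))
```

During training on a new task, each mini-batch mixes fresh data with `buffer.sample(k)`, so gradients keep reflecting the old distribution — the same forgetting-mitigation idea as generative replay, but with stored rather than synthesized samples.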
“…In contrast, the CIOD paradigm needs a more specific treatment due to its inherent challenges and complexity. The task of incrementally adding classes to a trained detector is considered of substantial importance for several applications that deal with memory and computational constraints [6]. The main issue that makes detection more difficult than classification in class-incremental scenarios is that the same image can contain several instances of different objects that are unknown a priori.…”
Section: Continual Learning For Object Detection
confidence: 99%
“…Several applications that deal with streams of images can benefit from having models that can naturally deal with changing and incremental contexts, such as autonomous cars, UAVs, and house robots (Shaheen et al, 2021). Applications of CL for object detection are the main focus of this Ph.D., which will be discussed more deeply in the following sections.…”
Section: PhD Thesis Context
confidence: 99%