2020
DOI: 10.48550/arxiv.2006.04027
Preprint

Efficient Architecture Search for Continual Learning

Abstract: Continual learning with neural networks is an important learning framework in AI that aims to learn a sequence of tasks well. However, it is often confronted with three challenges: (1) overcoming the catastrophic forgetting problem, (2) adapting the current network to new tasks, and meanwhile (3) controlling its model complexity. To reach these goals, we propose a novel approach named Continual Learning with Efficient Architecture Search, or CLEAS in short. CLEAS works closely with neural architecture search (NAS) w…
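The abstract describes an architecture-based continual learner that grows the network for each new task while protecting what was already learned and keeping the model small. As a rough illustration only (not the CLEAS algorithm itself), the sketch below shows a hypothetical "expand and freeze" loop in PyTorch; ExpandableNet, add_units, train_task, and task_loaders are invented names for this example.

```python
# Illustrative sketch only, not the CLEAS method: a minimal "expand and freeze"
# continual learner. All names here are hypothetical.
import torch
import torch.nn as nn

class ExpandableNet(nn.Module):
    """Two-layer MLP whose hidden layer can be widened between tasks."""
    def __init__(self, in_dim, hidden, out_dim):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, out_dim)

    def forward(self, x):
        return self.fc2(torch.relu(self.fc1(x)))

    def add_units(self, n_new):
        """Widen the hidden layer by n_new units; old weights are copied over
        and their gradients masked so earlier tasks are (roughly) preserved."""
        old1, old2 = self.fc1, self.fc2
        n_old = old1.out_features
        self.fc1 = nn.Linear(old1.in_features, n_old + n_new)
        self.fc2 = nn.Linear(n_old + n_new, old2.out_features)
        with torch.no_grad():
            self.fc1.weight[:n_old] = old1.weight
            self.fc1.bias[:n_old] = old1.bias
            self.fc2.weight[:, :n_old] = old2.weight
            self.fc2.bias.copy_(old2.bias)
        # Zero the gradient of the old hidden units: a crude forgetting control.
        # A real method would also handle the output layer / task-specific heads.
        self.fc1.weight.register_hook(
            lambda g: torch.cat([torch.zeros_like(g[:n_old]), g[n_old:]]))
        self.fc1.bias.register_hook(
            lambda g: torch.cat([torch.zeros_like(g[:n_old]), g[n_old:]]))

def train_task(net, loader, epochs=1, lr=1e-3):
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(net(x), y).backward()
            opt.step()

# Usage (task_loaders is a hypothetical list of per-task DataLoaders):
# net = ExpandableNet(in_dim=784, hidden=64, out_dim=10)
# for t, loader in enumerate(task_loaders):
#     if t > 0:
#         net.add_units(n_new=8)   # complexity control: grow only a little per task
#     train_task(net, loader)
```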

Cited by 2 publications (3 citation statements)
References 17 publications
“…Though, it was believed that evaluating the performance of NAS methods is often hard [389], [390], [391]. Different settings beyond supervised learning have been investigated in NAS, including semi-supervised learning [392], self-supervised learning [131], unsupervised learning [115], [377], incremental learning [361], [393], federated learning [394], [395], etc., showing the promising transferability of NAS methods. Last but not least, there are several toolkits for AutoML [17], [396], [397], [398], [399] that can facilitate the reproducibility of NAS methods.…”
Section: Discussion (mentioning)
confidence: 99%
“…The third main approach to CL is the architecture-based approach (Rusu et al, 2016b;Fernando et al, 2017;Kaplanis et al, 2018;Xu & Zhu, 2018;Yoon et al, 2018;Du et al, 2019;Gigante et al, 2019;He et al, 2019;Li et al, 2019a;Ostapenko et al, 2019;Xu et al, 2019;Gao et al, 2020;Qin et al, 2021;Mirzadeh et al, 2022;Morawiecki et al, 2022) which often balances stability and learnability via a rigid division of the architecture into shared and task-specific components.…”
Section: Related Work (mentioning)
confidence: 99%
“…A CL architecture is established in PathNet (Fernando et al, 2017) based on agents selected by a genetic algorithm. Neural Architecture Search (NAS) is adopted by Gao et al (2020) where reinforcement learning (RL) techniques are utilized to search for the best neural architecture for each task. Another RL-based CL framework is the one introduced by Kaplanis et al (2018) where catastrophic forgetting is mitigated via RL agents with a synaptic model inspired by neuroscience.…”
Section: Related Work (mentioning)
confidence: 99%
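The excerpt above notes that Gao et al (2020) use reinforcement learning to search for the best architecture for each task. As a generic illustration of that idea only (not the controller from the paper), the sketch below shows a tiny REINFORCE-style controller that decides how many hidden units to add for a new task; ACTIONS, Controller, and evaluate_with_extra_units are assumed names for this example.

```python
# Illustrative sketch only: a REINFORCE controller that picks how many hidden
# units to add for a new task, trading validation accuracy against model growth.
# Not the CLEAS controller; all names here are invented.
import torch
import torch.nn as nn

ACTIONS = [0, 4, 8, 16]               # candidate numbers of new hidden units

class Controller(nn.Module):
    def __init__(self, n_actions=len(ACTIONS)):
        super().__init__()
        self.logits = nn.Parameter(torch.zeros(n_actions))

    def sample(self):
        dist = torch.distributions.Categorical(logits=self.logits)
        action = dist.sample()
        return action.item(), dist.log_prob(action)

def reward(accuracy, n_new_units, size_penalty=0.005):
    # accuracy helps; every extra unit costs a little (complexity control)
    return accuracy - size_penalty * n_new_units

def search(evaluate_with_extra_units, steps=30, lr=0.1):
    """evaluate_with_extra_units(k) is assumed to expand the network by k units,
    train briefly on the new task, and return validation accuracy in [0, 1]."""
    ctrl = Controller()
    opt = torch.optim.Adam(ctrl.parameters(), lr=lr)
    for _ in range(steps):
        a, logp = ctrl.sample()
        acc = evaluate_with_extra_units(ACTIONS[a])
        loss = -logp * reward(acc, ACTIONS[a])       # REINFORCE objective
        opt.zero_grad()
        loss.backward()
        opt.step()
    return ACTIONS[int(torch.argmax(ctrl.logits))]
```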