2021 IEEE Congress on Evolutionary Computation (CEC)
DOI: 10.1109/cec45853.2021.9504721
Multi-Task Learning for Multi-Objective Evolutionary Neural Architecture Search

Cited by 21 publications (58 citation statements)
References 17 publications
“…Population-based MOO methods [26] mainly include dominance-based [27], [28], decomposition-based [29], [30], indicator-based [31], [32], hybrid [33], [34], and model-based methods [35]-[37]. Due to their easy scalability and gradient-free properties, these population-based methods are widely used in various machine learning problems, such as neuroevolution [38], [39], NAS [1], [40]-[42], feature selection [43], [44], reinforcement learning [45], federated learning [46], [47], MTL [48], and fairness learning [49]. In addition, many surrogate-assisted multi-objective evolutionary algorithms have been proposed for machine learning problems with expensive optimization objectives [50]-[52].…”
Section: MOO (mentioning)
confidence: 99%
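The dominance relation this excerpt alludes to is the basic building block of dominance-based methods such as NSGA-II. Below is a minimal sketch, assuming objectives are minimized; the function names and the example numbers are ours, not from the cited papers:

```python
# Minimal sketch of the Pareto-dominance test underlying dominance-based
# MOO methods (e.g., NSGA-II). Objectives are assumed to be minimized;
# all names and values here are illustrative.

def dominates(a, b):
    """True if objective vector `a` Pareto-dominates `b`: a is no worse
    in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(population):
    """Keep only the non-dominated objective vectors in `population`."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q is not p)]

# Toy trade-off between (error, latency_ms) for four candidate architectures.
candidates = [(0.10, 30.0), (0.08, 55.0), (0.12, 25.0), (0.11, 60.0)]
print(pareto_front(candidates))  # (0.11, 60.0) is dominated by (0.08, 55.0)
```

Decomposition- and indicator-based methods replace this pairwise test with scalarized subproblems or a quality indicator, but the population-based, gradient-free character the excerpt highlights is the same.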
“…2). On the other hand, neural architecture search (NAS) studies have until recently focused mostly on either image classification problems [1,7,29,33,39,62] or learning tasks in isolation [19,34,54,67]. Few have explored architecture search for the joint training of dense prediction tasks.…”
Section: Background and Related Work (mentioning)
confidence: 99%
“…latency, memory, and energy, alongside prediction accuracy, to guide the search. However, current studies often focus on image classification [1,7,29,33,39,62] or learning tasks in isolation [54,67]. Yet performing multiple dense prediction tasks simultaneously can have significant benefits for both inference speed and accuracy, since tasks can leverage each other's training signals as inductive biases to improve their own learning and the model's generalization [8].…”
Section: Background and Related Work (mentioning)
confidence: 99%
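The hardware-aware search these excerpts describe comes down to scoring every candidate on several objectives at once, e.g. prediction error and measured latency, and handing that objective vector to a multi-objective optimizer. A minimal, self-contained sketch; the toy "architectures" and placeholder error values are ours:

```python
# Minimal sketch of a hardware-aware fitness evaluation for NAS: each
# candidate is scored on prediction error AND measured latency, yielding
# the objective vector a multi-objective search optimizes. The candidates,
# inputs, and error values below are toy placeholders.
import random
import time

def latency_ms(candidate, inputs, repeats=100):
    """Average wall-clock time of one call to `candidate`, in milliseconds."""
    start = time.perf_counter()
    for _ in range(repeats):
        candidate(inputs)
    return (time.perf_counter() - start) * 1e3 / repeats

rng = random.Random(0)
inputs = [rng.random() for _ in range(10_000)]

# Two toy "architectures" trading accuracy for speed (placeholder errors).
candidates = {
    "small": (lambda xs: sum(xs), 0.12),                  # fast, higher error
    "large": (lambda xs: sum(x * x for x in xs), 0.08),   # slower, lower error
}

for name, (fn, error) in candidates.items():
    print(name, (error, latency_ms(fn, inputs)))  # objective vector (error, ms)
```

In a real search, the error term would come from training or fine-tuning the candidate on validation data, and latency would be profiled on the target device; the resulting vectors feed a Pareto-based selection like the one sketched earlier.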
“…Deep neural networks (DNNs) are becoming ubiquitous across a plethora of intelligent embedded applications such as virtual reality (VR) [1] and object detection/tracking [2], enabling entirely new on-device experiences [3]. Nonetheless, given that the network design space is tremendously large [4,5], manually designing competitive DNNs requires considerable human effort to determine the optimal network configuration. To address this, neural architecture search (NAS) [6] has recently flourished; it is dedicated to automating the design of top-performing DNNs.…”
Section: Introduction (mentioning)
confidence: 99%