2020
DOI: 10.48550/arxiv.2006.13799
Preprint

Auto-PyTorch Tabular: Multi-Fidelity MetaLearning for Efficient and Robust AutoDL

Abstract: While early AutoML frameworks focused on optimizing traditional ML pipelines and their hyperparameters, a recent trend in AutoML is to focus on neural architecture search. In this paper, we introduce Auto-PyTorch, which brings the best of these two worlds together by jointly and robustly optimizing the architecture of networks and the training hyperparameters to enable fully automated deep learning (AutoDL). Auto-PyTorch achieves state-of-the-art performance on several tabular benchmarks by combining multi-fidelity optimization with portfolio construction for warmstarting and ensembling of deep neural networks (DNNs) and common baselines for tabular data.
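For readers who want to try the framework described in the abstract, here is a minimal sketch of a tabular-classification run with Auto-PyTorch's Python API; the dataset, time budgets, and metric are illustrative assumptions, not values from the paper.

# Minimal sketch, assuming the autoPyTorch package's tabular API;
# dataset and budgets are illustrative, not from the paper.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from autoPyTorch.api.tabular_classification import TabularClassificationTask

# Illustrative dataset; any tabular X, y works.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Jointly searches network architectures and training hyperparameters
# under a wall-clock budget, using multi-fidelity evaluation.
api = TabularClassificationTask()
api.search(
    X_train=X_train, y_train=y_train,
    X_test=X_test, y_test=y_test,
    optimize_metric="accuracy",
    total_walltime_limit=300,       # assumed overall budget in seconds
    func_eval_time_limit_secs=50,   # assumed per-pipeline budget in seconds
)

y_pred = api.predict(X_test)
print(api.score(y_pred, y_test))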

Cited by 5 publications (10 citation statements)
References 23 publications

“…Different settings beyond supervised learning have been investigated in NAS, including semi-supervised learning [392], self-supervised learning [131], unsupervised learning [115], [377], incremental learning [361], [393], and federated learning [394], [395], showing the promising transferability of NAS methods. Last but not least, there are several toolkits for AutoML [17], [396], [397], [398], [399] that can facilitate the reproducibility of NAS methods.…”
Section: Discussion
confidence: 99%
“…For example, TPOT from [29] generates a set of best-performing models from scikit-learn and XGBoost and automatically chooses the best subset of models. Moreover, other papers focus on the problem of automated deep learning model selection and optimization [24,43]. Finally, several papers propose various methods of automatic feature generation [29].…”
Section: Related Work
confidence: 99%
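Since the excerpt above names TPOT, a minimal sketch of its scikit-learn-style interface may help; the dataset and evolutionary budgets (generations, population_size) are illustrative assumptions, not values from the cited work.

# Sketch of TPOT's scikit-learn-style interface; budgets are illustrative.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from tpot import TPOTClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Evolves scikit-learn (and optionally XGBoost) pipelines with genetic
# programming and keeps the best one found within the budget.
tpot = TPOTClassifier(generations=5, population_size=20, random_state=0, verbosity=2)
tpot.fit(X_train, y_train)
print(tpot.score(X_test, y_test))
tpot.export("best_pipeline.py")  # writes the winning pipeline as Python code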
“…The nodes N_1 and N_2 represent dense layers Dense(x, y), where x is the number of neurons and y is the activation function. The nodes SC^2_1, SC^3_1, and SC^3_2 represent the possible skip-connection nodes, when id_R is chosen for each of them. The node N_2 is connected to the input node through SC^2_1.…”
Section: Repeat
confidence: 99%
“…The nodes SC^2_1, SC^3_1, and SC^3_2 represent the possible skip-connection nodes, when id_R is chosen for each of them. The node N_2 is connected to the input node through SC^2_1. The output node is connected to the input and N_1 nodes through SC^3_1 and SC^3_2, respectively.…”
Section: Repeat
confidence: 99%
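To make the encoding in these excerpts concrete, the following PyTorch sketch builds the described two-dense-layer block with optional skip connections SC^2_1 (input to N_2), SC^3_1 (input to output), and SC^3_2 (N_1 to output); the layer sizes, activations, and additive merging of branches are assumptions for illustration, not the cited paper's exact operators.

# Sketch of the block described above; sizes, activations, and additive
# merging are illustrative assumptions, not the cited paper's operators.
import torch
import torch.nn as nn

class SkipBlock(nn.Module):
    def __init__(self, dim: int, sc21: bool, sc31: bool, sc32: bool):
        super().__init__()
        # N_1 and N_2 as Dense(x, y) with x=dim neurons and ReLU activation.
        self.n1 = nn.Sequential(nn.Linear(dim, dim), nn.ReLU())
        self.n2 = nn.Sequential(nn.Linear(dim, dim), nn.ReLU())
        self.sc21, self.sc31, self.sc32 = sc21, sc31, sc32

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h1 = self.n1(x)
        h2 = self.n2(h1 + x if self.sc21 else h1)  # SC^2_1: input -> N_2
        out = h2
        if self.sc31:   # SC^3_1: input -> output
            out = out + x
        if self.sc32:   # SC^3_2: N_1 -> output
            out = out + h1
        return out

block = SkipBlock(dim=16, sc21=True, sc31=True, sc32=True)
print(block(torch.randn(4, 16)).shape)  # torch.Size([4, 16])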