2022
DOI: 10.1007/s00778-022-00752-2
VolcanoML: speeding up end-to-end AutoML via scalable search space decomposition

Cited by 18 publications (12 citation statements)
References 50 publications
“…Further, there is a large effort in the data management community to speed up AutoML systems. For instance, Li et al. propose to leverage search space decomposition [29]. Yakovlev et al. propose to leverage proxy models, iteration-free optimization, and adaptive data reduction to accelerate hyperparameter optimization [54].…”
Section: Related Work (mentioning)
confidence: 99%
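A minimal, hypothetical Python sketch of the search-space-decomposition idea mentioned in the excerpt above. The subspace definitions, the random-search inner loop, and the mock evaluate function are illustrative assumptions, not VolcanoML's actual interface: the point is only that a joint algorithm-plus-hyperparameter space is split into per-algorithm subspaces, a fixed evaluation budget is spread across them, and the best configuration found in any subspace is returned.

import random

# Illustrative (assumed) subspaces: candidate hyperparameter values per algorithm.
SUBSPACES = {
    "knn": {"n_neighbors": list(range(1, 31))},
    "tree": {"max_depth": list(range(1, 21)), "min_samples_leaf": list(range(1, 11))},
}

def evaluate(algo, config):
    # Stand-in for training and validating one configuration; returns a mock score.
    return random.random()

def search_subspace(algo, space, budget):
    # Plain random search restricted to one decomposed subspace.
    best_config, best_score = None, float("-inf")
    for _ in range(budget):
        config = {name: random.choice(values) for name, values in space.items()}
        score = evaluate(algo, config)
        if score > best_score:
            best_config, best_score = config, score
    return best_config, best_score

def decomposed_search(total_budget):
    # Split the budget evenly across subspaces and keep the overall winner.
    per_subspace = total_budget // len(SUBSPACES)
    winner = (None, None, float("-inf"))
    for algo, space in SUBSPACES.items():
        config, score = search_subspace(algo, space, per_subspace)
        if score > winner[2]:
            winner = (algo, config, score)
    return winner

print(decomposed_search(total_budget=40))

VolcanoML's own decomposition and budget allocation are more sophisticated than this even split; the sketch only illustrates the structural split of one large search problem into smaller ones.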
“…There are ongoing efforts to increase the performance of AutoML tools through better NAS approaches [45,62,80,86], search for hyperparameters and loss methods [72,74,74], and automating machine learning by keeping a human in the loop [71]. Truong et al. [104] evaluated a few popular AutoML tools on their ability to automate the ML pipeline.…”
Section: Related Work (mentioning)
confidence: 99%
“…AutoCTS is a gradient-based method due to its high efficiency. Despite great success in computer vision [3,37], natural language processing [39], and AutoML systems [27,54], little effort has been devoted to time series forecasting. AutoST [25] is proposed for spatio-temporal prediction, where the time series are from a uniform grid.…”
Section: Related Work (mentioning)
confidence: 99%