2022
DOI: 10.1007/978-3-031-05933-9_12
AutoTransformer: Automatic Transformer Architecture Design for Time Series Classification

Cited by 10 publications (4 citation statements) · References 21 publications
“…integrates deformable convolutional blocks and online knowledge distillation, as well as a random mask to reduce noise [137]. For each TSC dataset, AutoTransformer searches for a suitable network architecture using a neural architecture search (NAS) algorithm before feeding the output to the multi-headed attention blocks [138].…”

Section: Transformers
confidence: 99%
“…In recent years, the Transformer has also shown a powerful ability to capture long-term dependencies and correlations in time series research. Researchers have applied various Transformer variants to time series problems, including classification [22][23][24], prediction [25][26][27], and anomaly detection [28][29][30], among others. In a recent study on classification tasks, Jayant et al. [23] proposed a Transformer-based framework for driving behavior classification, analyzing driving behavior through smartphone telematics data belonging to multivariate time series, instead of using convolutional or recurrent network architectures.…”

Section: Transformer for Time Series
confidence: 99%
“…In a recent study on classification tasks, Jayant et al. [23] proposed a Transformer-based framework for driving behavior classification, analyzing driving behavior through smartphone telematics data belonging to multivariate time series, instead of using convolutional or recurrent network architectures. Ren et al. [24] proposed a data-driven design method for time series classification network architecture called AutoTransformer, which can find network structures suitable for different datasets from a novel search space.…”

Section: Transformer for Time Series
confidence: 99%
“…AutoTransformer [148] first designs a customized search space for the Time Series Classification (TSC) task and applies the Gradient-based Differentiable Architecture Sampler (GDAS) [149], an improved DARTS method, to search for an efficient Transformer. The search space incorporates several structures and operations within the Transformer backbone, which can extract global and local features from the time series.…”

Section: AutoTransformer
confidence: 99%
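The GDAS-style search described above can be illustrated with a minimal sketch. This is a hypothetical toy example, not the authors' implementation: each searchable "edge" keeps learnable logits over a set of candidate operations (the candidate ops below are illustrative stand-ins for a time-series search space), and a hard Gumbel-softmax sample selects one operation per forward pass while gradients still reach the architecture logits via the straight-through estimator.

```python
# Hypothetical sketch of GDAS-style differentiable op selection,
# assuming PyTorch; the candidate op set is illustrative, not the
# actual AutoTransformer search space.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """One searchable edge: learnable logits over candidate operations."""

    def __init__(self, channels: int):
        super().__init__()
        # Illustrative candidates mixing local (conv) and pass-through ops.
        self.ops = nn.ModuleList([
            nn.Identity(),                                # skip connection
            nn.Conv1d(channels, channels, 3, padding=1),  # local conv
            nn.Conv1d(channels, channels, 5, padding=2),  # wider conv
            nn.AvgPool1d(3, stride=1, padding=1),         # pooling
        ])
        # Architecture parameters (logits), trained alongside the weights.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x: torch.Tensor, tau: float = 1.0) -> torch.Tensor:
        # hard=True draws a one-hot sample, so only ONE op is evaluated,
        # yet the straight-through trick lets gradients reach self.alpha.
        weights = F.gumbel_softmax(self.alpha, tau=tau, hard=True)
        idx = int(weights.argmax())
        return weights[idx] * self.ops[idx](x)

x = torch.randn(2, 8, 16)   # (batch, channels, time steps)
op = MixedOp(channels=8)
y = op(x)
print(tuple(y.shape))        # (2, 8, 16)
```

After search, the operation with the largest logit on each edge is kept, giving a discrete architecture; evaluating only the sampled op per step is what makes GDAS cheaper than the original DARTS, which evaluates every candidate.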