ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 2021
DOI: 10.1109/icassp39728.2021.9414142

Tabular Transformers for Modeling Multivariate Time Series

Abstract: Tabular datasets are ubiquitous in data science applications. Given their importance, it seems natural to apply state-of-the-art deep learning algorithms in order to fully unlock their potential. Here we propose neural network models that represent tabular time series and that can optionally leverage their hierarchical structure. This results in two architectures for tabular time series: one for learning representations that is analogous to BERT and can be pre-trained end-to-end and used in downstream tasks, and on…
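The hierarchical view in the abstract can be made concrete: fields combine into rows, and rows form the time series. Below is a minimal sketch of that idea, not the authors' released TabFormer code; the mean-pooling choice, dimensions, and vocabulary sizes are illustrative assumptions. Each categorical field is embedded, field embeddings are pooled into a row vector, and the row sequence feeds a Transformer encoder, as in the BERT-style variant.

# Minimal sketch (not the authors' exact architecture): a hierarchical
# encoder for tabular time series. Each row's categorical fields are
# embedded, pooled into a row vector, and the row sequence is passed
# through a Transformer encoder.
import torch
import torch.nn as nn

class TabularSeriesEncoder(nn.Module):
    def __init__(self, vocab_sizes, d_model=64, nhead=4, num_layers=2):
        super().__init__()
        # One embedding table per column; vocab_sizes is assumed to hold
        # the number of discrete values per column.
        self.field_embeds = nn.ModuleList(
            nn.Embedding(v, d_model) for v in vocab_sizes
        )
        layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=nhead, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, x):
        # x: (batch, seq_len, num_fields) of integer field indices.
        fields = [emb(x[..., i]) for i, emb in enumerate(self.field_embeds)]
        rows = torch.stack(fields, dim=-1).mean(dim=-1)  # pool fields -> row
        return self.encoder(rows)  # (batch, seq_len, d_model)

# Toy usage: 3 categorical columns, sequences of 10 rows.
model = TabularSeriesEncoder(vocab_sizes=[20, 5, 100])
x = torch.randint(0, 5, (8, 10, 3))  # indices must be < each vocab size
print(model(x).shape)  # torch.Size([8, 10, 64])

For BERT-style pre-training one would additionally mask a subset of field indices and predict them from the encoder output; only the representation step is sketched here.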

Cited by 53 publications (40 citation statements). References 9 publications.
“…TABERT [48], a more elaborate neural approach inspired by the large language transformer model BERT [9], is trained on semi-structured text data to perform language-specific tasks. Several other studies utilize tabular data, but their problem settings are outside of our scope [3,21,31,32,35].…”
Section: Related Work
confidence: 99%
“…Additional embedding techniques and manipulated representations of the data (e.g. TaBERT [26], TabFormer [27]) can be incorporated as pre-processing steps.…”
Section: Performance Comparisons
confidence: 99%
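Using such models as pre-processing steps, as the snippet above suggests, amounts to treating a pretrained tabular encoder as a frozen feature extractor whose embeddings feed a conventional downstream model. A minimal sketch of that pattern, with synthetic vectors standing in for real encoder output and a hypothetical binary label:

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Stand-in for a pretrained encoder's output: one pooled embedding per
# sequence of table rows (values are synthetic, for illustration only).
embeddings = rng.normal(size=(200, 64))
labels = rng.integers(0, 2, size=200)

# Downstream task (e.g. fraud / no-fraud) trained on the frozen features.
clf = LogisticRegression(max_iter=1000).fit(embeddings, labels)
print(clf.score(embeddings, labels))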
“…These methods can often be made to accommodate categorical data through one-hot encoding, but in the authors' experience the quality of models in this family rapidly degrades as the fraction of the series' variables that are categorical increases. The field in which anomaly detection in categorical time series is most developed is intrusion detection in network security and fraud detection [4,9]. The authors in [9] utilize a transformer architecture for fraud detection inspired by an analogy between finite sequences of discrete variables and words in the domain of Natural Language Processing (NLP).…”
Section: Related Work
confidence: 99%
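The word analogy referenced in [9] can be illustrated by mapping each (field, value) pair of a row to an ID in a shared vocabulary, so a transaction row becomes a short token sequence rather than a wide one-hot vector. A toy sketch, with hypothetical field names and values:

# Toy sketch of the NLP analogy: each categorical field value becomes a
# token ID, so a row of a transaction table reads like a short "sentence".
from itertools import chain

rows = [
    {"merchant": "grocery", "card": "debit",  "amount_bin": "low"},
    {"merchant": "airline", "card": "credit", "amount_bin": "high"},
]

# Build a shared vocabulary over (field, value) pairs, as one would for
# word tokens; 0 is reserved for a [PAD]/[MASK]-style special token.
vocab = {"[PAD]": 0}
for field, value in chain.from_iterable(r.items() for r in rows):
    vocab.setdefault(f"{field}={value}", len(vocab))

token_ids = [[vocab[f"{f}={v}"] for f, v in r.items()] for r in rows]
print(token_ids)  # [[1, 2, 3], [4, 5, 6]]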