2020
DOI: 10.48550/arxiv.2011.13548
Preprint

Self-Supervised Time Series Representation Learning by Inter-Intra Relational Reasoning

Haoyi Fan, Fengbin Zhang, Yue Gao

Abstract: Self-supervised learning achieves superior performance in many domains by extracting useful representations from unlabeled data. However, most traditional self-supervised methods focus on exploring the inter-sample structure, while less effort has been devoted to the underlying intra-temporal structure, which is important for time series data. In this paper, we present SelfTime: a general self-supervised time series representation learning framework, by exploring the inter-sample relation …
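The abstract is truncated here, so the Python sketch below only illustrates the general idea it names: constructing relation labels between samples and between segments within one series, which an encoder can then be trained to predict. All helper names (make_inter_sample_pairs, make_intra_temporal_pairs), the binary same-source labels, and the distance-binning scheme are illustrative assumptions, not SelfTime's actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_inter_sample_pairs(batch, augment):
    """Pair each series with an augmented view of itself (label 1)
    and with an augmented view of a different series (label 0)."""
    pairs, labels = [], []
    n = len(batch)
    for i in range(n):
        pairs.append((batch[i], augment(batch[i])))      # positive pair
        j = (i + rng.integers(1, n)) % n                 # any index != i
        pairs.append((batch[i], augment(batch[j])))      # negative pair
        labels += [1, 0]
    return pairs, np.array(labels)

def make_intra_temporal_pairs(series, seg_len=16, n_pairs=8, n_bins=3):
    """Sample segment pairs from one series; the relation label is the
    temporal distance between their start points, coarsened into bins."""
    max_start = len(series) - seg_len
    pairs, labels = [], []
    for _ in range(n_pairs):
        s1, s2 = rng.integers(0, max_start + 1, size=2)
        pairs.append((series[s1:s1 + seg_len], series[s2:s2 + seg_len]))
        # bin the absolute start-point distance into n_bins relation classes
        labels.append(min(abs(s1 - s2) * n_bins // max(max_start, 1), n_bins - 1))
    return pairs, np.array(labels)

# toy usage: a batch of 4 univariate series of length 128
batch = [np.sin(np.linspace(0, 6.28 * f, 128)) for f in (1, 2, 3, 4)]
jitter = lambda x: x + rng.normal(0, 0.05, size=x.shape)  # simple augmentation
inter_pairs, inter_y = make_inter_sample_pairs(batch, jitter)
intra_pairs, intra_y = make_intra_temporal_pairs(batch[0])
```

In this reading, the two pretext tasks share one encoder: the inter-sample labels supervise a relation head across samples, and the binned temporal distances supervise a second head within each sample.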

Cited by 5 publications (5 citation statements) · References 24 publications
“…BIMO's goal is to be easily used in downstream tasks by discovering the most significant modalities for representation learning in all domains of time-series data. This study was inspired by existing work on SOTA contrastive learning-based unsupervised learning methods [15,23,27].…”
Section: Methods (mentioning, confidence: 99%)
“…To bridge this gap, Paparrizos et al introduced GRAIL [22], a comprehensive framework dedicated to discerning compact time-series representations that maintained the intricacies of a user-defined comparison function. Moving forward, Fan et al revealed SelfTime [23], a versatile self-supervised time-series representation learning methodology. It homed in on the inter-sample and intra-temporal relationships within time series to decode the latent structural features in unlabeled data.…”
Section: Time-series Representation Learning (mentioning, confidence: 99%)
“…[12] employs the idea of contrastive learning, where two positive samples are generated by weak augmentation and strong augmentation to predict each other, while the similarity among different augmentations of the same sample is maximized. [13] is designed for univariate time series. It samples several segments of the time series and labels each segment pair according to their relative distance in the original series.…”
Section: Pretext Tasks for Time Series (mentioning, confidence: 99%)
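The weak/strong-augmentation idea in the quote above can be made concrete with a small contrastive sketch. Everything here is an assumption for illustration: the jitter-and-scaling "weak" view, the chunk-permutation "strong" view, and the one-directional NT-Xent-style loss are stand-ins, not the cited paper's exact augmentations or objective (which also predicts across views in both directions).

```python
import numpy as np

rng = np.random.default_rng(1)

def weak_aug(x):
    """Weak view: small jitter plus random scaling (assumed augmentation)."""
    return x * rng.uniform(0.9, 1.1) + rng.normal(0, 0.05, size=x.shape)

def strong_aug(x, n_chunks=4):
    """Strong view: permute chunks of the series (assumed augmentation)."""
    chunks = np.array_split(x, n_chunks)
    order = rng.permutation(n_chunks)
    return np.concatenate([chunks[i] for i in order])

def nt_xent(z1, z2, tau=0.5):
    """NT-Xent-style loss: each weak embedding should match the strong
    embedding of the same sample against all others in the batch."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                  # (n, n) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))        # diagonal = positive pairs

# toy usage with an identity "encoder"; a real model would embed each view
batch = np.stack([np.sin(np.linspace(0, 6.28 * f, 64)) for f in (1, 2, 3, 4)])
z_weak = np.stack([weak_aug(x) for x in batch])
z_strong = np.stack([strong_aug(x) for x in batch])
print(nt_xent(z_weak, z_strong))
```

The segment-pair, relative-distance labeling attributed to [13] matches the intra-temporal pairing sketched after the abstract above.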