2022
DOI: 10.1007/978-3-031-20862-1_34
The Time-Sequence Prediction via Temporal and Contextual Contrastive Representation Learning

Cited by 3 publications (1 citation statement)
References 18 publications
“…While the second category consists of classification and clustering that prioritize instance-level information, i.e. coarse-grained information (Eldele et al. 2021, 2022; Liu and Wei Liu 2022), aiming to infer the target across the entire series. Therefore, when confronted with a task-agnostic pre-training model that lacks prior knowledge or awareness of specific tasks during the pre-training phase, both segment- and instance-level information become indispensable for achieving effective universal time series representation learning.…”
Section: Introduction
confidence: 99%
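To make the quoted distinction concrete, the following is a minimal, illustrative PyTorch sketch of how an instance-level (coarse-grained, whole-series) contrastive term and a segment-level (fine-grained, per-timestep) term might be combined in a task-agnostic pre-training objective. This is not the cited paper's implementation; the helper name `info_nce`, the mean-pooled series summaries, and the 1:1 loss weighting are assumptions made purely for illustration.

```python
import torch
import torch.nn.functional as F

def info_nce(anchors, positives, temperature=0.1):
    # Standard InfoNCE: row i of `positives` is the positive for row i of `anchors`;
    # every other row in the batch serves as a negative.
    anchors = F.normalize(anchors, dim=-1)
    positives = F.normalize(positives, dim=-1)
    logits = anchors @ positives.t() / temperature   # (N, N) similarity logits
    targets = torch.arange(anchors.size(0))          # diagonal entries are the positive pairs
    return F.cross_entropy(logits, targets)

# Toy setup: per-timestep embeddings of B series under two augmented views.
B, T, D = 8, 32, 64
view1 = torch.randn(B, T, D)
view2 = torch.randn(B, T, D)

# Instance-level term: contrast whole-series summaries (mean-pooled over time).
instance_loss = info_nce(view1.mean(dim=1), view2.mean(dim=1))

# Segment-level term: contrast matching timesteps across the two views,
# treating each (series, timestep) pair as its own anchor.
segment_loss = info_nce(view1.reshape(B * T, D), view2.reshape(B * T, D))

# Keep both granularities in the pre-training objective (equal weighting assumed).
total_loss = instance_loss + segment_loss
print(float(total_loss))
```

In a real pre-training pipeline the two views would come from an encoder applied to augmented inputs rather than random tensors, and the relative weighting of the two terms would be a tunable design choice.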