2021
DOI: 10.1609/aaai.v35i10.17070
Learning Representations for Incomplete Time Series Clustering

Abstract: Time-series clustering is an essential unsupervised technique for data analysis, applied in many real-world fields such as medical analysis and DNA microarrays. Existing clustering methods are usually based on the assumption that the data are complete. However, time series in real-world applications often contain missing values. The traditional strategy (impute first, then cluster) does not optimize the imputation and clustering processes as a whole, which not only makes performance dependent on the combina…

Citations: cited by 36 publications (49 citation statements)
References: 26 publications
“…Deep temporal clustering (DTC) [30] naturally integrates an autoencoder network for dimensionality reduction and a novel temporal clustering layer for clustering the new time-series representation into a single end-to-end learning framework without using labels. DTCR [29] proposes a seq2seq autoencoder representation-learning model, integrating a reconstruction task (for the autoencoder), a K-means task (for the hidden representation), and a classification task (to enhance the ability of the encoder). After learning the autoencoder, a classical method (e.g., K-means) is applied to the hidden representation.…”
Section: B. Autoencoder-based Methods
confidence: 99%
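The two-stage pattern the quoted statement attributes to DTCR — learn a hidden representation with a reconstruction objective, then run classical K-means on that representation — can be sketched in a few lines. The toy below is an illustrative numpy stand-in (a tied-weight linear autoencoder instead of the paper's seq2seq model; the dataset, latent size, and learning rate are all made up for the sketch), not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two groups of noisy length-20 time series (sine vs. cosine shape).
t = np.linspace(0, 2 * np.pi, 20)
X = np.vstack([
    np.sin(t) + 0.1 * rng.standard_normal((30, 20)),
    np.cos(t) + 0.1 * rng.standard_normal((30, 20)),
])

# Stage 1: tied-weight linear autoencoder, trained on the reconstruction
# loss ||X W W^T - X||^2 (a stand-in for DTCR's seq2seq reconstruction task).
d = 2                                    # latent dimension (illustrative)
W = 0.01 * rng.standard_normal((20, d))  # shared encoder/decoder weights
lr = 1e-3
for _ in range(500):
    E = X @ W @ W.T - X                              # reconstruction error
    W -= lr * 2 * (X.T @ E @ W + E.T @ X @ W) / len(X)

Z = X @ W  # learned hidden representation

# Stage 2: classical K-means on the hidden representation.
k = 2
C = Z[[0, -1]].copy()  # simple init: one centroid from each end of the data
for _ in range(50):
    labels = np.argmin(((Z[:, None, :] - C[None]) ** 2).sum(-1), axis=1)
    C = np.array([Z[labels == j].mean(axis=0) for j in range(k)])

print(np.bincount(labels))  # cluster sizes
```

DTCR additionally trains the K-means objective and an auxiliary classification task jointly with the encoder, rather than running K-means only after training as this simplified two-stage sketch does.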
“…In this section, we first present comprehensive experiments comparing AUTOSHAPE with 15 related methods on the UCR (univariate) datasets in Section IV-C. We then report the results of AUTOSHAPE on the UEA (multivariate) datasets against 5 related methods in Section IV-D. The methods compared with AUTOSHAPE are the same as in STCN [28], DTCR [29], and USSL [48].…”
Section: Methods
confidence: 99%