2021
DOI: 10.1609/aaai.v35i8.16846
Time Series Domain Adaptation via Sparse Associative Structure Alignment

Abstract: Domain adaptation on time series data is an important but challenging task. Most of the existing works in this area are based on the learning of the domain-invariant representation of the data with the help of restrictions like MMD. However, such extraction of the domain-invariant representation is a non-trivial task for time series data, due to the complex dependence among the timestamps. In detail, in the fully dependent time series, a small change of the time lags or the offsets may lead to difficulty in th…

Cited by 45 publications (10 citation statements) · References 24 publications (29 reference statements)
“…CoDATS (Wilson et al, 2020) builds upon VRADA but uses a convolutional neural network for the feature extractor. 2) Statistical divergence: SASA (Cai et al, 2021) aligns the conditional distribution of the time series data by minimizing the discrepancy between the associative structures of the time series variables across domains. AdvSKM (Liu and Xue, 2021a) and (Ott et al, 2022) are metric-based methods that align the two domains by considering statistical divergence.…”
Section: Related Work
confidence: 99%
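The "associative structure" that SASA aligns can be illustrated with a simple stand-in: treat the absolute correlation matrix between time series variables as the structure, and penalize its cross-domain discrepancy. This is a minimal sketch under that assumption — the actual SASA model learns the structure with sparse attention, and the names `associative_structure` and `structure_discrepancy` are hypothetical:

```python
import numpy as np

def associative_structure(x):
    # x: (timesteps, variables). A simple stand-in for a learned
    # associative structure: absolute correlation between variables.
    return np.abs(np.corrcoef(x, rowvar=False))

def structure_discrepancy(src, tgt):
    # Alignment objective: mean absolute gap between the two structures.
    return np.mean(np.abs(associative_structure(src) - associative_structure(tgt)))

rng = np.random.default_rng(1)
factor = rng.normal(size=(200, 1))
src = factor + 0.3 * rng.normal(size=(200, 3))       # variables share a factor
tgt_same = factor + 0.3 * rng.normal(size=(200, 3))  # same dependence pattern
tgt_diff = rng.normal(size=(200, 3))                 # independent variables
```

Under this toy construction, the discrepancy is small when the dependence pattern transfers across domains and large when it does not, which is the signal the alignment loss exploits.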
“…For instance, AdvSKM leverages the maximum mean discrepancy (MMD) distance in combination with a hybrid spectral kernel to consider temporal dependencies during domain adaptation [22]. Another example is SASA, which learns the association structure of time series data to align the source and target domains [3]. On the contrary, adversarial-based methods use adversarial training to mitigate the distribution shift between the source and target domains.…”
Section: Related Work, 2.1 Time Series Domain Adaptation
confidence: 99%
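The MMD criterion these methods build on fits in a few lines. The sketch below is a biased squared-MMD estimator with a plain RBF kernel — an assumption for illustration, not AdvSKM's hybrid spectral kernel; the function names are illustrative:

```python
import numpy as np

def rbf_kernel(x, y, gamma=0.1):
    # Pairwise RBF kernel matrix between the rows of x and y.
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mmd2(source, target, gamma=0.1):
    # Biased estimator of squared MMD between two feature batches:
    # E[k(s, s')] + E[k(t, t')] - 2 E[k(s, t)].
    k_ss = rbf_kernel(source, source, gamma).mean()
    k_tt = rbf_kernel(target, target, gamma).mean()
    k_st = rbf_kernel(source, target, gamma).mean()
    return k_ss + k_tt - 2.0 * k_st

rng = np.random.default_rng(0)
src = rng.normal(0.0, 1.0, size=(64, 2))
tgt_near = rng.normal(0.0, 1.0, size=(64, 2))  # same distribution as source
tgt_far = rng.normal(3.0, 1.0, size=(64, 2))   # shifted distribution
```

Minimizing this quantity over learned features pulls the source and target feature distributions together, which is the sense in which the metric-based methods above "align two domains".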
“…To address this issue, unsupervised domain adaptation (UDA) has gained traction as a way to leverage pre-labeled source data for training on unlabeled target data while also addressing the distribution shift between the two domains [32]. There is growing interest in applying UDA to time series data [26], with existing methods seeking to minimize the statistical distance between source and target features [3,38] or using adversarial training to find domain-invariant features [11,25,33,34]. However, these approaches require access to the source data during the adaptation process, which may not always be possible due to data privacy regulations.…”
Section: Introduction
confidence: 99%
“…For instance, building on DANN, the recurrent domain adversarial neural network (R-DANN) and variational recurrent adversarial deep domain adaptation (VRADA) (Purushotham et al, 2017) employ the long short-term memory (LSTM) network (Hochreiter and Schmidhuber, 1997) and the variational RNN (Chung et al, 2015) as feature extractors, respectively. Further models, such as the sparse associative structure alignment (SASA) model (Cai et al, 2021) and the convolutional deep domain adaptation model for time series data (CoDATS), have been developed to improve time series UDA performance.…”
Section: Time Series Data Analysis
confidence: 99%