2023
DOI: 10.1109/tii.2022.3188839
A Novel Embedded Discretization-Based Deep Learning Architecture for Multivariate Time Series Classification

Cited by 5 publications (2 citation statements)
References 29 publications
“…In 2023, M. H. Tahan et al. [1] incorporated temporal discretization as a preprocessing step for time series data, seamlessly integrating it within a deep neural network (DNN). These novel models were structured in two main segments: model training and temporal discretization.…”
Section: Related Work
confidence: 99%
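The citation statement above describes discretizing a time series before (and within) a deep model. As a rough illustration of what temporal discretization as a preprocessing step can look like, here is a minimal quantile-binning sketch in NumPy; the bin scheme, `n_bins` parameter, and function name are illustrative assumptions, not the exact embedded method of [1].

```python
import numpy as np

def discretize_series(series, n_bins=4):
    """Quantile-based temporal discretization: map each real value
    to one of n_bins integer symbols. A common SAX-style scheme,
    shown here only as a sketch of the preprocessing idea."""
    # Interior quantile cut points computed from the series itself
    edges = np.quantile(series, np.linspace(0, 1, n_bins + 1)[1:-1])
    # Each value is replaced by the index of the bin it falls into
    return np.digitize(series, edges)

# Example: a sine wave discretized into 4 symbols (0..3)
t = np.linspace(0, 2 * np.pi, 100)
symbols = discretize_series(np.sin(t), n_bins=4)
```

The resulting symbol sequence has the same length as the input and can be fed to a downstream classifier in place of the raw values.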
“…Time series classification involves assigning labels or categories to sequences of data points ordered over time, and it requires specialized techniques to handle the temporal nature of the data effectively. Traditional techniques such as DNNs [1] are especially valuable when dealing with complex temporal patterns; time series data often exhibit hierarchical patterns at different time scales, and capturing them can be crucial for achieving good performance in time series classification tasks. DTGNN [2] is designed specifically to handle time series data and may offer advantages in capturing the temporal dependencies and dynamics that are crucial for accurate classification, but adopting a new architecture can be a limitation if it is not well supported by, or integrated with, existing machine learning environments.…”
Section: Research Gaps and Challenges
confidence: 99%
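The statement above notes that time series exhibit hierarchical patterns at different time scales. A toy way to make that concrete, without any deep learning framework, is to summarize a series over sliding windows of increasing size; the window sizes and function name below are illustrative assumptions, a hand-crafted stand-in for the multi-scale features a DNN would learn.

```python
import numpy as np

def multiscale_features(series, windows=(3, 9, 27)):
    """Summarize a series at several time scales: smooth it with
    moving averages of increasing window size, then record the
    mean and standard deviation of each smoothed version."""
    feats = []
    for w in windows:
        # Moving average via convolution with a uniform kernel
        smooth = np.convolve(series, np.ones(w) / w, mode="valid")
        feats.extend([smooth.mean(), smooth.std()])
    return np.array(feats)

# Example: a fixed-length feature vector from a 120-point series
x = np.sin(np.linspace(0, 4 * np.pi, 120))
f = multiscale_features(x)
```

Because `f` has a fixed length regardless of the input length, it can be passed to any conventional classifier; a deep model replaces these hand-picked windows with learned filters.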