Proceedings of the 2015 SIAM International Conference on Data Mining
DOI: 10.1137/1.9781611974010.35

Shapelet Ensemble for Multi-dimensional Time Series

Abstract: Time series shapelets are small subsequences that maximally differentiate classes of time series. Since the inception of shapelets, researchers have used shapelets for various data domains, including anthropology and health care, and in the process suggested many efficient techniques for shapelet discovery. However, multi-dimensional time series data poses unique challenges to shapelet discovery that are yet to be solved. We show that an ensemble of shapelet-based decision trees on individual dimensions works be…
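The idea stated in the abstract, one shapelet-based decision tree per dimension combined by voting, can be pictured with a minimal sketch. This is not the authors' implementation: the class name PerDimensionShapeletEnsemble, the random sampling of candidate shapelets, and the unweighted majority vote are simplifications assumed here for brevity; per the citation statements below, the paper itself relies on Fast Shapelet style selection and compares several voting schemes.

```python
# Minimal illustrative sketch (not the paper's implementation): one decision
# tree per dimension, trained on distances to randomly sampled candidate
# shapelets, combined by an unweighted majority vote. Assumes X has shape
# (n_series, n_dims, series_len) and y holds small non-negative integer labels.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def min_dist(series, shapelet):
    """Minimum Euclidean distance between `shapelet` and any window of a 1-D series."""
    m = len(shapelet)
    return min(np.linalg.norm(series[i:i + m] - shapelet)
               for i in range(len(series) - m + 1))

def shapelet_features(X_dim, shapelets):
    """Distance of every series (one dimension) to every candidate shapelet."""
    return np.array([[min_dist(s, sh) for sh in shapelets] for s in X_dim])

class PerDimensionShapeletEnsemble:
    def __init__(self, n_shapelets=10, shapelet_len=20, random_state=0):
        self.n_shapelets = n_shapelets
        self.shapelet_len = shapelet_len
        self.rng = np.random.default_rng(random_state)

    def fit(self, X, y):
        self.trees_, self.shapelets_ = [], []
        for d in range(X.shape[1]):               # one tree per dimension
            X_dim = X[:, d, :]
            cands = []
            for _ in range(self.n_shapelets):     # random candidates, not a full search
                i = self.rng.integers(X_dim.shape[0])
                j = self.rng.integers(X_dim.shape[1] - self.shapelet_len + 1)
                cands.append(X_dim[i, j:j + self.shapelet_len])
            tree = DecisionTreeClassifier().fit(shapelet_features(X_dim, cands), y)
            self.shapelets_.append(cands)
            self.trees_.append(tree)
        return self

    def predict(self, X):
        votes = np.stack([tree.predict(shapelet_features(X[:, d, :], shs))
                          for d, (tree, shs) in enumerate(zip(self.trees_, self.shapelets_))])
        # unweighted majority vote across the per-dimension trees
        return np.array([np.bincount(col).argmax() for col in votes.T])
```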

Cited by 28 publications (17 citation statements)
References 11 publications

“…The second solution is to generate an ensemble of shapelet-based decision trees for complex activity recognition. Cetin et al. [76] proposed a novel ensembling technique for finding the nearest neighbors of the candidate shapelets efficiently. They claimed that their technique can efficiently discover shapelets on datasets with multi-dimensional and long time series, which will be useful in our activity recognition scenarios involving multi-source sensed data (e.g.…”
Section: Results
confidence: 99%
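For context, the "nearest neighbors of the candidate shapelets" referred to in this statement are, in the standard shapelet formulation, the best-matching windows of each series under a sliding-window distance. A rough sketch of that primitive follows; it shows only the plain z-normalized distance and none of the ensembling or pruning speed-ups of the cited work.

```python
# Rough sketch of the standard shapelet-to-series distance: the sliding-window
# minimum of z-normalized Euclidean distances. None of the ensembling or
# pruning speed-ups described in the cited work [76] are reproduced here.
import numpy as np

def znorm(x, eps=1e-8):
    return (x - x.mean()) / (x.std() + eps)

def shapelet_distance(series, shapelet):
    """Distance from a candidate shapelet to its best-matching window in `series`."""
    series = np.asarray(series, dtype=float)
    s = znorm(np.asarray(shapelet, dtype=float))
    m = len(s)
    return min(np.linalg.norm(znorm(series[i:i + m]) - s)
               for i in range(len(series) - m + 1))
```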
“…Many optimization techniques [75, 67, 76] can provide a near-optimal solution in reasonable time, including metaheuristic algorithms or nature-inspired artificial techniques. Regardless of the method used for training shapelets, they are computed on the server.…”
Section: Shapelet Training
confidence: 99%
“…The algorithm employs the Fast Shapelet selection approach for extracting the most informative shapelets per dimension. In a similar manner, a shapelet tree is built from each time series dimension using several additional techniques for providing search speedups [5]. Moreover, various voting approaches are evaluated for providing the final classification label, demonstrating that one shapelet tree per dimension outperforms shapelets defined over multiple dimensions [5].…”
Section: Related Work
confidence: 99%
“…In a similar manner, a shapelet tree is built from each time series dimension using several additional techniques for providing search speedups [5]. Moreover, various voting approaches are evaluated for providing the final classification label, demonstrating that one shapelet tree per dimension outperforms shapelets defined over multiple dimensions [5]. More recently, the generalized random shapelet forest has been proposed for univariate and multivariate time series classification, by expanding the idea of random shapelet trees and randomly selecting shapelet features per dimension [20].…”
Section: Related Work
confidence: 99%
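The voting comparison mentioned in these statements can be illustrated with a small sketch. Both combiners below are generic stand-ins (plain majority and an accuracy-weighted variant), not necessarily the schemes evaluated in [5]; dim_preds and weights are hypothetical inputs holding per-dimension tree predictions and per-tree weights.

```python
# Illustrative combiners for per-dimension shapelet-tree predictions.
# dim_preds: integer class labels of shape (n_dims, n_series);
# weights: one non-negative weight per dimension, e.g. held-out accuracy.
# Generic stand-ins only, not necessarily the schemes compared in [5].
import numpy as np

def majority_vote(dim_preds):
    return np.array([np.bincount(col).argmax() for col in dim_preds.T])

def weighted_vote(dim_preds, weights, n_classes):
    scores = np.zeros((dim_preds.shape[1], n_classes))
    for d, preds in enumerate(dim_preds):
        scores[np.arange(len(preds)), preds] += weights[d]
    return scores.argmax(axis=1)
```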
“…There are a few approaches (such as [2, 3, 8]), which propose extensions of the univariate shapelet extraction method for multivariate use cases, but all of them make the assumption that shapelets can be extracted from different sensors independently of each other. This assumption does not hold for real-world oilfield data where one sensor may affect the reading of other sensors (e.g.…”
Section: Introduction
confidence: 99%