2015 IEEE International Conference on Data Mining
DOI: 10.1109/icdm.2015.87

A Parameter-Free Approach for Mining Robust Sequential Classification Rules

Abstract: Sequential data is generated in many domains of science and technology. Although many studies have been carried out for sequence classification in the past decade, the problem is still a challenge, particularly for pattern-based methods. We identify two important issues related to pattern-based sequence classification which motivate the present work: the curse of parameter tuning and the instability of common interestingness measures. To alleviate these issues, we suggest a new approach and framework for minin…

Cited by 15 publications (16 citation statements). References: 28 publications.
“…Thus, the probability of drawing an uninteresting pattern is still high, and not all local optima may be drawn: there are no guarantees on the diversity of the result set. Recently, the sampling algorithm Misere has been proposed in [24,20,21]. Contrary to the sampling method of Moens and Boley, Misere does not require any probability distribution.…”
Section: Related Work
confidence: 99%
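To illustrate the distribution-free sampling idea mentioned in the statement above, the sketch below draws candidate patterns by picking a random sequence from the database and then a random subsequence of it, so that no explicit probability distribution over the pattern space is ever built. This is a minimal, hypothetical sketch for intuition only; the function name and the sampling scheme are assumptions and do not reproduce the actual Misere algorithm.

```python
import random

def sample_pattern(database, rng=random):
    # Pick a sequence uniformly, then a random subsequence of it, preserving
    # the original order of items; no distribution over patterns is built.
    seq = rng.choice(database)
    k = rng.randint(1, len(seq))
    positions = sorted(rng.sample(range(len(seq)), k))
    return tuple(seq[i] for i in positions)

# Toy usage: sample many candidates and keep the distinct ones.
database = [list("abcab"), list("bca"), list("aabc")]
candidates = {sample_pattern(database) for _ in range(100)}
```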
“…[Legend of evaluated configurations: direct-freq-roll-out, and large-freq-roll-out with jumpLength = 10, 20, 50, or 100, each combined with max-reward, mean-reward, top-2-mean-reward, top-5-mean-reward, top-10-mean-reward, and random-reward aggregation] …room dataset. Finally, the description length is not, or almost not, influenced by the Roll-Out strategies.…”
Section: Strategy
confidence: 99%
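The statement above enumerates roll-out strategies (direct-freq-roll-out, and large-freq-roll-out with several jumpLength values) crossed with reward-aggregation schemes (max, mean, top-k-mean, random). The sketch below only shows how such reward aggregations could be computed from the rewards collected during roll-outs; the function and mode names are assumptions for illustration, not the cited implementation.

```python
import random
from statistics import mean

def aggregate_reward(rollout_rewards, mode="top-5-mean"):
    # Combine the rewards observed during the roll-outs of one node.
    if mode == "max":
        return max(rollout_rewards)
    if mode == "mean":
        return mean(rollout_rewards)
    if mode == "random":
        return random.choice(rollout_rewards)
    if mode.startswith("top-") and mode.endswith("-mean"):
        k = int(mode.split("-")[1])  # e.g. "top-10-mean" -> 10
        return mean(sorted(rollout_rewards, reverse=True)[:k])
    raise ValueError(f"unknown aggregation mode: {mode}")

# Example: aggregate_reward([0.1, 0.8, 0.3, 0.9], "top-2-mean") == 0.85
```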
“…In [17], the sequence database is split into smaller parts to be recreated by a sparse knowledge base that penalizes infrequent behavior by constructing a Bayesian network of posteriors able to reconstruct the sequence database. A similar approach is used in [18], where a strong emphasis is placed on finding interesting sequences. In contrast to the previously mentioned techniques, iBCM draws on insights from constraint programming, but rather than constructing a complete constraint base able to elicit the sequence database as a whole, it uses highly diverse and informative behavioral patterns that incorporate cardinality, alternation, gaps, as well as negative information.…”
Section: Sequence Classification
confidence: 99%
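To make the kinds of behavioral patterns mentioned above concrete, here is a loose sketch of how cardinality, alternation, and absence (negative information) constraints can be checked on a single sequence. The predicates are simplified, Declare-style assumptions for illustration only and are not iBCM's actual constraint templates.

```python
def at_least(seq, a, n=1):
    # Cardinality: item a occurs at least n times in the sequence.
    return seq.count(a) >= n

def alternation(seq, a, b):
    # Alternation: between any two occurrences of a there is at least one b.
    last = None
    for x in seq:
        if x == a:
            if last == a:
                return False
            last = a
        elif x == b:
            last = b
    return True

def absence(seq, a):
    # Negative information: item a never occurs.
    return a not in seq

# Toy usage: turn constraint checks on one sequence into boolean features.
s = list("acbacb")
features = {"atLeast(a,2)": at_least(s, "a", 2),
            "alternation(a,b)": alternation(s, "a", "b"),
            "absence(d)": absence(s, "d")}
```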
“…Approaches: iBCM is benchmarked against four other state-of-the-art techniques: cSPADE [25], Interesting Sequence Miner (ISM) [17], Sequence Classification based on Interesting Sequences (SCIP) [15], and Mining Sequential Classification Rules (MiSeRe) [18], all of which have the clear goal of obtaining discriminative, informative sequences for classification; they are compared in Table 5. A comparison with other techniques can be found in the respective works as well.…”
Section: Data and Classification
confidence: 99%