2008
DOI: 10.1007/978-3-540-68125-0_12
Feature Construction Based on Closedness Properties Is Not That Simple

Abstract: Feature construction has been studied extensively, including for 0/1 data samples. Given the recent breakthrough in closedness-related constraint-based mining, we are considering its impact on feature construction for classification tasks. We investigate the use of condensed representations of frequent itemsets (closure equivalence classes) as new features. These itemset types have been proposed to avoid set counting in difficult association rule mining tasks. However, our guess is that their intrins…

Cited by 5 publications (4 citation statements) | References 20 publications
“…In this paper, we investigate further the use of δ-freeness (and thus δ-closedness) when considering feature construction from noisy training data. From that perspective, it extends our previous work [11] which focussed on noise-free samples only. Our proposal can be summarized as follows.…”
Section: Introduction (supporting; confidence: 75%)
“…Using such patterns is the core of our approach to robust feature construction. Following the proposals from, for instance, [10,11,12], we consider that attribute sets may be more relevant than single attributes for class discrimination. Then, pattern types based on the so-called closedness properties enable to avoid redundant features in an application-independent setting.…”
Section: Introduction (mentioning; confidence: 99%)
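The closedness property invoked in this statement — keeping only itemsets that have no superset with the same support — is what prunes redundant candidate features. A minimal sketch follows; the naive breadth-first enumeration and the toy transactions are illustrative assumptions, not the cited authors' implementation.

```python
from itertools import combinations

def frequent_itemsets(rows, min_support):
    """Naively enumerate all frequent itemsets from 0/1 transactions."""
    items = sorted({i for row in rows for i in row})
    frequent = {}
    for k in range(1, len(items) + 1):
        found = False
        for cand in combinations(items, k):
            s = frozenset(cand)
            support = sum(1 for row in rows if s <= row)
            if support >= min_support:
                frequent[s] = support
                found = True
        if not found:  # no frequent k-itemset => none larger either
            break
    return frequent

def closed_only(frequent):
    """Keep itemsets with no proper frequent superset of equal support."""
    return {
        s: sup for s, sup in frequent.items()
        if not any(s < t and frequent[t] == sup for t in frequent)
    }

rows = [frozenset(r) for r in [{"a", "b"}, {"a", "b", "c"}, {"a", "c"}]]
closed = closed_only(frequent_itemsets(rows, min_support=2))
# {a}, {a,b} and {a,c} survive; {b} and {c} are absorbed into
# their equal-support supersets, so no redundant feature is created.
```

Each surviving closed itemset can then serve as one Boolean feature, which is the application-independent redundancy removal the statement refers to.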
“…For each mined rule π, a new Boolean attribute (feature) is created: the value of this new feature for a training object t of the data set r is (1) true if t supports the body of π, (0) false otherwise. This feature construction process is certainly the most straightforward but has also shown good predictive performance in several studies [8,14]. To provide predictions for new incoming (test) objects, we use a Selective Naive Bayes classifier (snb) on the recoded data set.…”
Section: Algorithm 1 Macatia: the Modl-Rule Miner (mentioning; confidence: 99%)
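The Boolean recoding described in this statement is simple to state precisely: each mined rule body yields one new 0/1 feature per training object. A minimal sketch, with a hypothetical helper name and toy data:

```python
def recode(transactions, rule_bodies):
    """For each rule body, emit a Boolean feature per transaction:
    1 iff the transaction supports (contains every item of) the body."""
    return [
        [1 if body <= t else 0 for body in rule_bodies]
        for t in transactions
    ]

ts = [frozenset(t) for t in [{"a", "b", "c"}, {"a"}, {"b", "c"}]]
bodies = [frozenset({"a", "b"}), frozenset({"c"})]
recode(ts, bodies)  # → [[1, 1], [0, 0], [0, 1]]
```

The recoded matrix can then be handed to any downstream classifier; the cited work feeds it to a Selective Naive Bayes classifier, which is not reproduced here.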