2007
DOI: 10.1007/s10115-007-0104-4

An information-theoretic approach to quantitative association rule mining

Abstract: Quantitative Association Rule (QAR) mining has been recognized as an influential research problem over the last decade, owing to the popularity of quantitative databases and the usefulness of association rules in real life. Unlike Boolean Association Rules (BARs), which only consider boolean attributes, QARs consist of quantitative attributes, which contain much richer information than boolean attributes. However, the combination of these quantitative attributes and their value intervals always gives ri…
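As a rough illustration of the QAR/BAR distinction described in the abstract, the sketch below uses a hypothetical toy table and hand-picked intervals (none of this comes from the paper) to show how a rule over quantitative attributes is constrained to value intervals rather than to boolean item presence.

```python
# Toy illustration (hypothetical data and intervals, not from the paper):
# a Boolean Association Rule tests only item presence, while a Quantitative
# Association Rule constrains numeric attributes to value intervals, e.g.
#   age in [30, 40) AND income in [50000, 80000)  ->  owns_house
records = [
    {"age": 34, "income": 58000, "owns_house": True},
    {"age": 41, "income": 72000, "owns_house": True},
    {"age": 23, "income": 31000, "owns_house": False},
    {"age": 37, "income": 61000, "owns_house": True},
]

def antecedent(r):
    """QAR antecedent: both quantitative attributes fall in their intervals."""
    return 30 <= r["age"] < 40 and 50000 <= r["income"] < 80000

matched = [r for r in records if antecedent(r)]
support = len(matched) / len(records)
confidence = sum(r["owns_house"] for r in matched) / len(matched)
print(f"support = {support:.2f}, confidence = {confidence:.2f}")
```

The combinatorial difficulty the truncated abstract alludes to arises because every quantitative attribute admits many such candidate intervals, so the space of possible rules grows very quickly.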

Cited by 33 publications (29 citation statements)
References 30 publications
“…When implementing QARM, attributes are quantitative numeric values, not boolean ones, so quantitative association rules are far more expressive and informative than boolean association rules [15]. The continuous values in remote sensing imagery are discretized into five levels, which represent severe changes, slight changes and no changes.…”
Section: Quantitative Association Rules Mining (QARM), mentioning
confidence: 99%
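The cut points below are placeholders chosen only to illustrate the five-level discretization mentioned in this excerpt (interpreted here as severe/slight change in either direction plus no change); the actual levels and thresholds used in the citing remote-sensing study are not given.

```python
# Hypothetical discretization of a continuous change magnitude into five
# ordered levels; the thresholds are placeholders, not the ones used in
# the cited remote-sensing work.
def change_level(x: float) -> str:
    if x <= -0.8:
        return "severe decrease"
    if x <= -0.4:
        return "slight decrease"
    if x < 0.4:
        return "no change"
    if x < 0.8:
        return "slight increase"
    return "severe increase"

print([change_level(v) for v in (-0.9, -0.5, 0.1, 0.5, 0.9)])
```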
“…The NOMI can get rid of the localness and make the normalized mutual information a global measure [15].…”
Section: Normalized Object Mutual Information (NOMI), mentioning
confidence: 99%
“…To reduce the number of database scans, many algorithms have been developed in recent decades. Considering that database scans mostly depend on the numbers of frequent 1-itemsets, mutual information is used to pre-extract the pair-wise related items and to then find all frequent itemsets [13,14]. By first filtering the unrelated 1-itemsets, the mutual-information-based algorithms greatly reduce the number of database scans and thus improve the implementation efficiency [15].…”
Section: Introduction, mentioning
confidence: 99%
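A minimal sketch of the pre-filtering idea described in this excerpt, assuming boolean transaction data: pairwise mutual information is estimated first, and only item pairs above a threshold are passed on to frequent-itemset mining. The threshold and function names are hypothetical; this is not the specific algorithm of the cited papers.

```python
import math
from itertools import combinations

def mutual_information(transactions, a, b):
    """Empirical mutual information (in bits) between the presence of items a and b."""
    n = len(transactions)
    mi = 0.0
    for va in (True, False):
        for vb in (True, False):
            p_ab = sum(((a in t) == va) and ((b in t) == vb) for t in transactions) / n
            p_a = sum((a in t) == va for t in transactions) / n
            p_b = sum((b in t) == vb for t in transactions) / n
            if p_ab > 0:
                mi += p_ab * math.log2(p_ab / (p_a * p_b))
    return mi

transactions = [
    {"bread", "butter"},
    {"bread", "butter", "milk"},
    {"milk"},
    {"bread", "butter"},
    {"bread", "milk"},
]
items = sorted({i for t in transactions for i in t})
threshold = 0.05  # hypothetical cut-off on pairwise mutual information

# Keep only item pairs that are informative about each other; later
# frequent-itemset mining would be restricted to these related items,
# reducing the number of full database scans.
related_pairs = [(a, b) for a, b in combinations(items, 2)
                 if mutual_information(transactions, a, b) > threshold]
print(related_pairs)
```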
“…Normalized mutual information is the amount of information one item (X) provides about another (Y) [21], as shown in Equations (1)–(3).…”
Section: Basic Concepts and Properties, mentioning
confidence: 99%
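For reference, one standard way to write these quantities is given below. This is an assumption about the general form; Equations (1)–(3) themselves are not reproduced in the excerpt, and the cited paper's exact normalization may differ.

```latex
% Assumed standard forms (not copied from the cited paper):
% mutual information between items X and Y, the entropy of X, and a
% normalization of I(X;Y) by H(X), which makes the measure asymmetric.
\begin{align*}
  I(X;Y)  &= \sum_{x}\sum_{y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)} \\
  H(X)    &= -\sum_{x} p(x)\,\log p(x) \\
  NI(X;Y) &= \frac{I(X;Y)}{H(X)}
\end{align*}
```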
“…Normalized mutual information tells the amount of information one item provides about the other [20]. As an asymmetric measure, normalized mutual information can capture the causal-effect relationship between items, and it has been widely used to find associated or correlated patterns in data mining [21][22][23][24]. However, in real applications, the cascading pattern, in the form of "X1 → X2 → …
Section: Introduction, mentioning
confidence: 99%