2020
DOI: 10.1007/978-3-662-62271-1_6
Enabling Decision Support Through Ranking and Summarization of Association Rules for TOTAL Customers

Abstract: Our focus in this experimental analysis paper is to investigate existing measures that are available to rank association rules and to understand how they can be augmented further to enable real-world decision support, as well as to provide customers with personalized recommendations. For example, by analyzing receipts of TOTAL customers, one can find that customers who buy windshield wash also buy engine oil and energy drinks, or that middle-aged customers from the South of France subscribe to a car wash program. Such …
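The receipt-mining example in the abstract rests on the standard support/confidence/lift framework for association rules. The following is a minimal sketch of how such a rule could be scored on transaction data; the receipts and item names are illustrative toy data, not the paper's TOTAL dataset.

```python
# Toy receipts; items are illustrative, not from the actual TOTAL data.
receipts = [
    {"windshield_wash", "engine_oil", "energy_drink"},
    {"windshield_wash", "engine_oil"},
    {"windshield_wash", "energy_drink"},
    {"engine_oil"},
    {"windshield_wash", "engine_oil", "energy_drink"},
]

def support(itemset):
    """Fraction of receipts containing every item in the itemset."""
    return sum(itemset <= r for r in receipts) / len(receipts)

def confidence(antecedent, consequent):
    """Estimated P(consequent | antecedent)."""
    return support(antecedent | consequent) / support(antecedent)

def lift(antecedent, consequent):
    """Confidence normalized by the consequent's baseline support."""
    return confidence(antecedent, consequent) / support(consequent)

rule = ({"windshield_wash"}, {"engine_oil"})
print(support(rule[0] | rule[1]))   # 0.6
print(confidence(*rule))            # 0.75
print(lift(*rule))                  # 0.9375
```

Rules whose lift is near or below 1, like this one, add little over the consequent's baseline popularity, which is why ranking measures beyond raw confidence matter.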

Cited by 2 publications (2 citation statements)
References 22 publications
“…In addition to support and confidence metrics, other criteria have been proposed to measure the attractiveness of rules. Important metrics are as follows: 1) Lift [25] 2) Conviction [26] 3) Jaccard [27] 4) Leverage of Novelty [28] 5) Dependency Matrix [29].…”
Section: Metrics For Accepting Association Rules
confidence: 99%
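The metrics named in the citation statement above (lift, conviction, Jaccard, leverage) can all be computed from three support values. The sketch below uses the standard textbook formulas for these measures; the numeric inputs are illustrative, not drawn from the paper.

```python
def rule_metrics(s_a, s_b, s_ab):
    """Interestingness measures for a rule A -> B.

    s_a  = supp(A), s_b = supp(B), s_ab = supp(A and B), all in [0, 1].
    Formulas are the standard definitions of each measure.
    """
    conf = s_ab / s_a
    lift = conf / s_b
    # Conviction: expected rate of A-without-B under independence,
    # divided by the observed rate; infinite for exact rules.
    conviction = float("inf") if conf == 1 else (1 - s_b) / (1 - conf)
    # Jaccard: overlap of A and B relative to their union.
    jaccard = s_ab / (s_a + s_b - s_ab)
    # Leverage: difference between joint support and independence.
    leverage = s_ab - s_a * s_b
    return {"confidence": conf, "lift": lift, "conviction": conviction,
            "jaccard": jaccard, "leverage": leverage}

m = rule_metrics(s_a=0.8, s_b=0.8, s_ab=0.6)
print(m["lift"])      # 0.9375
print(m["leverage"])  # -0.04000000000000004 (negative: below independence)
```

Different measures rank the same rule set differently, which is precisely the ranking problem the paper investigates.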
“…We use a classical Item-based Collaborative Filtering (IBCF) approach [24] which calculates a similarity sim(i, j) between each pair of products i and j using cosine similarity. Our model accommodates other similarity functions and recommendation strategies such as association rules [3]. We choose IBCF for its better precision on our dataset [1].…”
Section: Data Model and Preliminaries
confidence: 99%
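The item-based collaborative filtering approach described in the citation statement above computes cosine similarity between product column vectors of a user-item matrix, then scores unseen items by similarity to what a user already bought. A minimal sketch, with a hypothetical purchase-count matrix standing in for real data:

```python
import math

# Hypothetical user x product purchase counts (not the paper's dataset).
ratings = {
    "u1": {"oil": 2, "wash": 1},
    "u2": {"oil": 1, "wash": 1, "drink": 3},
    "u3": {"wash": 2, "drink": 1},
}

def item_vector(item):
    """Column of the user-item matrix for one product."""
    return [ratings[u].get(item, 0) for u in sorted(ratings)]

def cosine(i, j):
    """Cosine similarity sim(i, j) between two product vectors."""
    vi, vj = item_vector(i), item_vector(j)
    dot = sum(a * b for a, b in zip(vi, vj))
    ni = math.sqrt(sum(a * a for a in vi))
    nj = math.sqrt(sum(b * b for b in vj))
    return dot / (ni * nj) if ni and nj else 0.0

def recommend(user, k=2):
    """Score unseen items by similarity-weighted sum over owned items."""
    owned = ratings[user]
    candidates = {i for r in ratings.values() for i in r} - owned.keys()
    scores = {i: sum(cosine(i, j) * w for j, w in owned.items())
              for i in candidates}
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("u3"))  # ['oil']
```

The `sim(i, j)` computation is the cosine-similarity step the statement refers to; swapping `cosine` for another similarity function, or `recommend` for a rule-based strategy, mirrors the flexibility the authors describe.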