Mining frequent closed patterns in pointset databases
2010
DOI: 10.1016/j.is.2009.10.007

Cited by 8 publications (3 citation statements)
References 26 publications
“…For example, F(C_support, C_weight) is a constraint function that satisfies both the weight and the support constraints. In closed frequent pattern mining, [4,16,17,31,36] a small set of representative patterns called closed frequent patterns is mined. Mining closed frequent patterns is semantically equivalent to mining the complete set of frequent patterns.…”
Section: Related Work
confidence: 99%
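The equivalence the excerpt draws can be illustrated with a small sketch (the data and function names below are illustrative, not taken from the cited papers): an itemset is closed if no proper superset has the same support, and the closed sets alone determine the support of every frequent itemset.

```python
from itertools import combinations

def frequent_itemsets(transactions, min_support):
    """Naive enumeration of all frequent itemsets with their supports."""
    items = sorted({i for t in transactions for i in t})
    support = {}
    for k in range(1, len(items) + 1):
        found = False
        for cand in combinations(items, k):
            s = sum(1 for t in transactions if set(cand) <= t)
            if s >= min_support:
                support[frozenset(cand)] = s
                found = True
        if not found:  # no frequent k-itemset => no frequent (k+1)-itemset
            break
    return support

def closed_itemsets(support):
    """Keep only itemsets with no proper superset of equal support."""
    return {
        iset: s for iset, s in support.items()
        if not any(iset < other and s == other_s
                   for other, other_s in support.items())
    }

transactions = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"a", "b", "c"}]
freq = frequent_itemsets(transactions, min_support=2)
closed = closed_itemsets(freq)
# {b} (support 3) is not closed because {a, b} also has support 3;
# the support of any frequent itemset equals the support of its
# smallest closed superset, so nothing is lost by keeping only `closed`.
```

Here 7 frequent itemsets collapse to 4 closed ones, which is the compression the excerpt refers to.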
“…Healthcare fraud literature originating from countries outside of the United States has used a variety of sources to acquire medical claims data. The National Health Insurance Administration (NHIA) in Taiwan has provided data to multiple research groups (Chan & Lan, 2001; Hwang, Wei & Yang, 2003; Liou, Tang & Chen, 2008; Wei, Hwang & Yang, 2000; Yang & Hwang, 2006), and an analogous program in South Korea, the National Health Insurance (NHI) system, has also contributed to studies (Shin, Park, Lee & Jhee, 2012). Two major Australian governmental health departments, the Health Insurance Commission (HIC) and Medicare Australia, have been reported as the data sources in numerous pertinent research projects (He, Graco & Yao, 1999; Hubick, 1992; Shan, Jeacocke, Murray & Sutinen, 2008; Shan, Murray & Sutinen, 2009; Tang, Mendis, Murray et al., 2011; Williams, 1999).…”
Section: Introduction
confidence: 99%
“…Another alternative approach is the Reduced Apriori Algorithm with Tag, which improves the efficiency of the pruning operation and reduces the burden of candidate generation [20]. Several other algorithms have been proposed that focus on closed itemsets and are able to prune and reduce the data at a faster rate, yielding quicker runtimes [22][23][24]. A variety of tuning mechanisms, alone and in combination, provide incremental improvements to the underlying workflow of candidate generation and support filtration.…”
Section: Apriori
confidence: 99%
confidence: 99%
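The candidate-generation and support-filtration workflow the excerpt describes can be sketched as a minimal level-wise Apriori (this is a generic textbook formulation, not the Reduced Apriori with Tag or the closed-itemset algorithms cited above):

```python
from itertools import combinations

def apriori(transactions, min_support):
    """Level-wise Apriori: join frequent (k-1)-itemsets into candidates,
    prune candidates with an infrequent subset, then filter by support."""
    # Frequent 1-itemsets.
    counts = {}
    for t in transactions:
        for item in t:
            key = frozenset([item])
            counts[key] = counts.get(key, 0) + 1
    frequent = {c: s for c, s in counts.items() if s >= min_support}
    result = dict(frequent)
    k = 2
    while frequent:
        # Candidate generation: join pairs of frequent (k-1)-itemsets.
        prev = list(frequent)
        candidates = {a | b for a in prev for b in prev if len(a | b) == k}
        # Apriori pruning: every (k-1)-subset must itself be frequent.
        candidates = {c for c in candidates
                      if all(frozenset(sub) in frequent
                             for sub in combinations(c, k - 1))}
        # Support filtration: count candidates against the database.
        frequent = {}
        for c in candidates:
            s = sum(1 for t in transactions if c <= t)
            if s >= min_support:
                frequent[c] = s
        result.update(frequent)
        k += 1
    return result

transactions = [{"a", "b", "c"}, {"a", "b"}, {"a", "c"}, {"a", "b", "c"}]
patterns = apriori(transactions, min_support=2)
```

The improvements surveyed in the excerpt target exactly the two inner steps here: shrinking the candidate set before counting, and making the per-candidate support scan cheaper.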