2009 2nd Conference on Data Mining and Optimization
DOI: 10.1109/dmo.2009.5341919

Dynamic data discretization technique based on frequency and K-Nearest Neighbour algorithm

Abstract: In this paper we propose a new approach to dynamic data discretization, called Frequency Dynamic Interval Class (FDIC). FDIC consists of two important phases: the dynamic interval class phase and the interval merging phase. The first phase uses a simple statistical frequency measure to obtain the initial intervals, while in the second phase the K-Nearest Neighbour algorithm is used to calculate the merging factor for the unknown intervals. The experimental results showed that FDIC generates m…
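The abstract describes FDIC only at a high level, so the following Python sketch is just one way to realize its two-phase structure: equal-frequency binning stands in for the "simple statistical frequency measure" of phase 1, and a kNN-distance comparison at interval midpoints stands in for the merging factor of phase 2. The function name, parameters, and merging formula are illustrative assumptions, not the authors' specification.

```python
import numpy as np

def fdic_discretize(values, n_initial_bins=10, k=3, merge_threshold=0.1):
    """Two-phase discretization loosely following the FDIC outline in the
    abstract: phase 1 builds initial intervals from a frequency measure,
    phase 2 merges adjacent intervals using a kNN-based factor.
    The merging-factor formula below is a placeholder, not the authors'."""
    values = np.sort(np.asarray(values, dtype=float))

    # Phase 1: equal-frequency (quantile) cut points give the initial intervals.
    quantiles = np.linspace(0.0, 1.0, n_initial_bins + 1)
    cuts = np.unique(np.quantile(values, quantiles))
    intervals = list(zip(cuts[:-1], cuts[1:]))

    # Mean distance from an interval's midpoint to its k nearest data points:
    # a stand-in for the kNN-based merging factor described in the paper.
    def knn_score(lo, hi):
        mid = 0.5 * (lo + hi)
        return np.sort(np.abs(values - mid))[:k].mean()

    # Phase 2: merge adjacent intervals whose kNN scores are nearly equal,
    # i.e. intervals that sit in regions of similar data density.
    merged = [intervals[0]]
    for lo, hi in intervals[1:]:
        prev_lo, prev_hi = merged[-1]
        if abs(knn_score(prev_lo, prev_hi) - knn_score(lo, hi)) < merge_threshold:
            merged[-1] = (prev_lo, hi)   # extend the previous interval
        else:
            merged.append((lo, hi))
    return merged
```

Calling `fdic_discretize(data)` on a 1-D numeric array returns a list of (lower, upper) interval bounds; the threshold controls how aggressively adjacent intervals are merged in the second phase.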

Cited by 5 publications (2 citation statements)
References 13 publications
“…Ahmed et al. [9] have developed the non-parametric FDIC method. In the first phase of this advanced method, initial intervals are constructed based on basic statistical frequency measures.…”
Section: Related Work 2.1 Discretisation of Temporal Data
confidence: 99%
“…For example, Rajagopalan and Ray [9] investigated two partitioning schemes for one-dimensional data, namely maximum entropy partitioning (MEP), which is based on equal frequency of symbol occurrence, and uniform space partitioning (USP), which is based on equal partitioning segment width. Ahmed et al. [17] investigated a data discretization technique called frequency dynamic interval class (FDIC), which is based on a statistical frequency measure of the data and the k-nearest neighbor (kNN) algorithm [18]. To obtain "optimal" partitioning locations, researchers have used the Shannon entropy measure [2] to maximize the dynamic information level in the symbol series.…”
confidence: 99%
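The quoted comparison of MEP (equal-frequency) and USP (equal-width) partitioning, and the use of Shannon entropy to judge a symbolization, can be made concrete with a short sketch. The function names and the 8-bin example below are illustrative assumptions, not code from the cited works.

```python
import numpy as np

def uniform_space_partition(values, n_bins):
    """USP: cut points with equal segment width over the data range."""
    lo, hi = np.min(values), np.max(values)
    return np.linspace(lo, hi, n_bins + 1)

def maximum_entropy_partition(values, n_bins):
    """MEP: cut points chosen so each segment holds roughly the same
    number of observations, i.e. equal frequency of symbol occurrence."""
    quantiles = np.linspace(0.0, 1.0, n_bins + 1)
    return np.quantile(values, quantiles)

# Symbolize a 1-D series with both schemes and compare symbol entropy.
rng = np.random.default_rng(0)
series = rng.normal(size=1000)

for name, cuts in [("USP", uniform_space_partition(series, 8)),
                   ("MEP", maximum_entropy_partition(series, 8))]:
    symbols = np.digitize(series, cuts[1:-1])        # map values to bin indices
    probs = np.bincount(symbols, minlength=8) / len(series)
    entropy = -np.sum(probs[probs > 0] * np.log2(probs[probs > 0]))
    print(f"{name}: Shannon entropy of symbol series = {entropy:.3f} bits")
```

On data that is far from uniform (such as the Gaussian sample above), MEP drives the symbol entropy toward its maximum of log2(8) = 3 bits, while USP yields a lower value, which is the contrast the quoted statement draws between the two schemes.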