1996
DOI: 10.1109/34.494647
Weighted Parzen windows for pattern classification

Cited by 97 publications
(31 citation statements)
References 10 publications
“…3 is the size of the data set [12][13][14]. While calculating the density function, the windowing results are multiplied by the density function, but as the objective is to obtain coefficients for the data, this multiplication is not required.…”
Section: Density Weighted K-Nearest Neighbors Algorithm for Outliers
confidence: 99%
“…The kernel function used in this paper is called the Parzen Gaussian kernel [19]. It is given by (5). For the ToA estimation problem, the matrix is set to be an identity matrix of size .…”
Section: A Zero Memory Estimation
confidence: 99%
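The Parzen Gaussian kernel mentioned in this excerpt can be illustrated with a short sketch. The function name and the exact normalization are illustrative assumptions (the cited paper's equation (5) is not reproduced here); the special case of an identity bandwidth matrix matches the ToA setting the excerpt describes.

```python
import numpy as np

def parzen_gaussian_kernel(u, H):
    """Hypothetical sketch of a multivariate Parzen Gaussian kernel.

    u: difference vector (test point minus sample point), shape (d,)
    H: bandwidth/covariance matrix, shape (d, d); the cited work
       sets this to an identity matrix for ToA estimation.
    """
    d = u.shape[0]
    H_inv = np.linalg.inv(H)
    # Standard multivariate Gaussian normalization constant
    norm = 1.0 / np.sqrt((2.0 * np.pi) ** d * np.linalg.det(H))
    return norm * np.exp(-0.5 * u @ H_inv @ u)
```

With `H` equal to the identity, this reduces to the ordinary isotropic Gaussian window commonly used in Parzen density estimation.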
“…The fifth and sixth classifiers are a Parzen window classifier [13] and a K nearest neighbor (K-NN) classifier [14]. The Parzen window classifier estimates a posterior density function for each class based on a window function, the set of training vectors, and the test vector.…”
Section: Classification Systems
confidence: 99%
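The classifier described in this excerpt, which estimates a density per class from a window function and the training vectors, can be sketched as follows. This is a minimal illustration assuming a Gaussian window with a scalar bandwidth `h`; the function name and data layout are hypothetical, not the specific classifier of [13].

```python
import numpy as np

def parzen_classify(x, class_data, h=1.0):
    """Illustrative Parzen-window classifier (assumed Gaussian window).

    x: test vector, shape (d,)
    class_data: dict mapping class label -> (n_i, d) array of training vectors
    h: window width (bandwidth), an assumed scalar smoothing parameter
    """
    scores = {}
    for label, X in class_data.items():
        d = X.shape[1]
        # Evaluate a Gaussian window centered at each training vector
        diffs = (x - X) / h
        k = np.exp(-0.5 * np.sum(diffs ** 2, axis=1))
        k /= (2.0 * np.pi) ** (d / 2.0) * h ** d
        # Average of window responses = Parzen density estimate for this class
        scores[label] = k.mean()
    # Assign the class with the highest estimated density
    return max(scores, key=scores.get)
```

For example, with two well-separated clusters of training vectors, a test vector near one cluster is assigned that cluster's label.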