2013
DOI: 10.1155/2013/716753
Pattern Recognition in Numerical Data Sets and Color Images through the Typicality Based on the GKPFCM Clustering Algorithm

Abstract: We take the concept of typicality from the field of cognitive psychology and apply it to the interpretation of numerical data sets and color images through fuzzy clustering algorithms, particularly the GKPFCM, aiming to extract better information from the processed data. The Gustafson-Kessel Possibilistic Fuzzy c-means (GKPFCM) is a hybrid algorithm based on a relative typicality (the membership degree of Fuzzy c-means) and an absolute typicality (the typicality value of Possibilistic c-means). Thus, usi…
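The two typicalities the abstract names can be made concrete with a minimal Possibilistic Fuzzy c-means (PFCM) sketch, the machinery GKPFCM builds on. This is not the paper's implementation: it uses plain Euclidean distances instead of Gustafson-Kessel's adaptive Mahalanobis distances, and the function name, parameter defaults, and the heuristic used to estimate the penalty weights `gamma` are illustrative assumptions.

```python
import numpy as np

def pfcm(X, c, m=2.0, eta=2.0, a=1.0, b=1.0, n_iter=50, seed=0):
    """Minimal PFCM sketch (Euclidean distances, not Gustafson-Kessel).

    Returns (centers, U, T) where U[i, k] is the relative typicality
    (FCM membership; for each point, memberships sum to 1 over clusters)
    and T[i, k] is the absolute typicality (PCM value, independent per
    cluster, in (0, 1)).
    """
    rng = np.random.default_rng(seed)
    n = len(X)
    centers = X[rng.choice(n, c, replace=False)]
    gamma = np.ones(c)  # possibilistic penalty weights (assumed init)
    for _ in range(n_iter):
        # squared Euclidean point-to-center distances, shape (c, n)
        d2 = ((X[None, :, :] - centers[:, None, :]) ** 2).sum(-1) + 1e-12
        # relative typicality: FCM membership, normalized over clusters
        U = d2 ** (-1.0 / (m - 1.0))
        U /= U.sum(axis=0, keepdims=True)
        # absolute typicality: PCM value, no cross-cluster normalization
        T = 1.0 / (1.0 + (b * d2 / gamma[:, None]) ** (1.0 / (eta - 1.0)))
        # re-estimate gamma_i from the fuzzy partition (common heuristic)
        gamma = (U ** m * d2).sum(1) / (U ** m).sum(1)
        # center update weighted by both typicalities
        w = a * U ** m + b * T ** eta
        centers = (w @ X) / w.sum(1, keepdims=True)
    return centers, U, T
```

The key contrast the abstract draws is visible here: `U` is forced to sum to 1 across clusters (a point far from every center still gets large relative memberships), while `T` depends only on the distance to each cluster individually, so outliers receive uniformly low absolute typicality.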

Cited by 4 publications (5 citation statements)
References 24 publications
“…With the rise of deep neural networks, handcrafted features were replaced with CNN features in prototype learning, thus achieving end-to-end integration in deep networks and high precision and robustness in various image processing tasks. The great variety of existing approaches can be roughly grouped by: i) the number of prototypes used to represent a category (1-per-class [19,20,23,26,28,29,34], n-per-class [25,27,30,31,33], sparse [18]); ii) the distance measure (or combination of measures) used to express the similarity between each instance-prototype pair (Euclidean distance [18-20,25,26,30,31,33], Mahalanobis distance [19,20], covariance distance [29], cosine distance [26], learned distance [27,28], hand-designed distance [33,34]); and iii) the approach used for prototype representation (prototype-template image [18,23], mean vector of embedded features [25,26,28,29,31], learned centroid vector [19,20,27,30,34], learned CNN tensor [33]).…”
Section: Related Work
confidence: 99%
“…In general, prototype learning methods [18-34] represent the prototype as a centroid vector computed using all category members. In contrast to those proposals, which treat the prototype as a centroid element (prototype-object), and based on Rosch's prototype definition, we propose representing the prototype as a semantic entity with a center and boundaries computed using only the typical members.…”
Section: Semantic Prototype Representation
confidence: 99%