2019
DOI: 10.1016/j.ins.2019.06.068

An off-center technique: Learning a feature transformation to improve the performance of clustering and classification

Cited by 9 publications (9 citation statements) · References 19 publications
“…As the objective function E(W) quantifies a loss, the latter must be minimized. For more information, see the main reference of Dasen et al. [11]. Given two ligands p and q whose initial representations are denoted $\vec{x}_p$ and $\vec{x}_q$…”

Section: WML
confidence: 99%
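The excerpt above only states that an objective E(W) quantifying a loss must be minimized. As a generic illustration (the actual objective from [11] is not reproduced here, and the learning rate and step count are arbitrary), plain gradient descent on the parameters W looks like:

```python
def minimize(grad_E, W, lr=0.1, steps=100):
    # Gradient descent: since E(W) quantifies a loss, W is updated in the
    # direction opposite the gradient until the loss is (near) minimal.
    # Illustrative sketch only; not the authors' actual optimizer.
    for _ in range(steps):
        W = [w - lr * g for w, g in zip(W, grad_E(W))]
    return W
```

For example, with E(W) = Σ w² the gradient is 2w per component, and the iterates shrink geometrically toward the minimizer W = 0.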
“…$(\vec{x}_i^{\,\text{train}})^T \vec{c}_k^{\,\text{train}}$, $\text{cluster}_k^{\text{train}}$ (Equation 11). In order to predict the membership of a new ligand in an already constructed cluster, this ligand is assigned to the cluster that minimizes the dissimilarity between its centroid and this point [37], as defined in Equation 12.…”

Section: Predictive Clustering
confidence: 99%
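The assignment rule quoted above is a nearest-centroid prediction. A minimal sketch, assuming Euclidean distance as the dissimilarity measure (the citing paper's actual measure is not specified in the excerpt):

```python
import math

def assign_to_cluster(x, centroids):
    # Assign a new point to the already-constructed cluster whose centroid
    # minimizes the dissimilarity to the point. Euclidean distance is an
    # assumed, illustrative choice of dissimilarity here.
    def dist(a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))
    return min(range(len(centroids)), key=lambda k: dist(x, centroids[k]))
```

The returned value is the index of the winning cluster, so membership of unseen ligands can be predicted without re-running the clustering.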
“…It is worth noting that the Min-Max property is similar to an off-center technique proposed in metric learning [40]. The off-center technique achieves a better representation in feature space based on the idea that, with the center at 0.5, the similarity after transformation is required to tend towards zero (the minimum) if the similarity before the transformation is less than 0.5, or towards one (the maximum) if it is greater than 0.5.…”

Section: Min-Max Property
confidence: 99%
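The off-center behaviour described above can be sketched with a fixed logistic squashing centered at 0.5; this is only an illustration of the idea (the paper *learns* a feature transformation rather than fixing one, and the gain value here is arbitrary):

```python
import math

def off_center(s, gain=10.0):
    # Push a similarity s in [0, 1] away from the 0.5 center:
    # values below 0.5 tend toward 0, values above 0.5 tend toward 1.
    # A fixed logistic with a steep gain stands in for the learned transform.
    return 1.0 / (1.0 + math.exp(-gain * (s - 0.5)))
```

A similarity of exactly 0.5 is left unchanged, while similarities on either side are driven toward the respective extreme, which is the Min-Max behaviour the citing paper points to.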