2020
DOI: 10.1016/j.patrec.2020.03.008

Metric Learning from Imbalanced Data with Generalization Guarantees

Citation Types: 0 supporting, 14 mentioning, 0 contrasting

Year Published: 2020–2024

Cited by 25 publications (14 citation statements)
References 25 publications

“…For the datasets with imbalance ratio ≤ 6, we have compared the results, using the F1 score, with other standard methods frequently used for the class imbalance problem in Table 10. The methods include RU [47], ENN [70], NMU [7], CNN [55], prototype generation using K-means clustering (PK) [57], SMOTE [14], Imbalanced Metric Learning (IML) [29], which follows a Mahalanobis metric learning algorithm [63], RTHS [1], and AβBSF [31]. Some of these methods are very popular and useful for dealing with the class imbalance problem.…”
Section: Comparison With State-of-the-art Methods (mentioning)
confidence: 99%
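
The IML method referenced above learns a Mahalanobis metric, i.e. a distance of the form d_M(x, x') = sqrt((x − x')ᵀ M (x − x')) parameterized by a PSD matrix M. As a minimal sketch of that distance (the matrix M is given here rather than learned, and the function name is illustrative, not taken from the cited implementation):

```python
import numpy as np

def mahalanobis_distance(x, x_prime, M):
    """d_M(x, x') = sqrt((x - x')^T M (x - x')) for a PSD matrix M.

    Metric learners such as IML fit M to the data; here M is fixed.
    """
    diff = x - x_prime
    return float(np.sqrt(diff @ M @ diff))

# With M = I the distance reduces to the ordinary Euclidean distance.
x = np.array([1.0, 2.0])
y = np.array([2.0, 0.0])
print(mahalanobis_distance(x, y, np.eye(2)))  # sqrt(5) ≈ 2.236
```
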
“…Yu et al. [67] worked on accelerating evolutionary computation, Cheng et al. [17] introduced various model-based evolutionary algorithms (MBEAs), and He et al. [34] proposed evolutionary multi-objective optimization and applied it to real-world applications. Gautheron et al. [29] introduced the Mahalanobis metric learning (IML) algorithm. The authors used datasets such as Pima, Balance, Splice, and Heart to evaluate their model's performance.…”
Section: Literature Survey (mentioning)
confidence: 99%

“…4) to improve the data space (through relabeling and regrouping the neighbouring data blocks), which separates data samples with different class labels by a large margin and brings samples with the same class label close to each other; 2) calculate the distance between each sample in the training set and the testing samples; 3) run this and the previous steps through multiple iterations, controlled by a predefined matching ratio. [62] shows that IML yields a classification performance bound but requires the learned matrix to conform to the Positive Semi-Definite (PSD) constraint; its performance is still unknown for non-linear metrics.…”
Section: Distance Metric Learning Methods (mentioning)
confidence: 99%
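
The PSD constraint mentioned in the statement above is typically enforced by projecting the learned matrix back onto the PSD cone after each update. A minimal sketch, assuming the standard eigenvalue-clipping projection (a common choice, not necessarily the one used by IML):

```python
import numpy as np

def project_psd(M):
    """Project a square matrix onto the PSD cone by symmetrizing it
    and clipping negative eigenvalues to zero."""
    M = (M + M.T) / 2.0                       # symmetrize first
    eigvals, eigvecs = np.linalg.eigh(M)
    eigvals = np.clip(eigvals, 0.0, None)     # drop negative eigenvalues
    return eigvecs @ np.diag(eigvals) @ eigvecs.T

# Example: a matrix with eigenvalues 5 and -1 gets projected.
A = np.array([[2.0, 3.0], [3.0, 2.0]])
print(np.linalg.eigvalsh(project_psd(A)))    # [0., 5.]
```
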
“…Precision, recall, F1-score, and AUC are frequently used as evaluation metrics when dealing with imbalanced data [32]. Precision and recall measure exactness and completeness, respectively, and exhibit an inverse relationship.…”
Section: Adam Optimizer and Learning Rate Decay (mentioning)
confidence: 99%
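
As a toy illustration of these metrics (the labels and scores below are made up for demonstration; computed with scikit-learn):

```python
from sklearn.metrics import precision_score, recall_score, f1_score, roc_auc_score

# Imbalanced toy data: 8 negatives, 2 positives.
y_true  = [0, 0, 0, 0, 0, 0, 0, 0, 1, 1]
y_pred  = [0, 0, 0, 0, 0, 1, 0, 0, 1, 0]
y_score = [0.1, 0.2, 0.1, 0.3, 0.2, 0.6, 0.1, 0.2, 0.9, 0.4]

print("precision:", precision_score(y_true, y_pred))  # TP/(TP+FP) = 1/2
print("recall:   ", recall_score(y_true, y_pred))     # TP/(TP+FN) = 1/2
print("F1:       ", f1_score(y_true, y_pred))         # harmonic mean = 0.5
print("AUC:      ", roc_auc_score(y_true, y_score))   # rank-based, uses scores
```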