2018
DOI: 10.1109/tkde.2018.2791525
Heterogeneous Metric Learning of Categorical Data with Hierarchical Couplings

Abstract: Learning an appropriate metric is critical for effectively capturing complex data characteristics. Metric learning for categorical data with hierarchical coupling relationships and local heterogeneous distributions is very challenging yet rarely explored. This paper proposes Heterogeneous mEtric Learning with hIerarchical Couplings (HELIC) for this type of categorical data. HELIC captures both low-level value-to-attribute and high-level attribute-to-class hierarchical couplings, and reveals the intrinsic het…
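The hierarchical couplings mentioned in the abstract can be illustrated with a small sketch. Note this is a deliberately simplified illustration of the general idea (estimating attribute-to-class couplings from empirical co-occurrence statistics), not the actual HELIC formulation; the dataset and function names are hypothetical.

```python
from collections import Counter

# Toy categorical dataset: each row is (attribute values, class label).
# Purely illustrative; not data or code from the HELIC paper.
rows = [
    (("red", "small"), "A"),
    (("red", "large"), "A"),
    (("blue", "small"), "B"),
    (("blue", "large"), "B"),
    (("red", "small"), "B"),
]

def value_class_coupling(rows, attr_idx):
    """Empirical P(class | value) for one attribute: a crude stand-in
    for the attribute-to-class coupling described in the abstract."""
    pair_counts = Counter()
    value_counts = Counter()
    for attrs, label in rows:
        v = attrs[attr_idx]
        pair_counts[(v, label)] += 1
        value_counts[v] += 1
    return {vc: n / value_counts[vc[0]] for vc, n in pair_counts.items()}

coupling = value_class_coupling(rows, 0)
# coupling[("blue", "B")] == 1.0: "blue" only ever co-occurs with class B
# coupling[("red", "A")] == 2/3: "red" occurs with A twice, with B once
```

Couplings of this kind give each categorical value a class-conditional profile, which downstream metric learning can then compare instead of treating distinct values as uniformly dissimilar.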


citations
Cited by 38 publications
(22 citation statements)
references
References 24 publications
0
22
0
Order By: Relevance
“…Learning approaches learn the distance between each pair of categorical levels or a mapping function from each level to a real value by minimizing the classification error [8,32]. More recently, large margin-based metric learning methods have been adapted for ordinal and nominal variables [23,37]. Building on the assumption that an ordinal variable represents a continuous latent variable that falls into an interval of values, [23] jointly learns the Mahalanobis distance, thresholds of intervals, and parameters of the latent variable distribution.…”
Section: Related Work
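The Mahalanobis-distance learning discussed in this excerpt can be sketched as follows. This is a generic illustration of the family of methods (a distance under a learned positive semi-definite matrix, applied to real-valued encodings of categorical levels), not the specific algorithm of [23] or [37]; the random transform stands in for whatever a real learner would produce.

```python
import numpy as np

# A Mahalanobis distance under M = L^T L; M is positive semi-definite by
# construction, so the square root below is always well defined.
rng = np.random.default_rng(0)
L = rng.normal(size=(2, 2))   # stand-in for a *learned* linear transform
M = L.T @ L

def mahalanobis(x, y, M):
    """sqrt((x - y)^T M (x - y)): the learned (pseudo)metric."""
    d = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.sqrt(d @ M @ d))

# One-hot encodings of the two levels of a binary categorical attribute:
# the learned M determines how dissimilar the two levels are.
x, y = [1.0, 0.0], [0.0, 1.0]
dist = mahalanobis(x, y, M)
```

In the cited methods, M (or the mapping from categorical levels to vectors) is fit by minimizing classification error or a large-margin objective rather than drawn at random as here.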
“…As the number of thresholds is determined by the number of variables and levels within them, the method may involve a large number of parameters and suffer from overfitting. [37] represents the categorical data by computing the interaction between levels, between variables, and between variables and classes, followed by learning the Mahalanobis distance in a kernel space. However, it ignores the natural ordering of ordinal variables.…”
Section: Related Work
“…Also, some of the algorithms required further solutions, especially for hard categorical data with coupled classification and frequency relationships, such as the one by Zhu on heterogeneous metric learning with hierarchical couplings, evaluated on 30 datasets taken from various fields (Zhu et al, 2018). On the other hand, this kind of algorithm has faced some limitations too, such as its inability to handle some data properties and controlled knowledge.…”
Section: Big Data Analysis
“…The existing ROPUF-based TRNG module will be an efficient method to generate these random numbers, thereby increasing the efficiency and security of the system, as it maintains randomness in that no two generated bit-streams can match consecutively, and the probability of decoding a 16-bit randomly generated bit stream is very meager. These advantages are inferred from the NIST tests performed, using the Hamming distance as a metric for comparison [11].…”
Section: Pseudo Algorithm
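The Hamming-distance comparison this excerpt mentions can be sketched in a few lines. This is a minimal illustration of comparing 16-bit bitstreams, not the cited NIST test suite or the ROPUF design itself.

```python
# Hamming distance between two 16-bit bitstreams: the number of bit
# positions in which they differ. Used here as the comparison metric
# the excerpt describes for consecutive TRNG outputs.
def hamming16(a: int, b: int) -> int:
    return bin((a ^ b) & 0xFFFF).count("1")

# Two consecutive TRNG outputs should not match: a nonzero distance.
stream1, stream2 = 0b1010101010101010, 0b0101010101010101
print(hamming16(stream1, stream2))  # all 16 bit positions differ -> 16

# The "meager" probability of guessing a 16-bit stream is 1 / 2**16.
guess_probability = 1 / 2**16  # ~1.53e-5
```

A practical randomness check would compare many consecutive streams and feed them to the full NIST statistical tests; the Hamming distance only captures pairwise bit-level disagreement.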