2022
DOI: 10.1109/tcyb.2020.3008248
Kullback–Leibler Divergence Metric Learning

Cited by 68 publications (17 citation statements)
References 47 publications
“…The results will then be compared to a distribution representing an equal superposition of each eighth-note subdivision. This is done by computing the Kullback-Leibler Divergence (KLD) [16] between the averaged distribution of each encoding method and the equal superposition produced by the quantum circuit in figure 17. Figure 18 shows the results for the spinal cord register on both the simulator and the IBMQ Manhattan quantum computer, for both the Static and PKBSE encoding methods.…”
Section: Phase Kick Back Results and Analysis
confidence: 99%
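The statement above compares a measured histogram against a uniform ("equal superposition") reference via the KLD. A minimal sketch of that comparison for discrete distributions (the distribution values here are illustrative, not taken from the cited paper):

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) between two discrete distributions.

    Non-negative and asymmetric: D(p || q) != D(q || p) in general.
    """
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # Normalize raw histograms into probability distributions.
    p = p / p.sum()
    q = q / q.sum()
    # Clip to avoid log(0) on empty histogram bins.
    p = np.clip(p, eps, 1.0)
    q = np.clip(q, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

# Hypothetical measured counts over 8 eighth-note subdivisions,
# compared against the uniform (equal-superposition) reference.
measured = [0.30, 0.20, 0.15, 0.10, 0.10, 0.05, 0.05, 0.05]
uniform = [1.0 / 8] * 8
print(kl_divergence(measured, uniform))
```

A KLD of 0 indicates the measured distribution exactly matches the uniform reference; larger values indicate greater deviation from equal superposition.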
“…2.4.3. Jensen-Shannon Divergence. JS divergence [24] is a symmetric divergence measure based on the Kullback-Leibler divergence [25]. When computing divergences between histograms, a larger divergence indicates a weaker correlation and lower similarity between the histograms.…”
Section: Chi-square Distance
confidence: 99%
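The symmetrization the statement refers to can be sketched as follows: JS divergence averages two KL terms, each taken against the mixture of the two distributions (a standard construction, not code from the cited papers):

```python
import numpy as np

def _kl(p, q):
    # KL divergence in bits (log base 2); skip zero-probability bins.
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def js_divergence(p, q):
    """Jensen-Shannon divergence: symmetric, and bounded in [0, 1] with log2."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    m = 0.5 * (p + q)  # mixture distribution
    return 0.5 * _kl(p, m) + 0.5 * _kl(q, m)
```

Unlike plain KL, `js_divergence(p, q) == js_divergence(q, p)`, which is what makes it usable as a symmetric histogram-similarity measure.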
“…This shows that distance functions play an important role in everyday life [8]. Common distance functions include Euclidean distance [9], Manhattan distance [10], Chebyshev distance [11], standardized Euclidean distance [12], Mahalanobis distance [13], Bhattacharyya distance [14], Kullback-Leibler divergence [15], Hamming distance [16], and cosine distance [17]. Next, we introduce these distance functions in detail.…”
Section: Traditional Distance Functions
confidence: 99%
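A few of the distance functions listed above can be sketched compactly; this is an illustrative summary under standard definitions, not an excerpt from the cited survey:

```python
import numpy as np

def euclidean(x, y):
    """L2 distance: straight-line distance between points."""
    return float(np.sqrt(np.sum((np.asarray(x) - np.asarray(y)) ** 2)))

def manhattan(x, y):
    """L1 distance: sum of absolute coordinate differences."""
    return float(np.sum(np.abs(np.asarray(x) - np.asarray(y))))

def chebyshev(x, y):
    """L-infinity distance: largest coordinate difference."""
    return float(np.max(np.abs(np.asarray(x) - np.asarray(y))))

def cosine_distance(x, y):
    """1 minus cosine similarity: measures angle, not magnitude."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    return 1.0 - float(np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y)))
```

For example, between (0, 0) and (3, 4) the Euclidean distance is 5, the Manhattan distance 7, and the Chebyshev distance 4, illustrating how the three norms weight coordinate differences differently.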