2017
DOI: 10.3390/e19050216
Designing Labeled Graph Classifiers by Exploiting the Rényi Entropy of the Dissimilarity Representation

Abstract: Representing patterns as labeled graphs is becoming increasingly common in the broad field of computational intelligence. Accordingly, a wide repertoire of pattern recognition tools, such as classifiers and knowledge discovery procedures, are nowadays available and tested for various datasets of labeled graphs. However, the design of effective learning procedures operating in the space of labeled graphs is still a challenging problem, especially from the computational complexity viewpoint. In this paper, we pr…

Cited by 4 publications (2 citation statements)
References 54 publications
“…An edge e_ij connecting x_i and x_j is weighted by their squared distance, |e_ij| = d^2(x_i, x_j). The α-order Rényi entropy (1) can be estimated according to a geometric interpretation of an entropic spanning graph of G. Examples of such graphs used in the literature are the MST, k-NN graph, Steiner tree, and TSP graph [9,14,25,36,42,44–46,57]. In this paper, we will focus on the MST [8,44].…”
Section: Graph-Based Entropy Estimation
confidence: 99%
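As a rough illustration of the MST-based approach referenced in this excerpt, the following is a minimal Python sketch of an α-order Rényi entropy estimator in the Hero–Michel style. It assumes Euclidean data in R^d, uses the edge exponent γ = d(1−α), and treats the bias constant β as a user-supplied placeholder; the function name renyi_entropy_mst and the default parameter values are illustrative, not taken from the cited papers.

import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.sparse.csgraph import minimum_spanning_tree

def renyi_entropy_mst(X, alpha=0.5, beta=1.0):
    # Estimate the alpha-order Renyi entropy of the samples in X (n x d)
    # from the gamma-weighted length of the Euclidean MST, gamma = d * (1 - alpha).
    n, d = X.shape
    gamma = d * (1.0 - alpha)
    D = squareform(pdist(X, metric="euclidean"))   # pairwise distance matrix
    mst = minimum_spanning_tree(D)                 # sparse MST over the complete graph
    L = np.power(mst.data, gamma).sum()            # total gamma-weighted MST length
    return (np.log(L / n**alpha) - np.log(beta)) / (1.0 - alpha)

# Example: entropy estimate for a 2-D Gaussian sample.
rng = np.random.default_rng(0)
print(renyi_entropy_mst(rng.normal(size=(500, 2)), alpha=0.5))

Since β depends only on the dimension and γ (not on the data distribution), it acts as a constant offset when comparing samples of the same dimensionality, which is why it is left as a placeholder here.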
“…In current implementations [20,24,25], the ODSE model is optimized through a genetic algorithm (GA). The GA operates by performing roulette-wheel selection, two-point crossover, and random mutation on the variables representing the model parameters; it also implements an elitism strategy that carries the fittest individual over to the next population.…”
Section: A Quick Look Into the ODSE Design
confidence: 99%
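A minimal sketch of the GA loop described above (roulette-wheel selection, two-point crossover, random mutation, and elitism), assuming a toy real-valued encoding and a placeholder fitness function; the actual ODSE parameter encoding and objective are not shown in the quoted text.

import numpy as np

rng = np.random.default_rng(42)

def fitness(ind):
    # Placeholder objective (not the ODSE criterion): prefer genes close to 0.5.
    return 1.0 / (1.0 + np.sum((ind - 0.5) ** 2))

def roulette_select(pop, fits):
    # Roulette-wheel selection: pick an individual with probability proportional to fitness.
    return pop[rng.choice(len(pop), p=fits / fits.sum())]

def two_point_crossover(a, b):
    # Copy parent a and splice in the segment of parent b between two random cut points.
    i, j = sorted(rng.choice(len(a), size=2, replace=False))
    child = a.copy()
    child[i:j] = b[i:j]
    return child

def mutate(ind, rate=0.1):
    # Random mutation: resample each gene independently with the given probability.
    mask = rng.random(len(ind)) < rate
    ind[mask] = rng.random(mask.sum())
    return ind

def evolve(pop_size=30, genes=8, generations=50):
    pop = rng.random((pop_size, genes))
    for _ in range(generations):
        fits = np.array([fitness(ind) for ind in pop])
        elite = pop[fits.argmax()].copy()   # elitism: the fittest individual survives unchanged
        children = [mutate(two_point_crossover(roulette_select(pop, fits),
                                               roulette_select(pop, fits)))
                    for _ in range(pop_size - 1)]
        pop = np.vstack([elite] + children)
    return pop[np.argmax([fitness(ind) for ind in pop])]

print(evolve())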