2007 International Conference on Convergence Information Technology (ICCIT 2007) 2007
DOI: 10.1109/iccit.2007.390

Comparing Ontologies Using Entropy

Cited by 7 publications (5 citation statements). References 2 publications.
“…There are some indications though. Cho et al. (2007) found an average entropy of 17.48 for domain ontologies in WordNet. We found for the basic-level ontology an entropy of 10.67 and for the restricted expert-based ontology without homonyms an entropy of 26.82.…”
Section: Discussion
confidence: 99%
“…The entropy of an ontology can be computed by taking the number of relations a concept has and summing over the probabilities of these relations (Calmet & Daemi, 2004; Cho et al., 2007; Doran et al., 2009; Resnik, 1995, 1999). It is a measure based on the information content of an ontology (Scharrenbach et al., 2010; Coskun et al., 2011; Palmisano et al., 2009).…”
Section: Entropy
confidence: 99%
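The statement above describes a degree-based entropy: each concept's probability is derived from its share of the ontology's relations, and these probabilities are summed Shannon-style. A minimal sketch of that reading, assuming a simple dict-of-lists relation structure (the function name, the toy ontology, and the degree-based probability estimate are illustrative assumptions, not the exact formulation of the cited papers):

```python
import math

def ontology_entropy(relations):
    """Shannon entropy of an ontology from its relation structure.

    `relations` maps each concept to the concepts it points to.
    Under the degree-based reading sketched here, a concept's
    probability is its share of all outgoing relations.
    """
    degrees = {c: len(targets) for c, targets in relations.items()}
    total = sum(degrees.values())
    entropy = 0.0
    for d in degrees.values():
        if d == 0:
            continue  # concepts with no relations contribute nothing
        p = d / total
        entropy -= p * math.log2(p)
    return entropy

# Tiny toy ontology: a root with two children, one grandchild.
toy = {
    "entity": ["animal", "plant"],
    "animal": ["dog"],
    "plant": [],
    "dog": [],
}
print(round(ontology_entropy(toy), 4))  # entropy of the toy hierarchy
```

Ontologies whose relations are spread evenly across many concepts score higher under this measure, which is consistent with the spread of values (10.67 vs. 26.82) reported in the Discussion statement above.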
“…Here, a probabilistic generalization is created for each data point within the set, and the length of each node is generated based on the probability of a given node [column]. As nodes in the hierarchical structure increase, the overall information representation increases in tandem [19]. Algorithm 4 presents our method for producing the intersections found in our ontological representation of the mixed data-type dataset, as seen in 6, where the final fraction will be raised to a measured coefficient power.…”
Section: B. Completeness Graphing
confidence: 99%
“…Distance between two concepts is a numerical representation of how far apart two concepts are from one another in some geometric space, and can be considered the inverse of semantic similarity [15]. The paper [4] compared ontologies using entropy which shows structural features of ontology as the average of information content. However, these two kinds of method have their own limitations.…”
Section: Computing Semantic Similarity
confidence: 99%
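The statement above treats concept distance as the inverse of semantic similarity. The cited papers do not fix a specific formula, so the mapping below is only one common illustrative convention (the function name is an assumption):

```python
def similarity_from_distance(distance):
    """Map a non-negative concept distance to a (0, 1] similarity.

    This inverse-distance form (1 / (1 + d)) is one common convention
    for turning a graph or geometric distance into a similarity score;
    distance 0 yields maximal similarity 1.0, and similarity decays
    toward 0 as concepts move farther apart.
    """
    return 1.0 / (1.0 + distance)

print(similarity_from_distance(0))  # identical concepts
print(similarity_from_distance(3))  # three edges apart in a taxonomy
```

The entropy-based comparison discussed in this report is complementary: it summarizes an ontology's structure as a whole, whereas a distance-based similarity compares individual concept pairs.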