2012
DOI: 10.2478/bile-2013-0011
The use of information and information gain in the analysis of attribute dependencies

Abstract: This paper demonstrates the conclusions that can be drawn from an analysis of entropy and information. Because of its universality, entropy can be widely used across disciplines, especially in biomedicine. Based on simulated data, the similarities and differences between the grouping of attributes and the testing of their independence are shown. It follows that a complete exploration of data sets requires both of these elements. A new concept introduced in this paper is that of normed information gain…
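The quantities the abstract refers to, entropy and information gain, can be sketched concretely. The following is a minimal Python illustration using the standard Shannon definitions; the function names and the normalisation by H(Y) are assumptions for illustration (the abstract's description of the paper's own normed information gain is truncated, so its exact norming may differ).

```python
from collections import Counter
from math import log2

def entropy(values):
    """Shannon entropy H(X) = -sum p * log2(p), in bits."""
    n = len(values)
    return -sum((c / n) * log2(c / n) for c in Counter(values).values())

def information_gain(xs, ys):
    """IG(Y; X) = H(Y) - H(Y | X): the reduction in uncertainty
    about ys once the value of xs is known (mutual information)."""
    n = len(ys)
    h_cond = 0.0
    for x in set(xs):
        sub = [y for xi, y in zip(xs, ys) if xi == x]
        h_cond += len(sub) / n * entropy(sub)
    return entropy(ys) - h_cond

def normed_information_gain(xs, ys):
    """One common normalisation (an assumption, not necessarily the
    paper's): divide the gain by H(Y), so the result lies in [0, 1]."""
    h = entropy(ys)
    return information_gain(xs, ys) / h if h > 0 else 0.0
```

With this convention, a fully dependent attribute pair yields a normed gain of 1, while independent attributes yield 0, which is the intuition behind using information gain to test attribute dependencies.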

Cited by 3 publications (4 citation statements). References 15 publications.
“…Hydrological systems are characterised with a level of uncertainty [ 1 , 2 ], dispersion or compactness [ 3 , 4 ], uniformity or concentration [ 5 ]. For example, higher entropy is associated with higher dispersion [ 3 , 4 ].…”
Section: Introduction
confidence: 99%
“…From an information theory perspective, Shannon entropy is the average uncertainty of a random variable and gives, on average, the minimum number of bits needed to characterise the random variable [ 6 ]. In other words, entropy is the expected value of a random variable called information and is based on the event’s probability [ 1 , 2 ]. The expected surprise about the truth can be an interpretation of entropy as a measure of uncertainty [ 7 ] or ignorance [ 8 ].…”
Section: Introduction
confidence: 99%
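The "minimum number of bits" reading of entropy quoted above is easy to check numerically. A short sketch (the helper name is my own):

```python
from math import log2

def entropy_bits(probs):
    """Shannon entropy of a discrete distribution, in bits."""
    return -sum(p * log2(p) for p in probs if p > 0)

entropy_bits([0.5, 0.5])  # fair coin: 1 bit per outcome
entropy_bits([0.9, 0.1])  # biased coin: fewer bits needed on average
entropy_bits([1.0])       # certain outcome: 0 bits
```

A fair coin needs exactly 1 bit per toss, a 90/10 coin needs about 0.47 bits on average, and a certain outcome carries no information, matching the interpretation of entropy as expected surprise.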
“…It should be noted that there are different measures of dependency (Jakulin, 2005; Moliński et al., 2012), but in this paper we focus on a measure proposed by Joe (1989). According to Joe (1989), the quotient…”
Section: Measure of Dependency
confidence: 99%