2004
DOI: 10.1007/978-3-540-30547-7_20
Employing Maximum Mutual Information for Bayesian Classification

Abstract: In order to employ machine learning in realistic clinical settings we are in need of algorithms which show robust performance, producing results that are intelligible to the physician. In this article, we present a new Bayesian-network learning algorithm which can be deployed as a tool for learning Bayesian networks, aimed at supporting the processes of prognosis or diagnosis. It is based on a maximum (conditional) mutual information criterion. The algorithm is evaluated using a high-quality clinical dataset c…
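The maximum (conditional) mutual information criterion named in the abstract can be illustrated with a small empirical estimator over discrete samples. The function below is a hedged sketch of the underlying quantity, not the paper's implementation; the plug-in counting approach and the function name are assumptions:

```python
from collections import Counter
from math import log2

def conditional_mutual_information(xs, ys, zs):
    """Empirical I(X; Y | Z) in bits from parallel lists of discrete samples.

    Plug-in estimate: I(X;Y|Z) = sum over (x,y,z) of
        p(x,y,z) * log2[ p(z) p(x,y,z) / (p(x,z) p(y,z)) ]
    Counts are used directly, so the n factors cancel inside the log.
    """
    n = len(xs)
    c_xyz = Counter(zip(xs, ys, zs))
    c_xz = Counter(zip(xs, zs))
    c_yz = Counter(zip(ys, zs))
    c_z = Counter(zs)
    cmi = 0.0
    for (x, y, z), c in c_xyz.items():
        cmi += (c / n) * log2((c_z[z] * c) / (c_xz[(x, z)] * c_yz[(y, z)]))
    return cmi
```

A structure-learning loop in the spirit of the abstract would score candidate parent sets by such a quantity and greedily keep the arcs that maximize it; the exact search strategy used by the authors is not recoverable from this excerpt.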

Cited by 3 publications (1 citation statement)

References 9 publications
“…Information theory has been applied to BNs for the purpose of improving the training process and for performance analysis [14,28,44,45]. For instance, the cross-entropy measure between a trained BN and a known underlying probability distribution is utilized in [45] for evaluating and comparing the learning performance of several training algorithms.…”
Section: Development Of Bayesian Network Performance Metrics
confidence: 99%
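The cross-entropy measure described in this citation statement is commonly realized as the Kullback-Leibler divergence between a known underlying distribution and the distribution encoded by a trained network. A minimal sketch of that comparison follows; the function name and the toy distributions are assumptions, not taken from the cited work:

```python
from math import log2

def kl_divergence(p, q):
    """D_KL(p || q) in bits, for distributions given as dicts over the same outcomes.

    Measures how far the learned distribution q is from the true distribution p;
    it is zero exactly when the two distributions agree.
    """
    return sum(px * log2(px / q[x]) for x, px in p.items() if px > 0)

# Hypothetical example: a true distribution and one recovered by a trained BN.
true_dist = {"a": 0.5, "b": 0.5}
learned = {"a": 0.75, "b": 0.25}
gap = kl_divergence(true_dist, learned)  # > 0: the learned model deviates from the truth
```

Under this reading, comparing several training algorithms amounts to training each, computing this divergence against the known distribution, and preferring the algorithm with the smaller value.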