2008
DOI: 10.1080/02691720802576291
Minimum Message Length and Statistically Consistent Invariant (Objective?) Bayesian Probabilistic Inference—From (Medical) “Evidence”

Author: David Dowe (David.Dowe@infotech.monash.edu.au)

Abstract: "Evidence" in the form of data collected and analysis thereof is fundamental to medicine, health and science. In this paper, we discuss the "evidence-based" aspect of evidence-based medicine in terms of statistical inference, acknowledging that this latter field of statistical inference often also goes by various near-synonymous names, such as inductive inference (amongst philosophers), econometrics (amongst economists), machine learning (amongst computer scientists) …

Cited by 9 publications (3 citation statements). References 47 publications.
“…For performance evaluation, we predicted AMR for the testing set using the fine-tuned models. Performance metrics included accuracy, AUROC, F1-score, precision, recall, and log-loss [48]. In addition, we reported results of true negative (true susceptible), false negative (false susceptible), true positive (true resistant), and false positive (false resistant).…”
Section: Methods (citation type: mentioning; confidence: 99%)
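The metrics listed in this excerpt (accuracy, precision, recall, F1-score, AUROC, log-loss) can all be computed directly from true labels and predicted probabilities. A minimal sketch in plain Python, using made-up toy data and the excerpt's labeling convention (1 = resistant, 0 = susceptible); the specific values are illustrative only:

```python
import math

# Hypothetical toy data (assumed for illustration): 1 = resistant, 0 = susceptible
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_prob = [0.9, 0.2, 0.7, 0.4, 0.1, 0.6, 0.8, 0.3]
y_pred = [1 if p >= 0.5 else 0 for p in y_prob]  # threshold at 0.5

# Confusion counts: TP = true resistant, TN = true susceptible,
# FP = false resistant, FN = false susceptible (as in the excerpt)
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)

accuracy = (tp + tn) / len(y_true)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)

# Log-loss: mean negative log-likelihood of the true labels
log_loss = -sum(t * math.log(p) + (1 - t) * math.log(1 - p)
                for t, p in zip(y_true, y_prob)) / len(y_true)

# AUROC via the pairwise-ranking definition: the probability that a
# randomly chosen positive is scored above a randomly chosen negative
pos = [p for t, p in zip(y_true, y_prob) if t == 1]
neg = [p for t, p in zip(y_true, y_prob) if t == 0]
auroc = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg) / (len(pos) * len(neg))
```

In practice a library such as scikit-learn provides these metrics directly; the point of the sketch is that each one is a simple function of the (label, probability) pairs on the test set.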
“…Specifically, we will consider a scenario in which the feedback available to the learning algorithm at each point is not only ∆ n , the information of which rounds it had "won" and which it had "lost", but also O = (n) and O = (n), what the bit output by each machine was, at every step. 2 In this scenario, Player "=" can calculate a co-R.E. function by calculating its complement in round n and then reading the result as the complement to O = (n), which is given to it in all later rounds.…”
Section: Conventional Learnability (citation type: mentioning; confidence: 99%)
“…Partly in response to Searle's "Chinese room" argument [19], we also raise the issue of compression as a non-behavioural (introspective) indicator of intelligence, i.e., given two agents who have scored equally well on a test and one of which compresses better than the other, which should we prefer [5, 3 We compare this to other purely behavioural ways of assessing and detecting intelligence.…”
(citation type: mentioning; confidence: 99%)