2020
DOI: 10.1016/j.ijar.2020.01.010

The three-way-in and three-way-out framework to treat and exploit ambiguity in data

Abstract: In this paper, we address ambiguity, intended as a characteristic of any data expression for which a unique meaning cannot be associated by the computational agent for either lack of information or multiple interpretations of the same configuration. In particular, we will propose and discuss ways in which a decision-support classifier can accept ambiguous data and make some (informative) value out of them for the decision maker. Towards this goal we propose a set of learning algorithms within what we call the …

Cited by 35 publications (18 citation statements)
References 40 publications
“…-Decision Tree [40]. We also considered a modification of the Random Forest algorithm, called the three-way Random Forest classifier [7] (TWRF), which allows the model to abstain on instances for which it can express only low confidence; in so doing, a TWRF achieves higher accuracy on the effectively classified instances at the expense of coverage (i.e., the number of instances on which it makes a prediction). We also decided to consider this class of models, as they could provide more reliable predictions in a large part of the cases, while exposing the uncertainty regarding the other cases so as to suggest further (and more expensive) tests on them.…”
Section: Model Training, Selection and Evaluation
confidence: 99%
“…For each model class, we considered two versions: a standard one, and the three-way version (a model that abstains from prediction when the confidence score is below 75%) [31]. For each of these two versions, the model selection, training and evaluation pipeline was implemented for each of the three datasets mentioned above (the OSR dataset, COVID-specific dataset, and CBC dataset).…”
Section: Machine Learning Experimental Design
confidence: 99%
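The abstention rule quoted above (predict only when the confidence score reaches 75%, otherwise withhold a decision) can be sketched as follows. This is a minimal illustration of the general three-way decision rule, not the authors' implementation; the function names and the toy probabilities are illustrative assumptions.

```python
# Minimal sketch of a three-way abstaining decision rule: the classifier
# predicts only when its confidence (the highest class probability) reaches
# a threshold (75% in the cited setup), and abstains otherwise.
# Abstention trades coverage for higher accuracy on the cases it does answer.

def three_way_predict(class_probs, threshold=0.75):
    """Return the index of the most probable class, or None to abstain."""
    best = max(range(len(class_probs)), key=lambda i: class_probs[i])
    return best if class_probs[best] >= threshold else None

def coverage_and_accuracy(prob_list, labels, threshold=0.75):
    """Coverage = fraction of instances classified; accuracy on those only."""
    decided = [(three_way_predict(p, threshold), y)
               for p, y in zip(prob_list, labels)]
    answered = [(pred, y) for pred, y in decided if pred is not None]
    coverage = len(answered) / len(decided)
    accuracy = (sum(pred == y for pred, y in answered) / len(answered)
                if answered else float("nan"))
    return coverage, accuracy

# Toy example: four instances, two classes; two fall below the threshold.
probs = [[0.9, 0.1], [0.6, 0.4], [0.2, 0.8], [0.55, 0.45]]
labels = [0, 0, 1, 1]
cov, acc = coverage_and_accuracy(probs, labels)
# Only the two high-confidence instances are classified (both correctly),
# so coverage is 0.5 and accuracy on the covered instances is 1.0.
```

In a three-way Random Forest, the confidence score would come from the forest's class-probability estimates; the rule itself is the same thresholding shown here.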
“…In addition, the users can decide whether they want an indication from the system, irrespective of the confidence in the advice given, or if they would prefer only to be advised about high-confidence indications, as the three-way approach allows for. This approach was specifically developed to mitigate the risk of automation bias and the odds of machine-induced errors [31].…”
Section: Dataset
confidence: 99%
“…-Support Vector Machines [36] (SVM). We also considered a modification of the Random Forest algorithm, called the three-way Random Forest classifier [7] (TWRF), which allows the model to abstain on instances for which it can express only low confidence; in so doing, a TWRF achieves higher accuracy on the effectively classified instances at the expense of coverage (i.e., the number of instances on which it makes a prediction). We also decided to consider this class of models, as they could provide more reliable predictions in a large part of the cases, while exposing the uncertainty regarding the other cases so as to suggest further (and more expensive) tests on them.…”
Section: Model Training, Selection and Evaluation
confidence: 99%