2021
DOI: 10.48550/arxiv.2106.11119
Preprint

Graceful Degradation and Related Fields

Jack Dymond

Abstract: When machine learning models encounter data which is out of the distribution on which they were trained, they have a tendency to behave poorly, most prominently exhibiting over-confidence in erroneous predictions. Such behaviours will have disastrous effects on real-world machine learning systems. In this field, graceful degradation refers to the optimisation of model performance as it encounters this out-of-distribution data. This work presents a definition and discussion of graceful degradation and where it can be applie…
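The over-confidence the abstract describes is easy to reproduce. Below is a minimal, self-contained sketch (an illustration, not taken from the paper) in which a linear softmax classifier trained on two Gaussian blobs assigns near-certain probability to a point far outside its training distribution:

```python
# Minimal sketch of softmax over-confidence on out-of-distribution inputs.
# All data and values here are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# Toy in-distribution data: two Gaussian blobs in 2-D.
X = np.vstack([rng.normal(-2, 1, (100, 2)), rng.normal(2, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)

# Fit a linear softmax classifier by plain gradient descent.
W, b = np.zeros((2, 2)), np.zeros(2)
for _ in range(500):
    logits = X @ W + b
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = p.copy()
    grad[np.arange(len(y)), y] -= 1          # softmax cross-entropy gradient
    W -= 0.1 * X.T @ grad / len(y)
    b -= 0.1 * grad.mean(axis=0)

def confidence(x):
    """Max softmax probability for a single 2-D input."""
    logits = x @ W + b
    p = np.exp(logits - logits.max())
    return (p / p.sum()).max()

print(confidence(np.array([2.0, 2.0])))      # in-distribution: high confidence
print(confidence(np.array([100.0, 100.0])))  # far out-of-distribution: still ~1.0
```

The second input lies nowhere near the training data, yet the model's confidence saturates rather than degrading, which is exactly the failure mode graceful degradation aims to address.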

Cited by 2 publications (2 citation statements)
References 69 publications
“…This is partly due to the fact that there are two types of uncertainty—aleatory and epistemic uncertainty. In this case, the aleatory uncertainty cannot be eliminated, because it is an inherent part of the observed process or object ( 46 ). On the other hand, there are many calibration algorithms, each of which depends on a large number of hyperparameters.…”
Section: Discussion
confidence: 99%
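The calibration algorithms this citing paper alludes to are typically post-hoc. A minimal sketch of one widely used method, temperature scaling, is below; the single hyperparameter is the temperature T, which is assumed here to be fitted by grid search on hypothetical held-out validation logits:

```python
# Minimal sketch of temperature scaling, a common post-hoc calibration method.
# The validation logits/labels below are hypothetical placeholders.
import numpy as np

def softmax(z, T=1.0):
    z = z / T                                   # divide logits by temperature
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def nll(T, logits, labels):
    """Negative log-likelihood of the labels under temperature T."""
    p = softmax(logits, T)
    return -np.log(p[np.arange(len(labels)), labels]).mean()

def fit_temperature(logits, labels, grid=np.linspace(0.5, 5.0, 91)):
    # Simple grid search; a real implementation would use e.g. L-BFGS.
    return min(grid, key=lambda T: nll(T, logits, labels))

val_logits = np.array([[4.0, 0.5, 0.1], [3.5, 3.0, 0.2], [0.1, 0.2, 2.5]])
val_labels = np.array([0, 1, 2])
T = fit_temperature(val_logits, val_labels)
calibrated = softmax(val_logits, T)  # softer, less over-confident when T > 1
```

Note that temperature scaling only rescales confidence; it cannot remove the aleatory component of uncertainty the quote describes, since that is inherent to the observed process.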
“…Producing models which give confidence aware output distributions is one method of identifying both adversarial and OOD inputs, and these models have been studied in great detail [12,33,3]. Some methods incorporate additional data into their methodology [11,44], some use data augmentation/generation [30,25,20,36,41,45], while others use probabilistic models to adapt their loss functions [37,28,6,47].…”
Section: Related Work
confidence: 99%
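One simple way to obtain the confidence-aware output distributions this quote refers to is to average a small ensemble's softmax outputs and flag inputs whose predictive entropy is high. The sketch below is an illustrative assumption, not any cited paper's specific method; the threshold value is hypothetical and would in practice be tuned on held-out data:

```python
# Minimal sketch of entropy-based OOD flagging with a small ensemble.
# The ensemble logits and the threshold are illustrative assumptions.
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def predictive_entropy(member_logits):
    """member_logits: (n_members, n_classes) logits for one input."""
    p = softmax(member_logits).mean(axis=0)   # ensemble-averaged distribution
    return -(p * np.log(p + 1e-12)).sum()

def is_ood(member_logits, threshold=0.7):
    # threshold is a hypothetical value tuned on validation data in practice.
    return predictive_entropy(member_logits) > threshold

# Members agree confidently -> low entropy -> treated as in-distribution.
print(is_ood(np.array([[5.0, 0.1, 0.1], [4.8, 0.2, 0.0]])))   # False
# Members disagree -> high entropy -> flagged as OOD.
print(is_ood(np.array([[3.0, 0.1, 0.1], [0.1, 3.0, 0.1]])))   # True
```

Disagreement between ensemble members acts as a rough proxy for epistemic uncertainty, which is why the averaged distribution spreads out, and the entropy rises, on inputs unlike the training data.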