2019
DOI: 10.48550/arxiv.1912.08286
Preprint

On the Bias-Variance Tradeoff: Textbooks Need an Update

Cited by 4 publications (2 citation statements) · References 0 publications
“…This results in many predictions with high bias and low variance. For example, the bias-variance tradeoff for the KNN algorithm depends on the number of nearest neighbours used: low values can induce overfitting, while high values can induce underfitting (Valencia-Zapata et al. 2017; Neal 2019). For the SRKNN algorithm, we examine nearest-neighbour counts from 1 to 25; with more computation, future work could extend this range.…”
Section: Practicality of the Machine Learning Techniques Used
confidence: 99%
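The k-dependence described in the statement above can be sketched numerically. This is a minimal illustration on synthetic data (the data, functions, and value k=25 here are illustrative assumptions, not taken from the cited papers): a 1-NN regressor interpolates its training set exactly (low bias, high variance), while a large-k regressor averages toward the global mean (high bias, low variance).

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 40))
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.3, 40)  # noisy 1-D targets

def knn_predict(x_query, k):
    """Predict by averaging the targets of the k nearest training points."""
    dist = np.abs(x[None, :] - np.asarray(x_query)[:, None])
    idx = np.argsort(dist, axis=1)[:, :k]
    return y[idx].mean(axis=1)

# Training error: k=1 memorises the data; k=25 oversmooths it.
train_err_k1 = np.mean((knn_predict(x, 1) - y) ** 2)   # exactly 0.0 (each point is its own nearest neighbour)
train_err_k25 = np.mean((knn_predict(x, 25) - y) ** 2)  # substantially larger
print(train_err_k1, train_err_k25)
```

Zero training error at k=1 is the overfitting extreme; as k grows toward the training-set size, predictions collapse toward the sample mean, the underfitting extreme the quoted passage refers to.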
“…A common way to interpret the VC-dimension bound is in terms of the bias-variance tradeoff (Neal et al., 2018; Neal, 2019), a simple heuristic that is widely used in machine learning. The bias-variance tradeoff captures the intuition that there is a tension between a model being too simple, so that it cannot properly represent the data (large bias), and a model being too complex (large capacity), so that it tends to overfit, leading to large variance on unseen data.…”
Section: Comparing Existing Bounds Against Desiderata
confidence: 99%
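The tradeoff the statement above describes can be made concrete with a Monte-Carlo estimate of the bias-variance decomposition, E[(f̂(x) − f(x))²] = bias² + variance (+ noise). This is a minimal sketch on synthetic data (the true function, noise level, polynomial degrees, and evaluation point are illustrative assumptions, not from the cited works): a low-capacity model shows higher bias, a high-capacity one higher variance.

```python
import numpy as np

rng = np.random.default_rng(1)
f = lambda x: np.sin(2 * np.pi * x)   # assumed "true" function
x_train = np.linspace(0.0, 1.0, 15)
x_test = 0.3                          # single evaluation point

def fit_predict(degree):
    """Fit a polynomial to one noisy resample of the data; predict at x_test."""
    y = f(x_train) + rng.normal(0.0, 0.2, x_train.size)
    return np.polyval(np.polyfit(x_train, y, degree), x_test)

# Resample the training set many times; degree 1 = simple, degree 9 = complex.
preds = {d: np.array([fit_predict(d) for _ in range(500)]) for d in (1, 9)}
for d, p in preds.items():
    bias2 = (p.mean() - f(x_test)) ** 2   # squared bias at x_test
    var = p.var()                         # variance over resamples
    print(f"degree {d}: bias^2 = {bias2:.3f}, variance = {var:.3f}")
```

The linear model's predictions barely move across resamples but sit far from sin(2πx) at x = 0.3 (large bias, small variance); the degree-9 model tracks the truth on average but fluctuates with each noisy resample (small bias, larger variance).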