On the Bias-Variance Tradeoff: Textbooks Need an Update

  • Accumulating evidence that test error in neural networks does not follow a U-shaped curve suggests that it may not be necessary to trade bias for variance.
  • The bias-variance decomposition does not by itself imply a tradeoff; in neural networks, for example, both bias and variance decrease with network width in the over-parameterized regime.
  • We do not yet have a comprehensive picture of which models exhibit the classic U-shaped curve, double descent, or something else. The test error curve may well look different depending on how model complexity is varied.
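To make the decomposition in the second point concrete, here is a minimal sketch of how one might estimate bias and variance empirically by refitting a model on many resampled training sets. The target function, noise level, model class (polynomial regression), and all numeric settings are illustrative choices, not anything from the post:

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # True function (an arbitrary choice for illustration).
    return np.sin(2 * np.pi * x)

sigma = 0.3                 # assumed label-noise standard deviation
n_train, n_trials = 30, 500
x_test = np.linspace(0, 1, 50)

def fit_predict(degree):
    # Refit a polynomial on a fresh noisy training set each trial;
    # collect predictions on a fixed test grid.
    preds = np.empty((n_trials, x_test.size))
    for t in range(n_trials):
        x = rng.uniform(0, 1, n_train)
        y = f(x) + rng.normal(0, sigma, n_train)
        coef = np.polyfit(x, y, degree)
        preds[t] = np.polyval(coef, x_test)
    return preds

preds = fit_predict(degree=3)
# Squared bias: gap between the average prediction and the truth.
bias2 = np.mean((preds.mean(axis=0) - f(x_test)) ** 2)
# Variance: spread of predictions across training sets.
variance = np.mean(preds.var(axis=0))
# Expected test error decomposes as bias^2 + variance + noise.
expected_mse = np.mean((preds - f(x_test)) ** 2) + sigma ** 2
```

Sweeping `degree` in a sketch like this is one way to see whether bias and variance actually move in opposite directions for a given model class; the decomposition itself holds regardless of how the two terms behave.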

For more, see https://www.bradyneal.com/bias-variance-tradeoff-textbooks-update