Double Descent

Double descent is the phenomenon in which the test error of a model (particularly a deep neural network) first decreases, then increases, then decreases again as a function of the model's "size" (its number of parameters) or of the number of training epochs. The intermediate peak typically occurs near the interpolation threshold, where the model is just barely large enough to fit the training data. This behavior is surprising in light of the bias-variance trade-off associated with most "classical" ML models, which predicts that once a model is large enough to overfit, further increases in size should only raise the test error.
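A minimal sketch of the model-size version of the curve, in a toy setting not taken from the reference below: minimum-norm least squares (via the pseudoinverse) on random ReLU features of a 1-D regression problem, sweeping the number of features past the interpolation threshold (where the feature count reaches the number of training points). The exact shape of the resulting test-error curve depends on the data, noise level, and feature distribution; this is only an illustration, assuming NumPy is available.

import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression problem: few noisy training points.
n_train, n_test, noise = 20, 200, 0.2
x_train = rng.uniform(-1, 1, n_train)
x_test = np.linspace(-1, 1, n_test)
f = lambda x: np.sin(2 * np.pi * x)
y_train = f(x_train) + noise * rng.standard_normal(n_train)
y_test = f(x_test)

def random_relu_features(x, n_features, seed=0):
    # Map scalar inputs to n_features fixed random ReLU features.
    r = np.random.default_rng(seed)
    w = r.standard_normal(n_features)
    b = r.uniform(-1, 1, n_features)
    return np.maximum(0.0, np.outer(x, w) + b)

# Sweep the model "size" (number of random features) past the
# interpolation threshold (n_features ~ n_train).
for n_features in [2, 5, 10, 15, 20, 25, 40, 100, 500, 2000]:
    Phi_train = random_relu_features(x_train, n_features)
    Phi_test = random_relu_features(x_test, n_features)
    # pinv gives the minimum-norm solution in the overparameterized regime.
    coef = np.linalg.pinv(Phi_train) @ y_train
    test_mse = np.mean((Phi_test @ coef - y_test) ** 2)
    print(f"{n_features:5d} features: test MSE = {test_mse:.3f}")

In runs of this kind, the printed test error typically falls, spikes near 20 features (the interpolation threshold for 20 training points), and then falls again as the feature count grows, tracing the double-descent shape described above.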
Related concepts:
Bias-Variance Trade-Off
External reference:
https://arxiv.org/abs/1912.02292 (Nakkiran et al., "Deep Double Descent: Where Bigger Models and More Data Hurt")