The degradation problem refers to the phenomenon where adding more layers to a neural network leads to higher training error, despite the expectation that a deeper network should perform at least as well as a shallower one. Because the error appears on the training set, degradation is an optimization failure rather than overfitting. The issue becomes significant in deep learning, where increasing depth can cause performance to saturate and then decline due to challenges such as vanishing gradients and the difficulty of optimizing very deep networks.
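One way to see why degradation is surprising is the identity-mapping argument: a deeper network could, in principle, copy a shallower one and make its extra layers identities, so it should never do worse. Residual connections (as in ResNets) build this in by computing x + F(x) instead of replacing x. Below is a minimal NumPy sketch of that argument; the layer functions and weight shapes are illustrative, not a real training setup.

```python
import numpy as np

def plain_layer(x, W):
    # A plain layer replaces its input entirely: ReLU(W @ x).
    return np.maximum(W @ x, 0.0)

def residual_layer(x, W):
    # A residual layer adds a learned correction to the identity: x + ReLU(W @ x).
    return x + np.maximum(W @ x, 0.0)

x = np.array([1.0, -2.0, 3.0, 0.5])
W_zero = np.zeros((4, 4))  # an "easy" solution the optimizer can find

# With zero weights, a residual layer reduces to the identity mapping,
# so stacking extra residual layers cannot degrade the representation:
print(np.allclose(residual_layer(x, W_zero), x))      # True
# A plain layer with zero weights destroys the signal instead:
print(np.allclose(plain_layer(x, W_zero), 0.0))       # True
```

The point is that for a plain stack, the extra layers must actively learn an identity mapping, which turns out to be hard for optimizers in deep networks; a residual layer gets the identity for free and only needs to learn the deviation from it.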