Vanishing Gradients
Vanishing gradients is the phenomenon whereby the gradients of the loss with respect to a network's weights shrink toward zero as they are propagated backward through the layers, making the weights of the earlier layers difficult to update via backpropagation. It is a common issue in very deep feedforward networks as well as in recurrent networks, although there are techniques that help resolve the problem, such as residual connections. A related phenomenon is that of "exploding gradients", where the magnitude of the gradients grows uncontrollably; this can be mitigated by clipping the gradients whenever their norm exceeds a given upper bound.
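A minimal sketch of both mitigations, written in PyTorch with illustrative layer sizes and a dummy batch (none of which come from the entry above): a residual block gives the gradient an identity path backward through a deep stack, and gradient norm clipping bounds the update when gradients explode.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Two linear layers with a skip connection: out = x + f(x)."""
    def __init__(self, dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, dim),
            nn.ReLU(),
            nn.Linear(dim, dim),
        )

    def forward(self, x):
        # The identity path lets the gradient flow straight back to earlier
        # layers instead of being squashed by every intermediate nonlinearity.
        return x + self.net(x)

# A deep stack of residual blocks followed by a scalar output head
# (depth and width here are arbitrary choices for the example).
model = nn.Sequential(*[ResidualBlock(32) for _ in range(20)], nn.Linear(32, 1))

x = torch.randn(64, 32)   # dummy inputs
y = torch.randn(64, 1)    # dummy targets
loss = nn.functional.mse_loss(model(x), y)
loss.backward()

# Clip the global gradient norm to an upper bound (here 1.0) before the
# optimizer step, which keeps exploding gradients from derailing training.
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
```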