Residual Connection
A Residual Connection is a type of skip connection in a neural network whereby the output of the original computation path is replaced by the sum of that path's output and the output of the alternative (shortcut) path. That is, if the original mapping was x -> f(x), with the skip connection it becomes x -> f(x) + x. The path "f" bypassed by the residual connection is then called the "residual block", because it learns the "residual", i.e. the difference between the block's output, f(x) + x, and its input, x. The paper that introduced the concept, "Deep Residual Learning for Image Recognition" (He et al., 2015), also presents several specific architectures that use residual connections extensively, each known as a "ResNet".

Residual connections alleviate the vanishing/exploding gradient problem and make training deeper networks easier. During backpropagation, the gradient flowing back through the "+" node is simply copied to each of its incoming paths, so every residual connection provides a route from the output back toward the input along which the gradient is not rescaled. The more residual connections a network has, the more such routes exist, and the less likely it is that the gradient changes drastically before reaching the early layers.
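The idea is compact enough to sketch in code. The following is a minimal illustration in PyTorch (the framework choice, the two-layer form of "f", and the layer width are assumptions for the example, not something prescribed by the original description): a residual block wraps a small computation path f and returns f(x) + x.

import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Minimal residual block: output = f(x) + x."""

    def __init__(self, dim: int):
        super().__init__()
        # "f" is the residual path; with the shortcut in place it only has to
        # learn the difference between the desired output and the input.
        self.f = nn.Sequential(
            nn.Linear(dim, dim),
            nn.ReLU(),
            nn.Linear(dim, dim),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # The shortcut carries x unchanged; the "+" node merges the two paths.
        return self.f(x) + x

if __name__ == "__main__":
    # Stacking several blocks: gradients can reach the early layers through
    # the identity shortcuts without being rescaled at each "+" node.
    model = nn.Sequential(*[ResidualBlock(64) for _ in range(8)])
    x = torch.randn(32, 64)
    y = model(x)
    print(y.shape)  # torch.Size([32, 64])

Because the derivative of f(x) + x with respect to x is f'(x) + 1, the identity term guarantees that some gradient always passes straight through the block, which is the mechanism behind the training benefit described above.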