Dropout

Dropout is a regularization technique for neural networks that consists of randomly "dropping" (i.e. ignoring) neurons at each step during training, so that on each iteration the forward and backward passes run on a different sub-network. At validation, test, or deployment time, the full network is used, but the activations are scaled by the keep probability p, so that the expected value of the activation at each neuron matches what it was during training. Alternatively, the scaling can be done during training (so-called inverted dropout): if the keep probability of a neuron is p, its activation is scaled by 1/p, so no scaling is needed at validation/test/deployment time. A sketch of this variant follows below.
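
As a rough illustration, here is a minimal sketch of inverted dropout using NumPy; the function name, the default keep probability, and the training flag are illustrative assumptions, not part of any particular library:

    import numpy as np

    def dropout_forward(activations, keep_prob=0.8, training=True):
        # Inverted dropout: scale kept activations by 1/keep_prob during
        # training so no rescaling is needed at test time.
        if not training:
            # The full network is used at validation/test/deployment time.
            return activations
        # Keep each neuron independently with probability keep_prob.
        mask = np.random.rand(*activations.shape) < keep_prob
        # Zero the dropped neurons and rescale the rest so the expected
        # value of each activation matches the no-dropout case.
        return activations * mask / keep_prob

In this variant the mask is resampled on every training iteration, and inference simply skips the masking step; frameworks that implement the original formulation instead leave training activations unscaled and multiply by p at inference time.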
Related concepts:
Regularization
External reference:
https://patents.google.com/patent/US9406017B2/en