Backpropagation

Backpropagation is an algorithm for computing derivatives in a network of mathematical operations (e.g. an artificial neural network). It works by iteratively propagating known derivatives "backward" through the network, starting from the output, applying the chain rule from calculus at each operation.
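
As a concrete illustration, here is a minimal sketch of one forward and one backward pass through a tiny one-hidden-layer network with sigmoid activations and squared-error loss. This assumes NumPy; the network shape, loss, and all variable names are illustrative, not taken from the reference below.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Tiny network: input x -> hidden h -> output y_hat, squared-error loss.
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 1))          # input vector
y = np.array([[1.0]])                # target
W1 = rng.normal(size=(4, 3))         # input -> hidden weights
W2 = rng.normal(size=(1, 4))         # hidden -> output weights

# Forward pass: keep intermediate values needed by the backward pass.
z1 = W1 @ x
h = sigmoid(z1)
z2 = W2 @ h
y_hat = sigmoid(z2)
loss = 0.5 * float((y_hat - y) ** 2)

# Backward pass: propagate derivatives from the output toward the input,
# applying the chain rule at each operation.
dL_dy_hat = y_hat - y                        # d(loss)/d(y_hat)
dL_dz2 = dL_dy_hat * y_hat * (1 - y_hat)     # through the output sigmoid
dL_dW2 = dL_dz2 @ h.T                        # gradient for W2
dL_dh = W2.T @ dL_dz2                        # propagate back to hidden layer
dL_dz1 = dL_dh * h * (1 - h)                 # through the hidden sigmoid
dL_dW1 = dL_dz1 @ x.T                        # gradient for W1

print(loss, dL_dW1.shape, dL_dW2.shape)
```

Note how each backward step reuses a quantity computed in the forward pass; this reuse of intermediate values is what makes backpropagation efficient compared with differentiating each weight independently.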
Related concepts:
Artificial Neural Network
External reference:
https://www.nature.com/articles/323533a0