ReLU

A Rectified Linear Unit (ReLU) is a non-linear activation function that maps negative values to zero and leaves non-negative values unchanged, i.e. ReLU(x) = max(0, x). It is one of the most commonly used activation functions in neural networks.
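As a minimal sketch of this element-wise definition (assuming NumPy is available; the function and variable names here are illustrative, not from a specific library):

```python
import numpy as np

def relu(x: np.ndarray) -> np.ndarray:
    # Element-wise ReLU: negative entries become 0, non-negative entries pass through unchanged.
    return np.maximum(0, x)

x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
print(relu(x))  # expected: [0.  0.  0.  1.5 3. ]
```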
Related concepts:
Activation, Gaussian Error Linear Unit, Gated Linear Unit