ReLU
A Rectified Linear Unit (ReLU) is a non-linear activation function, defined as ReLU(x) = max(0, x), that maps negative values to zero and leaves non-negative values unchanged. It is one of the most common activation functions used in neural networks.
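As a minimal sketch of this mapping (assuming NumPy is available; the function name relu is illustrative, not part of any particular library), the definition reduces to a single elementwise maximum:

    import numpy as np

    def relu(x):
        # Elementwise ReLU: negative entries become 0,
        # non-negative entries pass through unchanged.
        return np.maximum(0, x)

    x = np.array([-2.0, -0.5, 0.0, 1.5, 3.0])
    print(relu(x))  # [0.  0.  0.  1.5 3. ]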
Related concepts:
Activation
Gaussian Error Linear Unit
Gated Linear Unit