ReLU (Rectified Linear Unit)

In the context of artificial neural networks, the rectifier is an activation function defined as the positive part of its argument:

f(x) = x⁺ = max(0, x), where x is the input to a neuron.
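As a minimal sketch (not part of the original wiki), the definition above can be written directly with NumPy's element-wise maximum:

```python
import numpy as np

def relu(x):
    # Positive part of the argument: max(0, x), applied element-wise
    return np.maximum(0.0, x)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
print(relu(x))  # negative inputs map to 0, positive inputs pass through
```

Because `np.maximum` broadcasts, the same function works on scalars, vectors, or whole activation tensors.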

(Figure omitted: plot of the rectifier (blue) and softplus (green) functions near x = 0.)
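Softplus, f(x) = ln(1 + eˣ), is a smooth approximation to the rectifier; the two curves nearly coincide away from zero but differ near the origin, where softplus stays strictly positive. A small comparison sketch (illustrative, using NumPy):

```python
import numpy as np

def relu(x):
    # Rectifier: max(0, x)
    return np.maximum(0.0, x)

def softplus(x):
    # Smooth approximation to ReLU: ln(1 + e^x); log1p is numerically stable
    return np.log1p(np.exp(x))

# Compare the two functions near x = 0
for x in (-1.0, 0.0, 1.0):
    print(f"x={x:+.1f}  relu={relu(x):.3f}  softplus={softplus(x):.3f}")
```

At x = 0 the rectifier gives 0 while softplus gives ln 2 ≈ 0.693; as x grows, softplus approaches ReLU from above.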