rectified linear unit relu - taoualiw/My-Knowledge-Base GitHub Wiki

The Rectified Linear Unit (ReLU)

The ReLU function:

  • is an activation function
  • gives an output of 0 if the input is negative or zero, and an output equal to the input if the input is positive; equivalently, ReLU(x) = max(0, x).
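The behavior above can be sketched in a few lines of NumPy (the function name `relu` here is my own choice, not from the original note):

```python
import numpy as np

def relu(x):
    # 0 for negative or zero inputs; the input itself when positive.
    # Works element-wise on arrays as well as on scalars.
    return np.maximum(0, x)

print(relu(np.array([-2.0, 0.0, 3.0])))  # → [0. 0. 3.]
```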