ReLU
Description
- Rectified Linear Unit (ReLU), an activation function
- ReLU(x) = max(0, x)
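
A minimal NumPy sketch of the definition above; `relu` here is an illustrative helper, not a function from this repo.

```python
import numpy as np

def relu(x):
    # Element-wise max(0, x): negative inputs become 0, positive pass through.
    return np.maximum(0, x)

print(relu(np.array([-2.0, -0.5, 0.0, 1.5, 3.0])))
# -> [0.  0.  0.  1.5 3. ]
```
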
FAQ
- Is ReLU linear?
- Non-linear. It is piecewise linear, but not linear overall, which is what lets networks that use it model non-linear functions.
- What are the advantages of ReLU?
- Helps avoid the vanishing gradient problem: the gradient is 1 for positive inputs (see the sketch after this list).
- Computationally cheap: just a comparison with zero, no exponentials.
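
A small illustrative comparison (not repo code) of the gradient claim: the ReLU derivative stays at 1 for positive inputs, while the sigmoid derivative shrinks toward 0 as inputs grow.

```python
import numpy as np

def relu_grad(x):
    # d/dx max(0, x): 1 for x > 0, 0 for x < 0 (0 is used at x = 0 here)
    return (x > 0).astype(float)

def sigmoid_grad(x):
    s = 1.0 / (1.0 + np.exp(-x))
    return s * (1.0 - s)

x = np.array([1.0, 5.0, 10.0])
print(relu_grad(x))     # [1. 1. 1.]
print(sigmoid_grad(x))  # roughly [2.0e-01 6.6e-03 4.5e-05], shrinking toward 0
```
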
See also
- [[Activation function]] : [[ReLU]] | [[Softmax]] | [[Sigmoid]]