Deep Neural Net Activation Functions

(figures: plots of common activation functions, omitted)

ReLU

  • rectified linear unit: outputs max(0, x), passing positive inputs through unchanged and zeroing out negative inputs
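
A minimal sketch of the standard max(0, x) form of ReLU in plain Python:

```python
def relu(x):
    # Rectified linear unit: pass positives through, zero out negatives
    return max(0.0, x)

print(relu(3.5))   # positive inputs are unchanged
print(relu(-2.0))  # negative inputs map to 0
```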

Sigmoid

  • a special case of the softmax function for a classifier with only two classes
  • for comparison: softmax takes a vector input, while the sigmoid takes a scalar input

$$ sigmoid(x) = \frac{1}{1 + e^{-x}} $$
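
The two-class relationship described above can be checked numerically; a sketch using only the standard library:

```python
import math

def sigmoid(x):
    # Logistic sigmoid: maps a scalar to (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def softmax(xs):
    # Exponentiate each component, then normalize so the outputs sum to 1
    exps = [math.exp(x) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# sigmoid(x) equals the first component of a two-class softmax over [x, 0]:
# e^x / (e^x + e^0) = 1 / (1 + e^-x)
x = 1.7
print(abs(sigmoid(x) - softmax([x, 0.0])[0]) < 1e-12)  # True
```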

Softmax

  • makes the outputs sum to 1 so they can be interpreted as probabilities
  • the model predicts the option with the highest probability

$$ softmax(x_i) = \frac{e^{x_i}}{\sum_{k=1}^{K} e^{x_k}} $$
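
The formula above can be implemented directly; this sketch adds the common max-subtraction trick (an assumption beyond the formula, but algebraically equivalent) so large inputs do not overflow:

```python
import math

def softmax(xs):
    # Subtracting the max before exponentiating leaves the result
    # unchanged algebraically but avoids overflow for large inputs
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([2.0, 1.0, 0.1])
print(probs)
print(sum(probs))              # sums to 1, so the outputs read as probabilities
print(probs.index(max(probs))) # the prediction: the class with highest probability
```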