# Activation Functions
## Functionality
The activation functions transform the output data of each layer. The current framework supports `None`, `ReLU`, `Sigmoid`, `Softmax`, and `tanH`. Each one is implemented as a Julia module.
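As a rough, illustrative sketch of what these activations compute element-wise (the function names below are for demonstration only, not DianoiaML.jl's actual API; `None` presumably passes data through unchanged):

```julia
# Illustrative element-wise definitions of the supported activations.
relu(x) = max.(x, 0)                # ReLU: clamp negatives to zero
sigmoid(x) = 1 ./ (1 .+ exp.(-x))   # Sigmoid: squash into (0, 1)
tanH(x) = tanh.(x)                  # tanH: squash into (-1, 1)

# Softmax: normalize a vector into a probability distribution.
# Subtracting the maximum first keeps exp() from overflowing.
function softmax(x)
    e = exp.(x .- maximum(x))
    return e ./ sum(e)
end

x = [-2.0, 0.0, 3.0]
println(relu(x))      # [0.0, 0.0, 3.0]
println(softmax(x))   # entries sum to 1.0
```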
## tanH
To avoid clashing with the name of the Julia built-in function `tanh`, the name is slightly changed to `tanH`.
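A minimal sketch of the naming issue, assuming each activation is a module exposing some function (the `activate` name here is hypothetical, not DianoiaML.jl's real interface):

```julia
# Hypothetical sketch: a module named `tanh` would collide with the
# built-in Base.tanh, so the framework names its module `tanH` instead.
module tanH
    # Hypothetical interface; the framework's actual API may differ.
    activate(x) = tanh.(x)   # free to call the built-in tanh internally
end

println(tanH.activate([0.0, 1.0]))   # [0.0, 0.7615941559557649]
println(tanh(1.0))                   # Base.tanh remains accessible
```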
## Softmax
To avoid numerical instability, please use `Softmax_CEL` if you want to combine `Softmax` with the loss function `Cross_Entropy_Loss`.
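The instability in question is the usual one: feeding `Softmax` output into a separate `log` inside the loss can produce `log(0) = -Inf` or `NaN` when `exp` under- or overflows. A fused softmax-plus-cross-entropy can instead use the log-sum-exp trick. Below is a minimal sketch of that idea for a single sample with true class index `y` (illustrative only; `Softmax_CEL`'s actual implementation may differ):

```julia
# Naive: softmax followed by log. exp(1000) overflows to Inf,
# so the division yields NaN and the loss is garbage.
naive_ce(logits, y) = -log(exp(logits[y]) / sum(exp.(logits)))

# Fused (log-sum-exp trick): compute log-softmax directly, shifting by
# the maximum so exp never overflows and log never sees an exact zero.
function fused_ce(logits, y)
    m = maximum(logits)
    return -(logits[y] - m - log(sum(exp.(logits .- m))))
end

logits = [1000.0, 0.0, -1000.0]   # extreme values expose the problem
println(naive_ce(logits, 1))      # NaN
println(fused_ce(logits, 1))      # ≈ 0.0, the correct loss
```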