[neural network] neural network and deep learning
- ebook: [Neural Networks and Deep Learning](http://neuralnetworksanddeeplearning.com/) by Michael Nielsen
- Using neural nets to recognize handwritten digits
- keywords
perceptron, sigmoid neuron, input layer, hidden layer, output layer, gradient descent, stochastic gradient descent, mini-batch (sketch below)
- How the backpropagation algorithm works
- keywords
subscript index notation w_jk, activation, cost function, Hadamard product, the four fundamental equations, error delta, backpropagation algorithm (sketch below)
- Improving the way neural networks learn
- keywords
quadratic cost function, learning slowdown, cross-entropy cost function, softmax, log-likelihood cost function, overfitting and regularization, validation data, L2 regularization (lambda), L1 regularization, dropout, artificially expanding the training data, weight initialization (normal distribution), tanh activation function (sketch below)
- Why are deep neural networks hard to train?
- keywords
vanishing gradient problem (sketch below)
- Deep Learning
- keywords
convolutional networks, local receptive fields, stride, shared weights and biases, feature maps, kernel / filter, convolution operator, pooling layers, max-pooling, L2 pooling, Rectified Linear Units (ReLU) (sketch below)
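
To make the chapter-1 keywords concrete, here is a minimal numpy sketch of a single sigmoid neuron and one stochastic-gradient-descent step on a quadratic cost. All values (weights, input, learning rate) are made-up toy numbers, not code from the book:

```python
import numpy as np

def sigmoid(z):
    """Sigmoid activation: squashes any real input into (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# A single sigmoid neuron: output = sigmoid(w . x + b).
# The weights, bias, and input below are illustrative toy values.
w = np.array([0.5, -0.3])
b = 0.1
x = np.array([1.0, 2.0])
print("neuron output:", sigmoid(np.dot(w, x) + b))

# One step of (stochastic) gradient descent on a quadratic cost
# C = 0.5 * (a - y)^2 for a "mini-batch" of a single example.
y = 1.0
eta = 3.0                          # learning rate (illustrative)
a = sigmoid(np.dot(w, x) + b)      # forward pass
delta = (a - y) * a * (1 - a)      # dC/dz via the chain rule, sigma'(z) = a(1-a)
w -= eta * delta * x               # dC/dw_j = delta * x_j
b -= eta * delta                   # dC/db = delta
print("updated output:", sigmoid(np.dot(w, x) + b))
```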
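
For the backpropagation chapter, a sketch of the four fundamental equations (BP1-BP4) on a toy two-layer sigmoid network with quadratic cost; the layer sizes and random seed are arbitrary illustration choices:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def sigmoid_prime(z):
    s = sigmoid(z)
    return s * (1 - s)

# Toy network: 2 inputs -> 3 hidden sigmoid neurons -> 1 output neuron.
rng = np.random.default_rng(0)
W2 = rng.normal(size=(3, 2)); b2 = rng.normal(size=(3, 1))   # hidden layer
W3 = rng.normal(size=(1, 3)); b3 = rng.normal(size=(1, 1))   # output layer
x = rng.normal(size=(2, 1)); y = np.array([[1.0]])

# Forward pass, keeping z^l and a^l for the backward pass.
z2 = W2 @ x + b2;  a2 = sigmoid(z2)
z3 = W3 @ a2 + b3; a3 = sigmoid(z3)

# The four fundamental equations, for quadratic cost C = 0.5*||a3 - y||^2.
# Here `*` is the Hadamard (elementwise) product.
# BP1: output error  delta^L = grad_a C * sigma'(z^L)
delta3 = (a3 - y) * sigmoid_prime(z3)
# BP2: backpropagate  delta^l = (W^{l+1}.T delta^{l+1}) * sigma'(z^l)
delta2 = (W3.T @ delta3) * sigmoid_prime(z2)
# BP3: dC/db^l = delta^l
grad_b3, grad_b2 = delta3, delta2
# BP4: dC/dW^l = delta^l (a^{l-1}).T
grad_W3, grad_W2 = delta3 @ a2.T, delta2 @ x.T
print(grad_W2.shape, grad_W3.shape)   # (3, 2) (1, 3), matching W2 and W3
```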
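
For the chapter on improving learning, a sketch of two of its keywords: the cross-entropy cost (whose output error delta = a - y has no sigma'(z) factor, avoiding the learning slowdown when a neuron saturates) and the L2-regularized weight update. The hyperparameter values are illustrative, not recommendations:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Cross-entropy cost for a sigmoid output: C = -[y ln a + (1 - y) ln(1 - a)].
a, y = sigmoid(4.0), 0.0              # a saturated, badly wrong neuron
print("cost:", -(y * np.log(a) + (1 - y) * np.log(1 - a)))
print("delta:", a - y)                # still a large, useful gradient

# L2-regularized SGD update for one mini-batch of size m drawn from n examples:
#   w -> (1 - eta * lmbda / n) * w - (eta / m) * sum_x dC_x/dw
eta, lmbda, n, m = 0.5, 5.0, 50000, 10    # illustrative hyperparameters
w = np.array([0.8, -1.2])
grad_sum = np.array([0.1, -0.05])         # stand-in for summed mini-batch gradients
w = (1 - eta * lmbda / n) * w - (eta / m) * grad_sum
print("updated w:", w)
```

The (1 - eta*lmbda/n) factor is the "weight decay": regularization shrinks every weight a little on each update, on top of the usual gradient step.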
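
A quick numeric illustration of the vanishing gradient problem: the gradient reaching an early layer carries one w * sigma'(z) factor per later layer, and since sigma'(z) <= 1/4, with modest weights the product shrinks geometrically with depth (toy values below):

```python
import numpy as np

def sigmoid_prime(z):
    s = 1.0 / (1.0 + np.exp(-z))
    return s * (1 - s)

# Best case for the sigmoid: sigma'(0) = 0.25. Even then, stacking layers
# multiplies these factors together and the gradient collapses quickly.
w, z = 1.0, 0.0
for depth in (1, 5, 10, 20):
    print(depth, (w * sigmoid_prime(z)) ** depth)
```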
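
Finally, a sketch of the convolutional building blocks from the last keyword list: one shared 3x3 kernel slid over the image at stride 1 ("valid" positions only), a ReLU feature map, and 2x2 max-pooling. The 8x8 random image is a stand-in for a 28x28 MNIST digit:

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)

# Valid 2-D convolution with a single shared kernel and bias (stride 1):
# every position in the feature map reuses the same weights, so each
# output pixel looks at one local receptive field of the image.
def conv2d(image, kernel, bias):
    H, W = image.shape
    k = kernel.shape[0]
    out = np.empty((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i+k, j:j+k] * kernel) + bias
    return out

# 2x2 max-pooling: keep the largest activation in each 2x2 block.
def max_pool(fmap):
    H, W = fmap.shape
    return fmap[:H//2*2, :W//2*2].reshape(H//2, 2, W//2, 2).max(axis=(1, 3))

rng = np.random.default_rng(0)
image = rng.normal(size=(8, 8))          # stand-in for an input digit
kernel = rng.normal(size=(3, 3))         # one filter / local receptive field
fmap = relu(conv2d(image, kernel, 0.1))  # 6x6 feature map with ReLU
print(max_pool(fmap).shape)              # (3, 3)
```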
- Using neural nets to recognize handwritten digits
- sample run:

```
$ python test.py
Epoch 13: validation accuracy 99.04%
This is the best validation accuracy to date.
The corresponding test accuracy is 99.17%
Training mini-batch number 70000
Training mini-batch number 71000
Training mini-batch number 72000
Training mini-batch number 73000
Training mini-batch number 74000
Epoch 14: validation accuracy 99.14%
This is the best validation accuracy to date.
The corresponding test accuracy is 99.23%
Training mini-batch number 75000
Training mini-batch number 76000
Training mini-batch number 77000
Training mini-batch number 78000
...
```