DL_ICP 2 - Saiaishwaryapuppala/CSEE5590_python_Icp GitHub Wiki
Python and Deep Learning: Special Topics
Rajeshwari Sai Aishwarya Puppala
Student ID: 16298162
Class ID: 35
Deep Learning - In-Class Programming: 2
Objectives:
- Using the history object in the source code, plot the loss and accuracy for both the training data and the validation data.
- Plot one of the images in the test data, then run inference to check the fitted model's prediction on that single test image.
- The original code used 2 hidden layers and ReLU activation. Change the number of hidden layers and the activation to tanh or sigmoid and observe what happens.
- Run the same code without scaling the images; how does the accuracy change?
Import Data and Reshaping
- Import the necessary packages
- Import the MNIST data and load all of the train and test data (both images and labels)
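The loading and reshaping steps can be sketched as below, assuming the `tensorflow.keras` MNIST loader; the variable names are illustrative, not necessarily those in the original source.

```python
from tensorflow.keras.datasets import mnist
from tensorflow.keras.utils import to_categorical

# Load train/test images and labels (downloads MNIST on first use)
(train_images, train_labels), (test_images, test_labels) = mnist.load_data()

# Flatten each 28x28 image into a 784-dimensional float vector
train_images = train_images.reshape((60000, 28 * 28)).astype('float32')
test_images = test_images.reshape((10000, 28 * 28)).astype('float32')

# One-hot encode the digit labels (10 classes) for categorical_crossentropy
train_labels = to_categorical(train_labels)
test_labels = to_categorical(test_labels)
```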
Scaling
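MNIST pixels are integers in 0-255, so scaling means dividing by 255 to map them into [0, 1]. A minimal sketch, shown on a small synthetic batch standing in for the real images so the snippet is self-contained:

```python
import numpy as np

# Synthetic stand-in for a batch of flattened MNIST images (pixels 0-255)
images = np.random.randint(0, 256, size=(4, 784)).astype('float32')

# Scale pixel intensities into the [0, 1] range
scaled = images / 255.0
```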
Model
- Deep learning model with one hidden layer using the "relu" activation function
- The activation function in the output layer is "softmax" because the target is multi-class classification
- The hyperparameters are: epochs=20, batch_size=256, optimizer=rmsprop
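A sketch of that model, assuming a 512-unit hidden layer (the width is an assumption, not stated above). The real run fits on the scaled MNIST arrays with `epochs=20, batch_size=256`; a tiny random batch is used here just to show that `fit()` returns the History object discussed next.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input

# One hidden ReLU layer, softmax output over the 10 digit classes
model = Sequential([
    Input(shape=(784,)),
    Dense(512, activation='relu'),   # hidden width is an assumption
    Dense(10, activation='softmax'),
])
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# Actual run: model.fit(train_images, train_labels, epochs=20,
#                       batch_size=256, validation_data=(test_images, test_labels))
# Tiny synthetic batch here only to demonstrate the History return value:
x = np.random.rand(32, 784).astype('float32')
y = np.eye(10)[np.random.randint(0, 10, 32)].astype('float32')
history = model.fit(x, y, epochs=2, batch_size=16, verbose=0)
```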
Output:
Accuracy and Loss plot for the history Object
The accuracy and the loss are plotted for both the training and the validation data
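The plots come from `history.history`, a dict with one value per epoch for each metric. A self-contained sketch using a stand-in dict in place of the real training history:

```python
import matplotlib
matplotlib.use('Agg')  # non-interactive backend so the sketch runs headless
import matplotlib.pyplot as plt

# Stand-in for history.history from model.fit(..., validation_data=...)
hist = {'accuracy': [0.90, 0.95, 0.97], 'val_accuracy': [0.91, 0.94, 0.96],
        'loss': [0.35, 0.17, 0.10], 'val_loss': [0.30, 0.20, 0.14]}
epochs = range(1, len(hist['loss']) + 1)

# Left panel: accuracy curves; right panel: loss curves
fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
ax1.plot(epochs, hist['accuracy'], label='train accuracy')
ax1.plot(epochs, hist['val_accuracy'], label='validation accuracy')
ax1.set_xlabel('epoch')
ax1.legend()
ax2.plot(epochs, hist['loss'], label='train loss')
ax2.plot(epochs, hist['val_loss'], label='validation loss')
ax2.set_xlabel('epoch')
ax2.legend()
fig.savefig('history.png')
```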
Model with Tanh Activation Layer
- Deep learning model with hidden layers using the "tanh" and "relu" activation functions
- The activation function in the output layer is "softmax" because the target is multi-class classification
- The hyperparameters are: epochs=20, batch_size=256, optimizer=rmsprop
Model with Sigmoid Activation Layer
- Deep learning model with hidden layers using the "relu" and "sigmoid" activation functions
- The activation function in the output layer is "softmax" because the target is multi-class classification
- The hyperparameters are: epochs=20, batch_size=256, optimizer=rmsprop
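Both variants keep the same architecture and change only the hidden-layer activations, so a small helper can build them. The `build_model` name and the 512-unit width are hypothetical, chosen for illustration:

```python
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input

def build_model(activations):
    """Build the MNIST classifier with one hidden layer per activation."""
    model = Sequential([Input(shape=(784,))])
    for act in activations:
        model.add(Dense(512, activation=act))  # width is an assumption
    model.add(Dense(10, activation='softmax'))
    model.compile(optimizer='rmsprop',
                  loss='categorical_crossentropy',
                  metrics=['accuracy'])
    return model

# The two variants compared above
tanh_model = build_model(['tanh', 'relu'])
sigmoid_model = build_model(['relu', 'sigmoid'])
```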
Plot and Predict the image in the test data
- Plot one image from the test data
- Predict the label of that test image and verify it against the true label
- Here the prediction matches the true label, so the accuracy on this single image is 100%
Output: The predicted digit is 2
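The single-image inference step can be sketched as below. In the notebook, `model` is the trained network and `image` is one row of the scaled test set; here an untrained model and a random vector stand in so the snippet is self-contained.

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Input

# Stand-ins: an untrained model and a random 784-vector "image"
model = Sequential([Input(shape=(784,)),
                    Dense(512, activation='relu'),
                    Dense(10, activation='softmax')])
image = np.random.rand(784).astype('float32')

# To plot the digit: plt.imshow(image.reshape(28, 28), cmap='gray')
# predict() expects a batch, so add a leading batch dimension of 1
probs = model.predict(image.reshape(1, 784), verbose=0)
predicted_digit = int(np.argmax(probs))  # compare against the true label
```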
Without Scaling
Results:
- The accuracy with the ReLU activation layer is 98.27%
- The accuracy with the ReLU and tanh activation layers is 98.32%
- The accuracy with the ReLU and sigmoid activation layers is 97.9%
- The accuracy of the model without scaling is 40.5%