DL_ICP4
Python and Deep Learning: Special Topics
Rajeshwari Sai Aishwarya Puppala
Student ID: 16298162
Class ID: 35
Deep Learning: In-Class Programming 4
Objectives
1. Follow the instructions below and then report how the performance changed (apply all of the changes at once). Did the performance change?
2. Predict the first 4 images of the test data. Then print the actual labels for those 4 images (the label is the class probability vector associated with each image) to check whether the model predicted them correctly.
3. Visualize loss and accuracy using the history object.
Import Data
- Import the necessary packages.
- Import the CIFAR-10 dataset and load all of the train and test data.
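A minimal sketch of the imports and data loading, assuming Keras via the TensorFlow backend (the exact import list in the original notebook may differ):

```python
# Assumed setup: Keras via the TensorFlow backend
import numpy as np
from tensorflow.keras.datasets import cifar10
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Dropout, Flatten, Dense
from tensorflow.keras.optimizers import SGD
from tensorflow.keras.utils import to_categorical

# Load the CIFAR-10 train and test splits
(x_train, y_train), (x_test, y_test) = cifar10.load_data()
print(x_train.shape, y_train.shape)  # (50000, 32, 32, 3) (50000, 1)
print(x_test.shape, y_test.shape)    # (10000, 32, 32, 3) (10000, 1)
```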
Normalization and Encoding
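A sketch of this step, assuming the pixel values are scaled to [0, 1] and the labels are one-hot encoded with `to_categorical` (the original preprocessing may differ slightly):

```python
# Normalize pixel values from [0, 255] to [0, 1]
x_train = x_train.astype('float32') / 255.0
x_test = x_test.astype('float32') / 255.0

# One-hot encode the integer class labels (10 CIFAR-10 classes)
y_train = to_categorical(y_train)
y_test = to_categorical(y_test)
num_classes = y_test.shape[1]  # 10
```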
Model 1
- Create a convolutional input layer with the "ReLU" activation function and input shape (3, 32, 32).
- Add a dropout layer with rate 0.2.
- Add a Conv2D hidden layer with the "ReLU" activation function and 32 filters.
- Apply max pooling with a (2, 2) pool size.
- Flatten the output.
- Add a dense hidden layer with the "ReLU" activation function.
- The activation function in the output layer is "softmax", because the target is multi-class classification.
- The hyperparameters are: epochs = 2, learning rate = 0.01, and the "SGD" optimizer. A code sketch of this model follows the list.
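A minimal sketch of Model 1 under these settings. The filter count of the first convolution, the 512 units in the dense hidden layer, the batch size, and the channels-last input shape (32, 32, 3) are assumptions; the write-up lists the channels-first shape (3, 32, 32) and does not state those sizes:

```python
# Model 1: simple CNN for CIFAR-10 (sketch; some sizes are assumed)
model1 = Sequential([
    # Convolutional input layer with ReLU activation (32 filters is an assumption)
    Conv2D(32, (3, 3), activation='relu', padding='same', input_shape=(32, 32, 3)),
    Dropout(0.2),
    # Conv2D hidden layer with 32 filters and ReLU activation
    Conv2D(32, (3, 3), activation='relu', padding='same'),
    MaxPooling2D(pool_size=(2, 2)),
    Flatten(),
    # Dense hidden layer with ReLU activation (512 units is an assumption)
    Dense(512, activation='relu'),
    # Softmax output over the 10 classes (multi-class classification)
    Dense(num_classes, activation='softmax'),
])

# Hyperparameters from the write-up: epochs = 2, learning rate = 0.01, SGD optimizer
epochs = 2
model1.compile(loss='categorical_crossentropy',
               optimizer=SGD(learning_rate=0.01),
               metrics=['accuracy'])
history = model1.fit(x_train, y_train, validation_data=(x_test, y_test),
                     epochs=epochs, batch_size=32)
```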
Accuracy: 70.3%
Model 2
- Convolutional input layer, 32 feature maps with a size of 3×3 and a rectifier activation function.
- Dropout layer at 20%.
- Convolutional layer, 32 feature maps with a size of 3×3 and a rectifier activation function.
- Max pooling layer with size 2×2.
- Convolutional layer, 64 feature maps with a size of 3×3 and a rectifier activation function.
- Dropout layer at 20%.
- Convolutional layer, 64 feature maps with a size of 3×3 and a rectifier activation function.
- Max pooling layer with size 2×2.
- Convolutional layer, 128 feature maps with a size of 3×3 and a rectifier activation function.
- Dropout layer at 20%.
- Convolutional layer, 128 feature maps with a size of 3×3 and a rectifier activation function.
- Max pooling layer with size 2×2.
- Flatten layer.
- Dropout layer at 20%.
- Fully connected layer with 1024 units and a rectifier activation function.
- Dropout layer at 20%.
- Fully connected layer with 512 units and a rectifier activation function.
- Dropout layer at 20%.
- Fully connected output layer with 10 units and a softmax activation function (a code sketch of this architecture follows the list).
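A sketch of the Model 2 layer stack listed above, again assuming a channels-last input shape of (32, 32, 3) and `padding='same'` on the convolutions:

```python
# Model 2: deeper CNN for CIFAR-10 (layer stack as described above)
model2 = Sequential([
    Conv2D(32, (3, 3), activation='relu', padding='same', input_shape=(32, 32, 3)),
    Dropout(0.2),
    Conv2D(32, (3, 3), activation='relu', padding='same'),
    MaxPooling2D(pool_size=(2, 2)),
    Conv2D(64, (3, 3), activation='relu', padding='same'),
    Dropout(0.2),
    Conv2D(64, (3, 3), activation='relu', padding='same'),
    MaxPooling2D(pool_size=(2, 2)),
    Conv2D(128, (3, 3), activation='relu', padding='same'),
    Dropout(0.2),
    Conv2D(128, (3, 3), activation='relu', padding='same'),
    MaxPooling2D(pool_size=(2, 2)),
    Flatten(),
    Dropout(0.2),
    Dense(1024, activation='relu'),
    Dropout(0.2),
    Dense(512, activation='relu'),
    Dropout(0.2),
    Dense(10, activation='softmax'),
])
```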
Hyperparameters and training code (see the sketch below)
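A hedged sketch of compiling and training Model 2. The epoch count, batch size, and SGD settings below are assumptions made for illustration (carried over from Model 1); the actual values were set in the hyperparameter code of the original notebook:

```python
# Compile and train Model 2 (hyperparameter values here are assumed, not from the write-up)
epochs = 2                     # assumption: same epoch count as Model 1
sgd = SGD(learning_rate=0.01)  # assumption: same optimizer settings as Model 1
model2.compile(loss='categorical_crossentropy', optimizer=sgd, metrics=['accuracy'])
history = model2.fit(x_train, y_train, validation_data=(x_test, y_test),
                     epochs=epochs, batch_size=32)

# Evaluate on the test data
scores = model2.evaluate(x_test, y_test, verbose=0)
print('Accuracy: %.1f%%' % (scores[1] * 100))
```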
Accuracy: 39.8%
Visualize Loss and Accuracy
- Accuracy plot from the training history.
- Loss plot from the training history.
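A minimal sketch of plotting accuracy and loss from the `history` object returned by `fit`, assuming matplotlib; the history keys shown (`accuracy`/`val_accuracy`) apply to recent Keras versions and may be `acc`/`val_acc` in older ones:

```python
import matplotlib.pyplot as plt

# Training vs. validation accuracy per epoch
plt.plot(history.history['accuracy'], label='train accuracy')
plt.plot(history.history['val_accuracy'], label='test accuracy')
plt.xlabel('epoch')
plt.ylabel('accuracy')
plt.legend()
plt.show()

# Training vs. validation loss per epoch
plt.plot(history.history['loss'], label='train loss')
plt.plot(history.history['val_loss'], label='test loss')
plt.xlabel('epoch')
plt.ylabel('loss')
plt.legend()
plt.show()
```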
Predict 4 images from the Test dataset
- Code for predicting the results of 4 images from the test data set with the already-trained model (a sketch follows this list).
- As the output shows, the second vector is one-hot encoded.
- The position holding a 1 is the class it represents.
- Check the same index in the predicted vector: if it holds the highest value compared to the other 9 entries, the image was predicted correctly.
- The third image has a 1 for the 9th class, and the corresponding entry in its predicted vector is the highest value.
- So the 3rd image was predicted correctly.
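A sketch of the prediction step, assuming the trained Model 2 is used; `predict` returns one softmax probability vector per image, which is compared against the one-hot encoded true label:

```python
# Predict the first 4 images of the test set
predictions = model2.predict(x_test[:4])

for i in range(4):
    print('Actual (one-hot):   ', y_test[i])
    print('Predicted (softmax):', np.round(predictions[i], 3))
    # The prediction is correct when the arg-max positions match
    print('Predicted correctly?', np.argmax(predictions[i]) == np.argmax(y_test[i]))
```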
Conclusion
- Model 1 achieved better accuracy (70.3%) than Model 2 (39.8%).