Lab Assignment 3

Name: Naga Sirisha Sunkara

Class ID: 34

Team ID: 4

Technical partner's details:

Name: Vinay Santhosham

Class ID: 28

Introduction

We have learnt about different deep learning algorithms and how to use TensorBoard, for example to visualize the computation graphs of our models.

Objective

The main objective of this assignment is to implement logistic regression and word embeddings, and to show the resulting graphs in TensorBoard.
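
As a quick illustration of how a graph ends up in TensorBoard, here is a minimal TensorFlow 1.x sketch; the toy operations and the `./logs` directory are our own illustrative choices, not part of the assignment code.

```python
import tensorflow as tf

# A toy graph: two constants and an add op, each given a name so the
# nodes are readable in the TensorBoard graph view.
a = tf.constant(2.0, name="a")
b = tf.constant(3.0, name="b")
total = tf.add(a, b, name="total")

with tf.Session() as sess:
    # FileWriter dumps the graph definition to ./logs; inspect it with:
    #   tensorboard --logdir=./logs
    writer = tf.summary.FileWriter("./logs", sess.graph)
    print(sess.run(total))
    writer.close()
```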

Task 1:

Logistic Regression

For this task we have chosen the MNIST dataset. We changed different hyperparameters and observed the accuracy; the hyperparameters include the optimizer, learning rate, number of steps, and batch size.

For each of the four optimizers, we varied the number of steps, batch size, and learning rate, and observed the accuracy values. A sketch of the basic setup follows.
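
Below is a minimal sketch of the logistic-regression model we used, assuming the TensorFlow 1.x MNIST tutorial helpers; the hyperparameter values shown are illustrative placeholders, not the exact settings from our test cases.

```python
import tensorflow as tf
from tensorflow.examples.tutorials.mnist import input_data

mnist = input_data.read_data_sets("/tmp/data/", one_hot=True)

learning_rate = 0.01   # varied per test case
num_steps = 1000       # varied per test case
batch_size = 128       # varied per test case

# Softmax (multinomial logistic) regression: a single linear layer.
x = tf.placeholder(tf.float32, [None, 784], name="x")
y = tf.placeholder(tf.float32, [None, 10], name="y")
W = tf.Variable(tf.zeros([784, 10]), name="weights")
b = tf.Variable(tf.zeros([10]), name="bias")
logits = tf.matmul(x, W) + b

loss = tf.reduce_mean(
    tf.nn.softmax_cross_entropy_with_logits(labels=y, logits=logits))
train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(loss)

correct = tf.equal(tf.argmax(logits, 1), tf.argmax(y, 1))
accuracy = tf.reduce_mean(tf.cast(correct, tf.float32))

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for step in range(num_steps):
        batch_x, batch_y = mnist.train.next_batch(batch_size)
        sess.run(train_op, feed_dict={x: batch_x, y: batch_y})
    print("Test accuracy:",
          sess.run(accuracy, feed_dict={x: mnist.test.images,
                                        y: mnist.test.labels}))
```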

Observations:

Across all the optimizers implemented, we found that the Gradient Descent optimizer achieved the highest accuracy of 92% at a learning rate of 0.9, but its accuracy dropped to 84% at a learning rate of 0.01. The AdaGrad optimizer gave a consistent accuracy of 91% across different learning rates. Only the optimizer changes between test cases, as shown in the sketch below.
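
Reusing `loss` and `learning_rate` from the sketch above, swapping optimizers amounts to changing one line; the dictionary below is just an illustrative way to organize the four TF 1.x optimizers we compared.

```python
# Map of the four optimizers tried in the test cases (TF 1.x API).
optimizers = {
    "gradient_descent": tf.train.GradientDescentOptimizer(learning_rate),
    "adagrad":          tf.train.AdagradOptimizer(learning_rate),
    "adam":             tf.train.AdamOptimizer(learning_rate),
    "rmsprop":          tf.train.RMSPropOptimizer(learning_rate),
}
# Pick one per run; only this line differs between test cases.
train_op = optimizers["gradient_descent"].minimize(loss)
```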

Test Case 1: RMSProp Optimizer

Parameters:

Accuracy Output:

Parameters:

Accuracy Output:

TensorBoard:

Test Case 2: AdaGrad Optimizer

Parameters:

Accuracy Output:

Parameters:

Accuracy Output:

TensorBoard:

Test Case 3: Adam Optimizer

Parameters:

Accuracy Output:

Parameters:

Accuracy Output:

TensorBoard:

Test Case 4: Gradient Descent Optimizer

Parameters:

Accuracy Output:

Parameters:

Accuracy Output:

TensorBoard:

Task 2:

Word Embeddings

For this task we have chosen the enwik8 dataset from http://mattmahoney.net/dc/enwik8.zip. We changed different hyperparameters such as the optimizer, learning rate, number of steps, and batch size, and observed the loss.

Observations:

We observed that the Adam optimizer achieved a low loss (around 3) when the learning rate was small, e.g., 0.01, but the loss went up as the learning rate was increased. The AdaGrad optimizer also reached a low loss of about 2, very close to that of the Adam optimizer. A sketch of the skip-gram model we trained is shown below.
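
For reference, below is a condensed sketch of a skip-gram word-embedding model with NCE loss, in the style of the TensorFlow 1.x word2vec tutorial; the batch-generation code is omitted, and the sizes and learning rate are illustrative placeholders rather than our exact settings.

```python
import math
import tensorflow as tf

vocabulary_size = 50000  # most frequent words kept from enwik8
embedding_size = 128
batch_size = 128
num_sampled = 64         # negative samples drawn for NCE loss
learning_rate = 0.01     # varied per test case

# Center-word ids and their context-word ids (from a skip-gram batcher).
train_inputs = tf.placeholder(tf.int32, shape=[batch_size])
train_labels = tf.placeholder(tf.int32, shape=[batch_size, 1])

# The embedding matrix being learned; lookup selects the batch rows.
embeddings = tf.Variable(
    tf.random_uniform([vocabulary_size, embedding_size], -1.0, 1.0))
embed = tf.nn.embedding_lookup(embeddings, train_inputs)

# NCE parameters: one output weight vector and bias per vocabulary word.
nce_weights = tf.Variable(
    tf.truncated_normal([vocabulary_size, embedding_size],
                        stddev=1.0 / math.sqrt(embedding_size)))
nce_biases = tf.Variable(tf.zeros([vocabulary_size]))

loss = tf.reduce_mean(
    tf.nn.nce_loss(weights=nce_weights, biases=nce_biases,
                   labels=train_labels, inputs=embed,
                   num_sampled=num_sampled, num_classes=vocabulary_size))

# Swap in AdagradOptimizer / RMSPropOptimizer / GradientDescentOptimizer
# here to reproduce the other test cases.
train_op = tf.train.AdamOptimizer(learning_rate).minimize(loss)
```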

Test Case 1:

Parameters:

Output:

Plot:

Test Case 2:

Parameters:

Output:

Plot:

Test Case 3:

Parameters:

Output:

Plot:

Test Case 4: Adam Optimizer

Parameters:

Output:

Plot:

TensorBoard:

Test Case 5: RMSProp Optimizer

Parameters:

Output:

Plot:

TensorBoard:

Test Case 6: Gradient Descent Optimizer

Parameters:

Output:

Plot:

TensorBoard:

Test Case 7: AdaGrad Optimizer

Parameters:

Output:

Plot:

TensorBoard:

Conclusion:

Both tasks have been implemented successfully.

Source Code

Video Link