
AI Cyber Security ICP 2

https://github.com/Hiresh12/UMKC/tree/master/CSEE5590%20-%20AI%20Cyber%20Security/ICP2/Source

Inference and Validation

Here we will see how to make predictions with the trained DNN model and how to validate it on the test set. For validation, we calculate the accuracy of the model on the test set.

Downloading the training and test sets
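A minimal sketch of this step, assuming the Fashion-MNIST dataset from torchvision; the actual dataset used in the source notebook may differ.

```python
import torch
from torchvision import datasets, transforms

# Convert images to tensors and normalize to mean 0.5, std 0.5
transform = transforms.Compose([transforms.ToTensor(),
                                transforms.Normalize((0.5,), (0.5,))])

# Download the training set and the test set
trainset = datasets.FashionMNIST('~/.pytorch/F_MNIST_data/', download=True,
                                 train=True, transform=transform)
trainloader = torch.utils.data.DataLoader(trainset, batch_size=64, shuffle=True)

testset = datasets.FashionMNIST('~/.pytorch/F_MNIST_data/', download=True,
                                train=False, transform=transform)
testloader = torch.utils.data.DataLoader(testset, batch_size=64, shuffle=True)
```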

Calculating the loss on the training set and performing backpropagation

We used the negative log-likelihood loss (nn.NLLLoss()) and the Adam optimizer.
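A sketch of the training loop under those choices. The small nn.Sequential network and the learning rate of 0.003 are placeholders (the real ICP model may differ); `trainloader` comes from the snippet above.

```python
from torch import nn, optim

# Placeholder network: the final LogSoftmax layer is what makes nn.NLLLoss() appropriate
model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(),
                      nn.Linear(256, 10), nn.LogSoftmax(dim=1))

criterion = nn.NLLLoss()                              # negative log-likelihood loss
optimizer = optim.Adam(model.parameters(), lr=0.003)  # learning rate is an assumption

epochs = 10
for e in range(epochs):
    running_loss = 0
    for images, labels in trainloader:
        images = images.view(images.shape[0], -1)     # flatten 28x28 images to 784
        optimizer.zero_grad()                         # clear gradients from the last step
        loss = criterion(model(images), labels)       # loss on the training batch
        loss.backward()                               # backpropagation
        optimizer.step()                              # weight update
        running_loss += loss.item()
    print(f"Epoch {e+1}: training loss {running_loss/len(trainloader):.3f}")
```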

Performing inference with the model [model(images)] and calculating its accuracy on the test set.
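A sketch of the inference and validation step, reusing `model` and `testloader` from the snippets above.

```python
import torch

accuracy = 0
with torch.no_grad():                                 # no gradients needed for inference
    for images, labels in testloader:
        images = images.view(images.shape[0], -1)     # flatten the test images
        ps = torch.exp(model(images))                 # log-probabilities -> probabilities
        top_p, top_class = ps.topk(1, dim=1)          # most likely class per image
        equals = top_class == labels.view(*top_class.shape)
        accuracy += torch.mean(equals.type(torch.FloatTensor))

print(f"Test accuracy: {accuracy/len(testloader):.3f}")
```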

Output for 10 epochs:

Dropout:

Dropout is used to reduce overfitting of the model to the training set, which would otherwise hurt the model's performance on the test/validation set.

DNN model with Dropout
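A sketch of what such a model can look like; the layer sizes and the dropout probability here are assumptions, not necessarily those used in the ICP.

```python
import torch.nn.functional as F
from torch import nn

class Classifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(784, 256)
        self.fc2 = nn.Linear(256, 128)
        self.fc3 = nn.Linear(128, 64)
        self.fc4 = nn.Linear(64, 10)
        self.dropout = nn.Dropout(p=0.2)   # randomly zero 20% of activations while training

    def forward(self, x):
        x = x.view(x.shape[0], -1)                      # flatten the input images
        x = self.dropout(F.relu(self.fc1(x)))
        x = self.dropout(F.relu(self.fc2(x)))
        x = self.dropout(F.relu(self.fc3(x)))
        return F.log_softmax(self.fc4(x), dim=1)        # no dropout on the output layer
```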

Calculating the accuracy of the model with dropout
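One common way to do this is to switch dropout off with model.eval() during evaluation and back on with model.train() afterwards; a sketch reusing the Classifier and `testloader` from above.

```python
import torch

model = Classifier()
# ... train the model as in the earlier loop ...

model.eval()                     # switch off dropout for evaluation
accuracy = 0
with torch.no_grad():
    for images, labels in testloader:
        ps = torch.exp(model(images))                 # Classifier flattens internally
        top_p, top_class = ps.topk(1, dim=1)
        equals = top_class == labels.view(*top_class.shape)
        accuracy += torch.mean(equals.type(torch.FloatTensor))
model.train()                    # switch dropout back on for further training

print(f"Accuracy with dropout model: {accuracy/len(testloader):.3f}")
```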

Saving and Loading Models

The trained model can be stored with torch.save(): torch.save(model.state_dict(), 'checkpoint.pth')

We can load the state dict with torch.load(): state_dict = torch.load('checkpoint.pth'). To load it into the network, call model.load_state_dict(state_dict).

To load the model, the network must be defined with the same tensor sizes as the saved checkpoint. For example, if the checkpoint was saved from a different architecture and we define

model = Network(784, 10, [400, 200, 100])

then model.load_state_dict(state_dict) will throw an error because the tensor sizes are wrong!
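Putting save and load together, a minimal sketch using the Classifier sketched earlier (the course's Network helper is not reproduced here).

```python
import torch

model = Classifier()                              # same architecture that will be saved
torch.save(model.state_dict(), 'checkpoint.pth')  # store only the learned parameters

state_dict = torch.load('checkpoint.pth')         # read the weights back
new_model = Classifier()                          # must match the saved architecture exactly
new_model.load_state_dict(state_dict)             # raises a size-mismatch error otherwise
```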

Loading Image Data:

I loaded a different dataset whose images vary in size and applied transforms such as resize, crop, and rotation so that the images can be used by the DNN model.

transform = transforms.Compose([transforms.Resize(255), transforms.CenterCrop(224), transforms.ToTensor()])
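A sketch of applying this transform through ImageFolder; 'path/to/data' is a placeholder for the actual image directory.

```python
from torchvision import datasets, transforms

# ImageFolder expects one sub-folder per class inside the data directory
transform = transforms.Compose([transforms.Resize(255),
                                transforms.CenterCrop(224),
                                transforms.ToTensor()])
dataset = datasets.ImageFolder('path/to/data', transform=transform)
```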

Data Loaders

With the ImageFolder loaded, you pass it to a DataLoader. The DataLoader takes a dataset (such as one from ImageFolder) and returns batches of images and the corresponding labels. You can set parameters such as the batch size and whether the data is shuffled after each epoch: dataloader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)
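A small sketch of pulling one batch from the DataLoader, reusing the dataset from the previous snippet; the printed shapes assume the 224x224 transform above.

```python
import torch

dataloader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)

images, labels = next(iter(dataloader))   # grab one batch of images and labels
print(images.shape)                       # e.g. torch.Size([32, 3, 224, 224])
print(labels.shape)                       # torch.Size([32])
```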