NN Module
class NN(stackSize=4)
Base class for all neural network modules.
Your models should also subclass this class.
Modules can also contain other Modules, allowing you to nest them in a tree structure. You can assign the submodules as regular attributes:
```python
import torch.nn as nn
import torch.nn.functional as F

class Model(nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.conv1 = nn.Conv2d(1, 20, 5)
        self.conv2 = nn.Conv2d(20, 20, 5)

    def forward(self, x):
        x = F.relu(self.conv1(x))
        return F.relu(self.conv2(x))
```
Submodules assigned in this way will be registered, and will have their parameters converted too when you call `to()`, etc.
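For example, a minimal sketch using the `Model` class defined above: moving the parent module also converts the parameters of its registered submodules.

```python
import torch

model = Model()

# conv1 and conv2 are registered submodules, so .to() moves their
# parameters (and buffers) along with the parent module.
device = "cuda" if torch.cuda.is_available() else "cpu"
model = model.to(device)

print(next(model.conv1.parameters()).device)  # same device as the model
```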
Source Code: NN.py
stackSize : int
- the number of frames per state the network is run on.

Bases: torch.nn.modules.module.Module
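A minimal construction sketch, assuming the input is a channel-first stack of `stackSize` frames. The batch size and the 96×96 resolution below are illustrative assumptions, not taken from the source.

```python
import torch

stackSize = 4
net = NN(stackSize=stackSize)

# Hypothetical batch of 8 states, each a stack of `stackSize` frames.
# The 96x96 resolution is an assumption; use whatever size the CNN
# was built for (its output must flatten to 256 features).
states = torch.zeros(8, stackSize, 96, 96)
(alpha, beta), v = net(states)
```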
Neural Network Weights Initialiser
m : nn.Module
- the module whose weights are initialised; when used with `Module.apply`, each submodule is passed in turn
Source code
```python
@staticmethod
def initWeights(m):
    """ Neural Network Weights Initialiser

    Parameters
    -------
    m : NN
        Reference to the NN module itself
    """
    if isinstance(m, nn.Conv2d):
        nn.init.xavier_uniform_(m.weight, gain=nn.init.calculate_gain('relu'))
        nn.init.constant_(m.bias, 0.1)
```
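A typical way to use this initialiser (a usage sketch, not shown in the source) is to pass it to `Module.apply`, which calls it on every submodule:

```python
net = NN(stackSize=4)

# apply() walks the module tree, so every nn.Conv2d layer inside the
# network gets Xavier-initialised weights and a constant 0.1 bias;
# other layer types are left untouched by the isinstance check.
net.apply(NN.initWeights)
```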
Feed forward function

x
- The network's input

Returns:

alpha : torch.Tensor
- The network's alpha value
beta : torch.Tensor
- The network's beta value
v : torch.Tensor
- The predicted value
Source code
```python
def forward(self, x):
    """ Feed forward function

    Parameters
    -------
    x
        The network's input

    Returns
    -------
    alpha : int
        The network's alpha value
    beta : int
        The network's beta value
    v
        The predicted value
    """
    x = self.cnn(x).view(-1, 256)
    v = self.v(x)
    x = self.fc(x)
    alpha = self.alpha(x) + 1
    beta = self.beta(x) + 1
    return (alpha, beta), v
```
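The `+ 1` on the two heads keeps both outputs strictly greater than one, which makes them directly usable as the concentration parameters of a Beta distribution over actions bounded to [0, 1]. The snippet below is an illustrative sketch of that use, not code from the repository; shapes and values are placeholders.

```python
import torch
from torch.distributions import Beta

# `alpha` and `beta` would come from a forward pass: (alpha, beta), v = net(x)
alpha = torch.full((8, 3), 1.5)  # placeholder values, all > 1
beta = torch.full((8, 3), 2.0)

# Each action dimension gets its own Beta(alpha, beta) over [0, 1];
# keeping alpha, beta > 1 makes every marginal density unimodal.
dist = Beta(alpha, beta)
action = dist.sample()
log_prob = dist.log_prob(action).sum(dim=-1)
```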