I have a two-layer fully connected network. I would like to binarize the output of the first layer: that is, I want a binary-step activation function in the forward pass and the ReLU gradient in the backward pass. How can I implement this? Any idea would be appreciated.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.fc1 = nn.Linear(28 * 28, 64)
        self.fc2 = nn.Linear(64, 10)
        nn.init.xavier_normal_(self.fc1.weight)
        nn.init.xavier_normal_(self.fc2.weight)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        x = self.fc2(x)
        x = F.log_softmax(x, dim=1)
        return x

net = Net()
But this does not work: it returns binary values during both the forward and backward passes.
I also tried to define a new class with the forward and backward passes implemented separately…
I wrote it like this, but it gives me an error (AttributeError: 'Binary_AF' object has no attribute 'dim').
Does it make sense to write the code this way? If yes, how can I fix the error; if no, I would appreciate any suggestions.
class Binary_AF:
    def __init__(self, x):
        self.x = x

    def forward(self):
        self.x[self.x <= 0] = 0
        self.x[self.x > 0] = 1
        return self.x

    def backward(self):
        return self.x
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.fc1 = nn.Linear(28 * 28, 64)
        self.fc2 = nn.Linear(64, 10)
        nn.init.xavier_normal_(self.fc1.weight)
        nn.init.xavier_normal_(self.fc2.weight)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        y = Binary_AF(x)
        y = self.fc2(y)
        y = F.log_softmax(y, dim=1)
        return y
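For reference, one common way to get a binary step in the forward pass while backpropagating a surrogate (here, ReLU-style) gradient is a custom `torch.autograd.Function` with both `forward` and `backward` defined as static methods. This is a minimal sketch of that straight-through-estimator idea, not necessarily the only way to do it; the name `BinarySTE` is my own:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BinarySTE(torch.autograd.Function):
    """Binary step forward, ReLU gradient backward (straight-through estimator)."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        # Binary step: 1 where x > 0, else 0
        return (x > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        x, = ctx.saved_tensors
        # ReLU derivative: pass the gradient through only where the input was positive
        return grad_output * (x > 0).float()

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.fc1 = nn.Linear(28 * 28, 64)
        self.fc2 = nn.Linear(64, 10)
        nn.init.xavier_normal_(self.fc1.weight)
        nn.init.xavier_normal_(self.fc2.weight)

    def forward(self, x):
        # Apply the custom function via .apply(), never by constructing an instance
        x = BinarySTE.apply(self.fc1(x))
        x = self.fc2(x)
        return F.log_softmax(x, dim=1)
```

Note that `BinarySTE.apply(...)` returns a tensor, so `F.log_softmax` no longer receives a plain Python object, which is what caused the `'Binary_AF' object has no attribute 'dim'` error: `self.fc2` was being given a `Binary_AF` instance instead of a tensor.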