Hi there,
I have a simple question. I want to build a simple DNN, but have the number of linear layers passed in as a parameter, so that users can define a variable number of linear layers as they see fit. But I have not figured out how to do this in PyTorch. For example, I can easily define a three-layer DNN like this:
class DNN(nn.Module):
    def __init__(self, nb_units, input_dim, output_dim):
        super(DNN, self).__init__()
        self.fc1 = nn.Linear(input_dim, nb_units)
        self.fc2 = nn.Linear(nb_units, nb_units)
        self.fc3 = nn.Linear(nb_units, output_dim)

    def forward(self, x):
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        x = F.sigmoid(self.fc3(x))
        return x
Now I also want to be able to pass the number of layers in as a parameter. I have tried this solution:
class DNN(nn.Module):
    def __init__(self, nb_layers, nb_units, input_dim, output_dim):
        super(DNN, self).__init__()
        self.nb_layers = nb_layers
        fc = []
        for i in range(nb_layers):
            if i == 0:
                fc.append(nn.Linear(input_dim, nb_units))
            elif i == nb_layers - 1:
                fc.append(nn.Linear(nb_units, output_dim))
            else:
                fc.append(nn.Linear(nb_units, nb_units))
        self.fc = fc

    def forward(self, x):
        for i in range(self.nb_layers):
            if i == self.nb_layers - 1:
                x = F.sigmoid(self.fc[i](x))
            else:
                x = F.relu(self.fc[i](x))
        return x
You can see that I essentially put the layer definitions in a list and use them one by one in the forward call. But with this approach, PyTorch gave me an error. Can anyone give me some help with this problem? How can I do what I want in PyTorch?
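My own guess is that PyTorch never "sees" layers stored in a plain Python list, so their parameters are not registered with the module. Would wrapping the list in nn.ModuleList be the right fix? Here is a sketch of what I mean (just my guess, not verified):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DNN(nn.Module):
    def __init__(self, nb_layers, nb_units, input_dim, output_dim):
        super(DNN, self).__init__()
        fc = []
        for i in range(nb_layers):
            # first layer takes input_dim, last layer emits output_dim
            in_features = input_dim if i == 0 else nb_units
            out_features = output_dim if i == nb_layers - 1 else nb_units
            fc.append(nn.Linear(in_features, out_features))
        # nn.ModuleList (unlike a plain list) registers each layer as a
        # submodule, so its parameters show up in model.parameters()
        self.fc = nn.ModuleList(fc)

    def forward(self, x):
        for i, layer in enumerate(self.fc):
            if i == len(self.fc) - 1:
                x = torch.sigmoid(layer(x))  # sigmoid on the output layer
            else:
                x = F.relu(layer(x))
        return x

model = DNN(nb_layers=4, nb_units=32, input_dim=10, output_dim=1)
out = model(torch.randn(5, 10))  # batch of 5 samples
```

With this version, len(list(model.parameters())) gives 8 (a weight and a bias for each of the 4 layers), whereas with the plain list it would be 0. Is that the intended way to do it?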
Thanks a lot!