wpron
(wiktor)
September 14, 2017, 8:53am
1
Hi there,
I am new to PyTorch, and I want to create a very simple linear layer with custom connections.
As an example, I want to create the following connectivity:
My code does not work:
class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.l1 = nn.Linear(n_feats, 3, bias=False)
        self.t = Variable(torch.randn(N_BATCH, 2))
        self.neurons = [nn.Linear(2, 1), nn.Linear(2, 1)]

    def forward(self, x):
        x = F.relu(self.l1(x))
        self.t[:, 0] = self.neurons[0](x[:, :2])
        self.t[:, 1] = self.neurons[1](x[:, 1:])
        return self.t
Besides being ugly, I expect this model to have (n_feats x 3 + 2 x 2 + 2) parameters, but it reports only (n_feats x 3) parameters.
What is the correct way to do this?
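A note on the parameter count: sub-modules kept in a plain Python list are not registered with the parent module, so their parameters never show up in net.parameters(). Wrapping them in nn.ModuleList fixes that. A minimal sketch (n_feats is a placeholder value here, and the output is built with cat instead of writing into a preallocated tensor, which can interfere with autograd):

import torch
import torch.nn as nn
import torch.nn.functional as F

n_feats = 4  # placeholder value for illustration

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.l1 = nn.Linear(n_feats, 3, bias=False)
        # nn.ModuleList registers the sub-layers, so their
        # parameters appear in net.parameters()
        self.neurons = nn.ModuleList([nn.Linear(2, 1), nn.Linear(2, 1)])

    def forward(self, x):
        x = F.relu(self.l1(x))
        t0 = self.neurons[0](x[:, :2])
        t1 = self.neurons[1](x[:, 1:])
        return torch.cat([t0, t1], 1)

net = Net()
print(sum(p.numel() for p in net.parameters()))  # n_feats*3 + 2*2 + 2 = 18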
alexis-jacq
(Alexis David Jacq)
September 14, 2017, 1:35pm
2
You can either force the weights of the non-existent connections to zero (a sketch of that option follows after the code below), or use two separate fully connected layers, which I think is what you are trying to do. The second solution seems more elegant:
import torch
import torch.nn as nn
import torch.nn.functional as F

class Net(nn.Module):
    def __init__(self):
        super(Net, self).__init__()
        self.fc1 = nn.Linear(2, 1)
        self.fc2 = nn.Linear(2, 1)

    def forward(self, x):
        # split x into single columns, then recombine the
        # overlapping pairs each sub-layer should see
        splitted = torch.split(x, 1, 1)
        x1 = torch.cat(splitted[:2], 1)
        x2 = torch.cat(splitted[1:], 1)
        x1 = F.relu(self.fc1(x1))
        x2 = F.relu(self.fc2(x2))  # fc2 here, not fc1
        return torch.cat([x1, x2], 1)
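For the first option, here is a minimal sketch of the masking idea (MaskedLinear is a made-up helper, not a PyTorch class): a single Linear whose weight is multiplied by a fixed 0/1 mask, so the zeroed connections contribute nothing and also receive zero gradient during training.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MaskedLinear(nn.Module):
    def __init__(self, mask):
        super(MaskedLinear, self).__init__()
        out_feats, in_feats = mask.size()
        self.linear = nn.Linear(in_feats, out_feats)
        # a buffer moves with the module (e.g. .cuda()) but is not trained
        self.register_buffer('mask', mask)

    def forward(self, x):
        # masked weight entries contribute nothing and get zero gradient
        return F.linear(x, self.linear.weight * self.mask, self.linear.bias)

# output 0 sees inputs 0 and 1; output 1 sees inputs 1 and 2
mask = torch.Tensor([[1., 1., 0.],
                     [0., 1., 1.]])
layer = MaskedLinear(mask)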
Lee_Jim
(Lee Jim)
September 18, 2018, 5:55pm
3
There is no such thing as ugly or elegant here; writing it either way is fine.
uhmbg
March 27, 2020, 6:55pm
4
How would the backward pass work? The same way, or is there a change to be made?
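For what it's worth, the backward should need no change: autograd differentiates torch.split and torch.cat like any other op, so each sub-layer only receives gradient from its own slice of the input. A quick check, reusing the Net from post 2:

net = Net()
x = torch.randn(5, 3)  # 3 input features, as the split/cat above expects
net(x).sum().backward()
print(net.fc1.weight.grad)  # populated automatically
print(net.fc2.weight.grad)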