Initialize weights of convolution layer


I want to assign filter = [[[[1, 2, 3], [4, 5, 6], [7, 8, 9]]]]
as the weight of a convolution layer. Is this command OK?

self.conv1 = nn.Conv2d(...)
self.conv1.weight = Parameter(torch.tensor(filter, dtype=torch.float32))


I would recommend wrapping the assignment in a with torch.no_grad() block, just to make sure Autograd won’t complain later.
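Something like this minimal sketch, where the channel counts, kernel size, and input shape are placeholders I picked for illustration (note the weight must be a tensor of shape (out_channels, in_channels, kH, kW), not a nested list):

```python
import torch
import torch.nn as nn
from torch.nn import Parameter

# Hypothetical 3x3 kernel; shape (out_channels=1, in_channels=1, 3, 3)
filter = torch.tensor([[[[1., 2., 3.], [4., 5., 6.], [7., 8., 9.]]]])

conv = nn.Conv2d(in_channels=1, out_channels=1, kernel_size=3, bias=False)
with torch.no_grad():
    conv.weight = Parameter(filter)

# The layer now applies exactly this kernel; on an all-ones 3x3 input
# the single output value is the sum of the kernel entries (45).
x = torch.ones(1, 1, 3, 3)
out = conv(x)
```

The new Parameter still has requires_grad=True by default, so the weights remain trainable after the assignment.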

Thanks a lot.
Is it ok?

class Net(Module):
    def __init__(self):
        super(Net, self).__init__()
        self.convZ = nn.Conv2d(...)
        with torch.no_grad():
            self.convZ.weight = Parameter(filter)
        self.pool = nn.AvgPool2d(...)
        self.fc1 = nn.Linear(...)

    def forward(self, x):
        out = torch.atan(self.convZ(x))
        out = self.pool(out)
        out = torch.flatten(out, 1)  # flatten before the linear layer
        out = torch.atan(self.fc1(out))
        return out

Best Regards

Looks alright, although I wouldn’t assign output_after_convZ back to self.convZ, but rather to a new attribute.
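For reference, a minimal sketch of what that advice means (the layer arguments and names here are just illustrative):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.convZ = nn.Conv2d(1, 1, 3)

    def forward(self, x):
        # Bad: self.convZ = torch.atan(self.convZ(x)) would overwrite the
        # layer with its output tensor, breaking every later forward pass.
        # Good: keep the activation in a local variable (or a new attribute):
        out = torch.atan(self.convZ(x))
        return out
```

Because self.convZ is never reassigned, the module can be called repeatedly and its parameters stay registered for the optimizer.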


Yes, you are right. It is just a mistake. I edited it.
Many thanks, you are the best :shamrock:
