What’s a good way to access a convolutional filter (the learned weights) in the model?
Say you have a model in which you declared self.conv1 = nn.Conv2d(…
During training you want to print out the weights that it learns, or more specifically get one of the filters responsible for the feature maps.
edit: The filter in question is nn.Conv2d(1, 1, 5, stride=1, padding=2, bias=False)
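To make the question concrete, here is a minimal sketch of inspecting a learned filter (the `Net` class name is made up for illustration; the layer spec matches the edit above):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(1, 1, 5, stride=1, padding=2, bias=False)

model = Net()
# The learned weights live in the layer's .weight parameter.
# Shape is (out_channels, in_channels, kH, kW) -> (1, 1, 5, 5) here.
print(model.conv1.weight.shape)

# Grab the first filter; detach() if you only want to look at it.
first_filter = model.conv1.weight[0].detach()
print(first_filter)
```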
How can you call the conv operator with non-learnable tensors? Suppose I have a tensor (like the one I posted earlier) that I really like; I want it to stay constant, without having to worry about it being unfrozen by a p.requires_grad = True pass over the parameters, as is often done in GAN training.
Or I could keep it as a variable declared outside the model class, where I have direct control over it from the main execution flow.
If you define the non-learnable weight as a torch.Tensor and store it with module.register_buffer, it won't appear in the module's parameters, and you can use it during forward by wrapping it in a new Variable.
So I need to manually set self.conv.weight = Variable(filter) inside forward? You said "give it to" — does that mean it can be passed as an argument? I don't see how.
Don't save the variable anywhere! It should be created at every forward and used as a local Variable. I don't really know exactly what you want to do, but I guess this might help:
filter = Variable(self.filter) # self.filter is a registered buffer (tensor)
result = F.conv2d(input, filter)
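Putting the suggestion together, here is a minimal sketch (the FixedConv class name and the box-blur filter values are made up for illustration; in current PyTorch the Variable wrapper is no longer needed, and a registered buffer tensor can be passed to F.conv2d directly):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FixedConv(nn.Module):
    def __init__(self):
        super().__init__()
        # register_buffer: saved in state_dict and moved by .to()/.cuda(),
        # but never returned by parameters(), so a requires_grad flip over
        # the parameters (as in GAN training loops) cannot touch it.
        self.register_buffer("filter", torch.ones(1, 1, 5, 5) / 25.0)

    def forward(self, x):
        # The buffer is used as a constant weight at every forward.
        return F.conv2d(x, self.filter, padding=2)

m = FixedConv()
out = m(torch.randn(1, 1, 8, 8))
print(out.shape)  # padding=2 with a 5x5 kernel preserves spatial size
```

Because the filter is a buffer rather than an nn.Parameter, it never receives gradients and is invisible to optimizers built from model.parameters().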