Accessing a variable inside the model

What’s a good way to access a convolutional filter (the learned weights) in the model?

Say you have a model in which you declared self.conv1 = nn.Conv2d(…
During training you want to print out the weights that it learns, or more specifically get one of the filters responsible for the feature maps.

edit: the filter in question is nn.Conv2d(1, 1, 5, stride=1, padding=2, bias=False)


Try self.conv1.weight.
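A minimal sketch of what this looks like (the `Net` class name is just for illustration; the conv layer matches the one from the question):

```python
import torch
import torch.nn as nn

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        # 1 input channel, 1 output channel, 5x5 kernel, no bias,
        # as in the question above
        self.conv1 = nn.Conv2d(1, 1, 5, stride=1, padding=2, bias=False)

net = Net()
# .weight is a Parameter of shape (out_channels, in_channels, kH, kW)
print(net.conv1.weight.shape)  # torch.Size([1, 1, 5, 5])
```

Indexing like `net.conv1.weight[0, 0]` then gives you a single 5x5 filter.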


That works.

Parameter containing:
(0 ,0 ,.,.) = 
 -0.1781 -0.3750  0.0752 -0.1060  0.1356
  0.1607 -0.2711  0.1783  0.2942  0.0471
  0.1992  0.0228 -0.1627 -0.4729 -0.0560
  0.1801 -0.0715  0.0305 -0.0124 -0.1072
  0.2290  0.3730  0.1166 -0.1296  0.0746
[torch.cuda.FloatTensor of size 1x1x5x5 (GPU 0)]

How can you call the conv operator with non-learnable tensors? Suppose I have that tensor I posted earlier that I really like: I want it to stay constant, without having to worry about it being unfrozen by a p.requires_grad = True pass such as is often performed in GANs.

Or I have it as a variable declared outside the class of the model where I have direct control over it from the main execution flow.

If you define the non-learnable weight tensor as a torch.Tensor and store it using module.register_buffer, you can use it during forward by wrapping it in a new Variable.
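A sketch of the buffer approach (the `FixedConv` class name and the `padding=2` choice are illustrative, not from the thread; in current PyTorch the Variable wrapping is no longer needed since Variables were merged into tensors):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class FixedConv(nn.Module):
    def __init__(self, kernel):
        super().__init__()
        # Buffers are saved in the state_dict and moved by .cuda()/.to(),
        # but are NOT returned by .parameters(), so optimizers and
        # blanket requires_grad flips never touch them.
        self.register_buffer('filter', kernel)

    def forward(self, x):
        # The functional form takes the weight explicitly
        return F.conv2d(x, self.filter, padding=2)

m = FixedConv(torch.randn(1, 1, 5, 5))
out = m(torch.randn(1, 1, 8, 8))
print(out.shape)  # torch.Size([1, 1, 8, 8]) -- padding=2 preserves the size
```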

It’s not obvious how you can force conv2d to use it?

Wrap it in a Variable at every forward and give it to Conv2d

So I need to manually set self.conv.weight = Variable(filter) inside forward? The way you said “give it to” implies that it can be passed as an argument? I don’t see how.

It can be passed as an argument to F.conv2d.

Don’t save the variable anywhere! It should be created at every forward and used as a local Variable. I don’t really know what you want to do but I guess this might help:

filter = Variable(self.filter)  # self.filter is a registered buffer (tensor)
result = F.conv2d(input, filter, padding=2)  # padding=2 matches the original nn.Conv2d
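To confirm the filter really stays constant, here is a small check (plain tensors, current PyTorch style where Variables are merged into tensors; the shapes are illustrative):

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 1, 8, 8, requires_grad=True)
w = torch.randn(1, 1, 5, 5)  # fixed filter; requires_grad is False by default

y = F.conv2d(x, w, padding=2)
y.sum().backward()

print(w.grad)       # None: no gradient ever accumulates on the fixed filter
print(x.grad.shape) # gradients still flow back to the input
```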