How to set nn.Conv2d weights

For a predefined operation I need to fix the parameters of a convolution layer.
When I have this layer

self.conv1 = nn.Conv2d(1, 5, kernel_size=1, stride=1, padding=0,  bias=False)

the code works fine. However, when I set the weights of the convolutional filter as

self.conv1 = nn.Conv2d(1, 5, kernel_size=1, stride=1, padding=0,  bias=False)
self.conv1.weight = torch.nn.Parameter(torch.ones((1, 1, 5)))

I received the following error:

RuntimeError: expected stride to be a single integer value or a list of 1 values to match the convolution dimensions, but got stride=[1, 1]

Any idea how I can solve this problem?


The dimensions are not correct: you are assigning a [1, 1, 5] tensor to the weights, whereas self.conv1.weight.size() is torch.Size([5, 1, 1, 1]).


self.conv1.weight = torch.nn.Parameter(torch.ones_like(self.conv1.weight))

and it will work!
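A minimal runnable sketch of the fix above: the weight of a Conv2d has shape [out_channels, in_channels, kH, kW], so `torch.ones_like` on the existing weight automatically matches the expected torch.Size([5, 1, 1, 1]).

```python
import torch
import torch.nn as nn

# Same layer as in the question: 1 input channel, 5 output channels, 1x1 kernel.
conv1 = nn.Conv2d(1, 5, kernel_size=1, stride=1, padding=0, bias=False)
print(conv1.weight.shape)  # torch.Size([5, 1, 1, 1])

# ones_like preserves the [out_channels, in_channels, kH, kW] layout,
# so the assignment no longer triggers the stride RuntimeError.
conv1.weight = torch.nn.Parameter(torch.ones_like(conv1.weight))

x = torch.randn(1, 1, 4, 4)
out = conv1(x)
print(out.shape)  # torch.Size([1, 5, 4, 4])
```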


A great way to find out what shape the weights should have is to print the shape of the layer's existing (initialized) weights.


I want to set and fix the weights of nn.Conv1d, so this layer in the network has fixed parameters and is NOT learnable.
Does self.conv1.weight = torch.nn.Parameter(torch.ones_like(self.conv1.weight))
make the weights fixed?

No, nn.Parameters require gradients by default, so you would need to set the parameter's requires_grad attribute to False.
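A short sketch of the suggestion above, using Conv2d as in the rest of the thread: pass requires_grad=False when recreating the parameter (or call requires_grad_(False) on the existing one) so the layer is excluded from gradient computation.

```python
import torch
import torch.nn as nn

conv1 = nn.Conv2d(1, 5, kernel_size=1, bias=False)

# Fix the weights: requires_grad=False makes the parameter non-learnable.
conv1.weight = nn.Parameter(torch.ones_like(conv1.weight), requires_grad=False)
# Equivalent in-place alternative: conv1.weight.requires_grad_(False)

out = conv1(torch.randn(1, 1, 4, 4))
print(conv1.weight.requires_grad)  # False
```

When building the optimizer you can additionally filter out frozen parameters, e.g. `torch.optim.SGD((p for p in model.parameters() if p.requires_grad), lr=0.1)`, so they never receive updates.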


I want to set initial weights and have the layer start learning from those initial weights, so is it correct to use the script below:
self.conv1.weight = torch.nn.Parameter(torch.ones_like(self.conv1.weight),requires_grad=True)

Is there any way to set requires_grad=False just for one element of weight?

No, that’s not possible, as you can change the requires_grad attribute only for an entire tensor.
An alternative approach would be to either set the gradients to zero for the desired elements after the backward() operation and before the step() call, or to recreate the parameter from different tensors (which use different requires_grad attributes) via torch.cat or torch.stack. However, the latter approach might be cumbersome depending on how complicated the tensor creation would be.
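A sketch of the first approach above: zero out the gradient entries for the elements you want frozen between backward() and step(). The boolean mask used here (freezing the first output channel) is a hypothetical choice for illustration; note that this only works cleanly with plain SGD without momentum or weight decay, since stateful optimizers can still move a parameter with a zero gradient.

```python
import torch
import torch.nn as nn

conv1 = nn.Conv2d(1, 5, kernel_size=1, bias=False)
optimizer = torch.optim.SGD(conv1.parameters(), lr=0.1)

frozen = conv1.weight.detach().clone()          # remember the initial values
mask = torch.zeros_like(conv1.weight, dtype=torch.bool)
mask[0] = True                                  # hypothetical: freeze channel 0

loss = conv1(torch.randn(1, 1, 4, 4)).sum()
loss.backward()
conv1.weight.grad[mask] = 0.0                   # kill the update for frozen elements
optimizer.step()

# Frozen elements are unchanged after the step.
print(torch.equal(conv1.weight.detach()[mask], frozen[mask]))  # True
```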
