Setting weights for a custom convolution layer

Hello, I am trying to implement a custom convolution based on a local binary filter.
To do so, I have to create multiple convolutions whose weights are fixed and consist of zeros, apart from the middle (which I set to -1 in the code below) and a single 1 at one location around it; for a 3x3 conv that gives 8 different masks.
However, with the code I wrote I am not able to set the weights explicitly.

count = 0
self.conv = nn.Conv2d(1, 1, m, bias=False)
self.conv = self.conv.weight
self.conv.requires_grad = False
for u in range(m*m - 1):
    for i in range(m):
        for j in range(m):
            if u == count:
                self.conv.weight[i][j] = 1
            else:
                self.conv.weight[i][j] = 0
    self.conv.weight[round(m/2)][round(m/2)] = -1
    count = count + 1
    print(self.conv)

Doing it this way only gives me this error message:

Traceback (most recent call last):
  File "C:\Users\Ron Rödel\PycharmProjects\lbcnnn\main.py", line 46, in
    o = Lbcnn(3)
  File "C:\Users\Ron Rödel\PycharmProjects\lbcnnn\main.py", line 25, in init
    self.conv.weight[i][j] = 1
AttributeError: 'Parameter' object has no attribute 'weight'

If I apply what I read about isinstance, it just seems to ignore the if statement altogether.
Here is how I wrote the isinstance check:

if isinstance(u == count, nn.Conv2d):
    self.conv.weight[i][j] = 1

I get to the print statement now, but none of the weights have been changed.

So all in all I am just confused, and I apologize for the bad formatting.

You are re-assigning self.conv to self.conv.weight:

self.conv = self.conv.weight

and are then trying to access the .weight attribute again, which will fail.
Directly assigning values to the .weight will work:

conv = nn.Conv2d(1, 1, 3)
with torch.no_grad():
    conv.weight[0, 0, 1, 1] = 0.
print(conv.weight)
# Parameter containing:
# tensor([[[[ 0.2617,  0.0339,  0.2769],
#           [ 0.2088,  0.0000, -0.2846],
#           [-0.2943, -0.0077, -0.0271]]]], requires_grad=True)
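If the whole kernel should be overwritten at once (e.g. to start from all zeros), in-place ops under torch.no_grad() also work, and requires_grad_(False) freezes the parameter so it never trains — a small sketch:

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(1, 1, 3, bias=False)
with torch.no_grad():
    conv.weight.zero_()            # overwrite every weight with 0 in place
conv.weight.requires_grad_(False)  # freeze the filter entirely
print(conv.weight.sum().item())    # 0.0
```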

Directly assigning to .weight got rid of the AttributeError; however, now I am facing an indexing error. To clarify, I only want to assign a weight to one specific location at a time.
And still, thanks for the first tip.

I don’t know what kind of indexing error you are seeing so feel free to post a code snippet reproducing the error in case you get stuck.

I can pin it down somewhat.
If I write it as print(self.conv.weight[0][0]), I get a print of the tensor.
Or if I assign it like so:

self.conv.weight[0] = 1

I get a tensor where all values are 1.
But if I write it like self.conv.weight[1] = 1, then I get this error:

IndexError: index 1 is out of bounds for dimension 0 with size 1

What I want to do is assign a value directly to one spot of the filter, so I figured that if I were to do it like so, it should work:

for i in range(m):
    for j in range(m):
        if u == count:
            with torch.no_grad():
                self.conv.weight[i][j] = 1

but I get the aforementioned error.

The shape of the filter is [out_channels, in_channels, height, width]. If you want to access the spatial dimensions (height and width), index dim 2 and dim 3 as in my example.
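Putting it together, here is a minimal sketch of building all m*m - 1 fixed masks with full 4-D indexing (the names `masks` and `positions` are mine, not from the thread); each filter is all zeros except a -1 in the center and a single 1 at one surrounding position:

```python
import torch
import torch.nn as nn

m = 3
# every (i, j) in the kernel except the center -> m*m - 1 positions
positions = [(i, j) for i in range(m) for j in range(m) if (i, j) != (m // 2, m // 2)]

masks = []
for (i, j) in positions:
    conv = nn.Conv2d(1, 1, m, bias=False)
    with torch.no_grad():
        conv.weight.zero_()                     # weight shape: [1, 1, m, m]
        conv.weight[0, 0, m // 2, m // 2] = -1  # center of the kernel
        conv.weight[0, 0, i, j] = 1             # the single surrounding 1
    conv.weight.requires_grad_(False)           # keep the filter fixed
    masks.append(conv)

print(len(masks))  # 8 for m = 3
print(masks[0].weight[0, 0])
```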

Thanks a lot, that did the trick!