Custom Conv2d Layer

I am trying to define a custom CNN layer with some manipulations to the weights before the forward pass. Below is my implementation:

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.nn import Linear

class Myconv(Linear):
    def __init__(self, in_channels, out_channels, kernel_size, iter, stride=1, padding=0, dilation=1, groups=1, bias=True):
        super(Myconv, self).__init__(in_channels, out_channels, kernel_size)
        self.iter = iter
        # one learnable parameter tensor per iteration
        self.params = nn.ParameterList([nn.Parameter(torch.randn(out_channels), requires_grad=True) for i in range(self.iter)])

    def modify(self, param):
        # ... some manipulation of the parameters to build the weight tensor ...
        return W

    @property
    def modifyW(self):
        W = self.modify(self.params)
        return W

    def forward(self, input):
        return F.conv2d(input, self.modifyW, self.bias)

However, during the forward pass I get the error: ‘Myconv’ object has no attribute ‘modifyW’.
I implemented the same class for a linear layer using F.linear() and it works fine.

Any help would be appreciated!

Thanks!

Hi,
Could it be something weird because of the missing capital in the super call that uses some other variable of your code? Or is that just a copy-paste error?

@albanD sorry for the typo while posting here, but the error still occurs.

No problem. Weird indeed.
And if you replace the property with a function call, does it work fine?

@albanD If I remove @property then I get this error: argument ‘weight’ (position 2) must be Tensor, not method

Sorry, I meant remove the @property and replace self.modifyW with self.modifyW().
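Something like this, keeping the rest of your class unchanged:

    def modifyW(self):
        return self.modify(self.params)

    def forward(self, input):
        return F.conv2d(input, self.modifyW(), self.bias)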
Still, the second error means Python found the method; it’s only the property lookup that failed…

@albanD Thanks. Yes, it works. But can you explain why this is the case for conv layers and not for the linear layer?

Hi,

I have no idea; I wanted you to check that to make sure it’s not a problem with code indentation or something similar that would make your definition appear as if it were outside the class.
Glad it works with a function call. But I don’t know why the property does not…

Also, your conv class inherits from Linear. Is that expected?

@albanD I have checked the code for indentation errors. In fact, I just used the same class written for the linear layer, replacing F.linear() in forward() with F.conv2d() and making sure modifyW() returns the weights in the correct shape.
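For completeness, the working version now looks roughly like this (the body of modify() is elided; it has to return a weight tensor of shape (out_channels, in_channels // groups, kH, kW) for F.conv2d):

class Myconv(Linear):
    def __init__(self, in_channels, out_channels, kernel_size, iter, stride=1, padding=0, dilation=1, groups=1, bias=True):
        super(Myconv, self).__init__(in_channels, out_channels, kernel_size)
        self.iter = iter
        self.params = nn.ParameterList([nn.Parameter(torch.randn(out_channels), requires_grad=True) for i in range(self.iter)])

    def modify(self, param):
        # ... weight manipulation, elided ...
        return W

    # plain method instead of @property
    def modifyW(self):
        return self.modify(self.params)

    def forward(self, input):
        return F.conv2d(input, self.modifyW(), self.bias)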

I will update here if I manage to make it work with @property.

Best