'Conv2dU' object has no attribute '_backward_hooks'

Hi,

I tried to define a custom Conv2d class, but I am getting the error `'Conv2dU' object has no attribute '_backward_hooks'`. I would greatly appreciate it if someone could explain the cause of this error.

Thanks


Does your custom Conv2D class inherit from nn.Module or torch.autograd.Function?

It inherits from nn.Module, and I decorate the forward method with @autocast (from torch.cuda.amp import autocast).

Would you mind sharing the source code for it? It’s a bit hard to visualize the problem without the source.

import torch.nn as nn
from torch.cuda.amp import autocast


class Conv2dUnary(nn.Module):
    def __init__(
        self,
        in_channels,
        out_channels,
        kernel_size,
        stride=1,
        padding=0,
        dilation=1,
        bitwidth=2,
        batch_size=1,
        kernel=None,
        bias=None,
    ):
        super(Conv2dUnary).__init__()

        self.in_channels = in_channels
        self.out_channels = out_channels
        self.kernel_h, self.kernel_w = kernel_size
        self.kernel_d = in_channels
        self.stride = stride
        self.padding = padding
        self.dilation = dilation
        self.bitwidth = bitwidth
        self.batch_size = batch_size
        self.kernel = kernel
        self.bias = bias

    @autocast
    def forward(self, input):
        # (forward implementation omitted)
        return output

I am trying to perform convolution by representing data in the unary domain. I initially wrote a standalone method that performs unary conv2d and works fine; now I am trying to convert it into an nn.Module so I can use it as a layer in a network. I am not sure about the root cause of the issue.

Shouldn’t this be,

super(Conv2dUnary, self).__init__()

Perhaps you’ve not fully initialized the parent class?
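For reference, the failure mode can be reproduced without torch. Below is a minimal sketch, assuming a stand-in `Module` class whose `__init__` creates the hook dictionary, the same way `nn.Module.__init__` registers `_backward_hooks` and friends. With the one-argument `super(Cls)` form, the parent's `__init__` never runs, so attribute lookups on the hook dicts fail later.

```python
class Module:
    """Stand-in for torch.nn.Module: __init__ creates the hook dict."""
    def __init__(self):
        self._backward_hooks = {}


class Broken(Module):
    def __init__(self):
        # One-argument super() returns an unbound super object;
        # Module.__init__ is never called, so _backward_hooks is never set.
        super(Broken).__init__()


class Fixed(Module):
    def __init__(self):
        # Two-argument form (or just super().__init__() in Python 3)
        # actually invokes the parent initializer.
        super(Fixed, self).__init__()


print(hasattr(Broken(), "_backward_hooks"))  # False
print(hasattr(Fixed(), "_backward_hooks"))   # True
```

The same applies to any `nn.Module` subclass: calling `super().__init__()` before assigning submodules or parameters is what sets up the internal registries that hooks and the module machinery rely on.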


Yes, I missed it. Thanks a lot @AlphaBetaGamma96. I will remove the code, as it contains some research information.