How to manipulate the weight of a conv2d layer?

I want to create my own conv2d layer, say myConv2d. The only change is how the weight matrix is used; everything else should be exactly the same as PyTorch's definition.

I noticed that Conv2d inherits from _ConvNd, as stated in the documentation, so I wrote something similar, like below:

# import all other needed libraries
import torch.nn.functional as F
from torch.nn.modules.conv import _ConvNd

class myConv2d(_ConvNd):
    def __init__(...):

    def conv2d_forward(self, input, weight):
        # here I want to modify the behaviour of the weight, e.g. weight * 10
        weight = weight * 10
        return F.conv2d(input, weight, self.bias, self.stride,
                        self.padding, self.dilation, self.groups)

    def forward(self, input):
        return self.conv2d_forward(input, self.weight)

I saved the above code in a file and imported it in my network file:

from .myConv2d import myConv2d

class Net(nn.Module):

    def __init__(...):
        super(Net, self).__init__()
        self.conv1 = nn.Conv2d(...) #normal conv
        self.conv2 = myConv2d(...) #my conv

When I run the network, this error pops up:

RuntimeError: Input type (torch.cuda.FloatTensor) and weight type (torch.FloatTensor) should be the same

This means the weight in myConv2d is not moved to CUDA. I actually think myConv2d does not even belong to my network graph, because its weight stays on the CPU while the other parts of the network are on CUDA.
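(For context: only tensors registered as nn.Parameter on a module follow it through .cuda() / .to(device) calls. A minimal sketch, class name mine, just to illustrate the registration rule:)

```python
import torch
import torch.nn as nn

class Toy(nn.Module):  # hypothetical class, only to illustrate registration
    def __init__(self):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(3))  # registered: follows .cuda()
        self.raw = torch.randn(3)                   # plain tensor: NOT registered

m = Toy()
print([name for name, _ in m.named_parameters()])  # ['weight']
# m.cuda() would move m.weight but leave m.raw on the CPU.
```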

So if I want to customize a conv layer's behavior and make it part of the graph, what should I do, or what did I miss?

Thanks for your help.

Update: by manually moving the weight to CUDA I managed to run the code, but I still need someone to confirm that this approach is valid and working.

def conv2d_forward(self, input, weight):
    device = weight.device
    new_weight = weight * 10
    new_weight = new_weight.to(device)
    return F.conv2d(input, new_weight, self.bias, self.stride,
                    self.padding, self.dilation, self.groups)

I also printed out the weight, and it is updating.
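A quick way to confirm the approach is valid is to check that the gradient still reaches the original weight through the scaling. This sketch (not from the post) mimics the weight * 10 step in a forward pass:

```python
import torch

weight = torch.ones(1, requires_grad=True)
x = torch.tensor([3.0])

# forward pass using the scaled weight, as in the conv2d_forward above
out = (weight * 10 * x).sum()
out.backward()

print(weight.grad)  # tensor([30.]) -- gradient is 10 * x, so training still works
```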

  1. I think you don't have to manually move the weight to CUDA if the `weight` argument is already on CUDA.
  2. You don't have to inherit from _ConvNd; you can just inherit from nn.Module, or even write it as an ordinary Python function.
  3. I think your code should work, and I'm not from the PyTorch team :slight_smile: .
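To illustrate point 2, here is a minimal sketch of the nn.Module route (class name and shapes are mine, not from the thread). Because `weight` and `bias` are registered as nn.Parameter, .to(device) and the optimizer pick them up automatically, with no manual device handling:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaledConv2d(nn.Module):
    """Conv layer whose effective weight is weight * 10."""
    def __init__(self, in_channels, out_channels, kernel_size):
        super().__init__()
        self.weight = nn.Parameter(
            torch.randn(out_channels, in_channels, kernel_size, kernel_size))
        self.bias = nn.Parameter(torch.zeros(out_channels))

    def forward(self, input):
        # the scaling happens here; autograd tracks it like any other op
        return F.conv2d(input, self.weight * 10, self.bias)

layer = ScaledConv2d(3, 8, 3)
out = layer(torch.randn(1, 3, 16, 16))
print(out.shape)  # torch.Size([1, 8, 14, 14])
```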

Thanks for your reply.
I am a bit confused about how you can interact with the weight by inheriting from nn.Module.
Can you share some simple code as a hint?


class MyConv(nn.Conv1d):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)  # initialize the parent first
        with torch.no_grad():  # in-place init on a leaf Parameter needs no_grad
            self.bias.normal_(0, 1)
        # both "weight" and "bias" are Parameters from "nn.Conv1d"
        # and they all require grad
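And if you stay with subclassing, overriding forward is where the weight manipulation would go. A sketch along the lines of the original myConv2d idea (class name mine), subclassing nn.Conv2d instead of _ConvNd:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ScaledConv(nn.Conv2d):
    """nn.Conv2d whose forward uses self.weight * 10."""
    def forward(self, input):
        # weight/bias are inherited Parameters, so calling .cuda()/.to(device)
        # on the parent model moves them automatically
        return F.conv2d(input, self.weight * 10, self.bias,
                        self.stride, self.padding, self.dilation, self.groups)

layer = ScaledConv(3, 8, kernel_size=3, padding=1)
out = layer(torch.randn(2, 3, 16, 16))
print(out.shape)  # torch.Size([2, 8, 16, 16])
```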

Thanks, I will give this a try and test it. Thank you!