Custom CNN layers using torch.nn.functional

Hi, I am trying to use F.linear to build a simple custom layer whose weight is the combination of two parameters, but the output loss is always NaN. I don't know how to solve this problem, or whether this approach is possible at all.
My code is:
class Linear(nn.Module):
    def __init__(self, in_dim, out_dim):
        super(Linear, self).__init__()
        self.weight1 = nn.Parameter(torch.ones(in_dim, out_dim))

    def forward(self, x):
        return F.linear(x, self.weight1.mm(self.weight2))

Your code generally looks good. However, it won't return a tensor of the shape [batch_size, out_dim], so I've changed your Module a bit to give the desired output.

class MyLinear(nn.Module):
    def __init__(self, in_dim, h, out_dim):
        super(MyLinear, self).__init__()
        self.weight1 = nn.Parameter(torch.ones(out_dim, h))
        self.weight2 = nn.Parameter(torch.randn(h, in_dim))

    def forward(self, x):
        return F.linear(x, self.weight1.mm(self.weight2))

lin = MyLinear(in_dim=10, h=4, out_dim=2)

x = torch.randn(4, 10)
output = lin(x)

The output and gradients look good. Could you give a small code example resulting in the NaN loss?
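For reference, the check can be reproduced like this (the module definition is repeated so the snippet runs on its own). The product `weight1.mm(weight2)` has shape `(out_dim, in_dim)`, which is exactly what `F.linear` expects:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MyLinear(nn.Module):
    def __init__(self, in_dim, h, out_dim):
        super(MyLinear, self).__init__()
        # effective weight is weight1 @ weight2, shape (out_dim, in_dim)
        self.weight1 = nn.Parameter(torch.ones(out_dim, h))
        self.weight2 = nn.Parameter(torch.randn(h, in_dim))

    def forward(self, x):
        return F.linear(x, self.weight1.mm(self.weight2))

lin = MyLinear(in_dim=10, h=4, out_dim=2)
x = torch.randn(4, 10)
output = lin(x)
print(output.shape)  # torch.Size([4, 2])

# backprop through both weight factors; the gradients stay finite
output.mean().backward()
print(torch.isfinite(lin.weight1.grad).all().item())  # True
print(torch.isfinite(lin.weight2.grad).all().item())  # True
```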


Yes, thank you very much. I have modified my code, and I found a webpage link which helps with writing a custom layer.

Thanks so much. I also need to find the source code of the backward function of conv2d so I can write my own conv2d layer, but I can't find it. Can you show me how to implement the conv2d backward pass in PyTorch?
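One common pattern for this (a sketch of the public API route, not PyTorch's internal C++ source) is to wrap `F.conv2d` in a `torch.autograd.Function` and write the backward yourself using the helpers in `torch.nn.grad`, which compute the input and weight gradients of a convolution:

```python
import torch
import torch.nn.functional as F

class MyConv2dFn(torch.autograd.Function):
    """conv2d with a hand-written backward pass.

    The gradient formulas are delegated to torch.nn.grad.conv2d_input /
    conv2d_weight; you could replace those calls with your own math.
    """

    @staticmethod
    def forward(ctx, input, weight):
        ctx.save_for_backward(input, weight)
        return F.conv2d(input, weight)

    @staticmethod
    def backward(ctx, grad_output):
        input, weight = ctx.saved_tensors
        grad_input = grad_weight = None
        if ctx.needs_input_grad[0]:
            grad_input = torch.nn.grad.conv2d_input(input.shape, weight, grad_output)
        if ctx.needs_input_grad[1]:
            grad_weight = torch.nn.grad.conv2d_weight(input, weight.shape, grad_output)
        return grad_input, grad_weight

# double precision so gradcheck's numerical comparison is reliable
x = torch.randn(1, 3, 8, 8, dtype=torch.double, requires_grad=True)
w = torch.randn(4, 3, 3, 3, dtype=torch.double, requires_grad=True)

# gradcheck compares the hand-written backward against numerical gradients
ok = torch.autograd.gradcheck(MyConv2dFn.apply, (x, w))
print(ok)  # True
```

Returning `None` for inputs that don't need gradients is the usual convention in custom `Function`s; it lets autograd skip work when, say, the input tensor has `requires_grad=False`.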