Hi, I'm trying to use functional.linear to write a custom linear layer whose weight is a combination of two parameters, but the output loss is always NaN. I don't know how to solve this problem, or whether this approach is even possible.
My code is:
class Linear(nn.Module):
    def __init__(self, in_dim, out_dim):
        super(Linear, self).__init__()
        self.weight1 = nn.Parameter(torch.ones(in_dim, out_dim))
        self.weight2 = nn.Parameter(torch.randn(out_dim, in_dim))
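The forward pass was omitted above, so here is one self-contained guess at a complete version. The way the two parameters are combined is an assumption (an elementwise product); the key detail is that `F.linear` expects the weight in `(out_dim, in_dim)` shape, so `weight1` has to be transposed before it can be combined with `weight2`:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Linear(nn.Module):
    def __init__(self, in_dim, out_dim):
        super(Linear, self).__init__()
        self.weight1 = nn.Parameter(torch.ones(in_dim, out_dim))
        self.weight2 = nn.Parameter(torch.randn(out_dim, in_dim))

    def forward(self, x):
        # Hypothetical combination: elementwise product.
        # weight1 is (in_dim, out_dim), so transpose it to the
        # (out_dim, in_dim) layout that F.linear expects.
        weight = self.weight1.t() * self.weight2
        return F.linear(x, weight)
```

Autograd differentiates through the combination automatically, so a NaN loss usually comes from elsewhere, e.g. a shape mismatch silently broadcasting, an exploding learning rate, or the all-ones initialization of `weight1` producing very large activations.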
Thanks so much. I also need to find the source code of the backward function of conv2d so I can write my own conv2d layer, but I can't find it. Can you show me how to implement the conv2d backward pass in PyTorch?
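For what it's worth, conv2d's backward lives in PyTorch's C++/cuDNN backends, so there is no single Python file to read. However, `torch.nn.grad` exposes `conv2d_input` and `conv2d_weight`, which compute the two backward gradients. A minimal sketch of a custom conv2d with an explicit backward built on them (stride/padding left at their defaults for brevity; not anyone's actual layer):

```python
import torch
import torch.nn.functional as F

class MyConv2d(torch.autograd.Function):
    """conv2d with a hand-written backward pass."""

    @staticmethod
    def forward(ctx, input, weight):
        ctx.save_for_backward(input, weight)
        return F.conv2d(input, weight)

    @staticmethod
    def backward(ctx, grad_output):
        input, weight = ctx.saved_tensors
        # Gradient w.r.t. the input (a transposed convolution with weight)
        grad_input = torch.nn.grad.conv2d_input(input.shape, weight, grad_output)
        # Gradient w.r.t. the weight (a correlation of input with grad_output)
        grad_weight = torch.nn.grad.conv2d_weight(input, weight.shape, grad_output)
        return grad_input, grad_weight

# quick smoke run
x = torch.randn(1, 2, 5, 5, requires_grad=True)
w = torch.randn(3, 2, 3, 3, requires_grad=True)
MyConv2d.apply(x, w).sum().backward()
```

The gradients this produces can be checked against autograd's own by running `F.conv2d` on detached copies of the same tensors and comparing `.grad`.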