code:
import torch
import torch.nn as nn

class cnn(nn.Module):
    def __init__(self):
        super(cnn, self).__init__()
        self.conv1 = nn.Conv2d(3, 5, 3, 1, 1)  # 3 input channels, 5 output channels, 3x3 kernel

    def forward(self, x):
        x = self.conv1(x)
        return x

model = cnn()
# append one extra 3x3x3 kernel and one extra bias entry directly through .data
model.conv1.weight.data = torch.cat([model.conv1.weight.data, torch.rand([1, 3, 3, 3])], dim=0)
model.conv1.bias.data = torch.cat([model.conv1.bias.data, torch.ones(1, dtype=torch.float32)], dim=0)
model.conv1.out_channels = 6
I can get the correct output shape with y = model(x), but I get an error in the backward pass:

RuntimeError: Function ThnnConv2DBackward returned an invalid gradient at index 1 - got [6, 3, 3, 3] but expected shape compatible with [5, 3, 3, 3]
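For reference, this is roughly how I run the forward and backward passes (the input size and the dummy sum loss are just placeholders for my real training code):

# minimal reproduction (assumed input size; any 3-channel input works)
x = torch.rand(1, 3, 32, 32)
y = model(x)          # output now has 6 channels: shape [1, 6, 32, 32]
y.sum().backward()    # raises the RuntimeError above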
I also found some related Q&A; it seems that autograd does not pick up modifications made through "weight.data" and "bias.data". However, I want to keep the original weights and only add one more convolution kernel. What should I do?
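One idea I am considering is to re-register the concatenated tensors as fresh nn.Parameter objects instead of overwriting .data, so that autograd sees the new shapes while the original values are still kept. A rough sketch of what I mean (not sure whether this is the recommended way):

model = cnn()
with torch.no_grad():
    new_weight = torch.cat([model.conv1.weight, torch.rand(1, 3, 3, 3)], dim=0)
    new_bias = torch.cat([model.conv1.bias, torch.ones(1)], dim=0)
# wrap as new Parameters so the module registers the enlarged tensors
model.conv1.weight = nn.Parameter(new_weight)
model.conv1.bias = nn.Parameter(new_bias)
model.conv1.out_channels = 6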