Change layer size

I am trying to change the size of a linear layer by adding one new neuron after a few epochs.

import copy
import torch

net = torch.nn.Linear(4, 4)
optimizer = torch.optim.SGD(net.parameters(), lr=0.01)
#training here ...
backSave = copy.deepcopy(net)  # assuming backSave is a copy of the trained layer, saved before resizing
#append one randomly initialised output neuron to the saved weight and bias
net.weight = torch.nn.Parameter(torch.cat((backSave.weight, torch.randn(1, 4)), 0))
net.bias = torch.nn.Parameter(torch.cat((backSave.bias, torch.randn(1)), 0))
#training here again, but the weights do not change anymore

However, when I train the model again after resizing it, the weights no longer change; the model is effectively not being trained anymore.
I found a workaround: I have to recreate the optimizer after changing the size:

optimizer = torch.optim.SGD(net.parameters(), lr=0.01)
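
Putting it all together, the whole flow looks like this (the data and the MSE loss are made up just to have something to train on, they are not my real training code):

import copy
import torch

net = torch.nn.Linear(4, 4)
optimizer = torch.optim.SGD(net.parameters(), lr=0.01)

# a few epochs with the original 4-output layer (toy data)
for _ in range(5):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(net(torch.randn(8, 4)), torch.randn(8, 4))
    loss.backward()
    optimizer.step()

# grow the layer to 5 outputs, keeping the trained values
backSave = copy.deepcopy(net)
net.weight = torch.nn.Parameter(torch.cat((backSave.weight, torch.randn(1, 4)), 0))
net.bias = torch.nn.Parameter(torch.cat((backSave.bias, torch.randn(1)), 0))

# the part I would like to avoid: rebuild the optimizer over the new parameters
optimizer = torch.optim.SGD(net.parameters(), lr=0.01)

# training continues, now with 5 outputs (toy data again)
for _ in range(5):
    optimizer.zero_grad()
    loss = torch.nn.functional.mse_loss(net(torch.randn(8, 4)), torch.randn(8, 5))
    loss.backward()
    optimizer.step()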

But I do not like this solution. Is there another way to do it without redefining the optimizer?
Thank you

I don't think you can change the size of your model during training. The optimizer is set up to update a specific set of parameters. When you add new parameters, the optimizer doesn't know they exist, and thus it needs to be reinitialized with the newly added parameters in order to work again.
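
To make that concrete, here is a small sketch: the optimizer's param_groups keep references to the Parameter objects that existed when it was constructed, so the freshly created ones are simply never stepped.

import torch

net = torch.nn.Linear(4, 4)
optimizer = torch.optim.SGD(net.parameters(), lr=0.01)

old_weight = net.weight
net.weight = torch.nn.Parameter(torch.cat((old_weight.detach(), torch.randn(1, 4)), 0))

tracked = [p for group in optimizer.param_groups for p in group['params']]
print(any(p is net.weight for p in tracked))  # False: the resized Parameter is unknown to the optimizer
print(any(p is old_weight for p in tracked))  # True: the optimizer still points at the old, now unused one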

I understand that the problem is that I redefine the parameters, so I have to register them with the optimizer again.
Is it possible to resize a layer without redefining its parameters? I tried to use resize_, but it did not work because the parameter requires grad.
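
This is the kind of failure I mean; a minimal reproduction (the exact error message may vary between versions):

import torch

net = torch.nn.Linear(4, 4)
try:
    net.weight.resize_(5, 4)  # autograd refuses to resize a leaf tensor that requires grad
except RuntimeError as err:
    print(err)  # something along the lines of "cannot resize variables that require grad"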

I've found someone else who does exactly what you want to do (I think), and he also has to re-initialize the optimizer every time. Remember to resize the bias as well, not only the weights.

He is resizing weights during training: resizing weights during training

However, as seen in the comments, you cannot just call resize_; you also have to copy the old weights over. It's computationally heavy, but it should work in your case as well. A rough sketch of that copy-over idea follows below. Hope it helps!
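
This is my own paraphrase, not the code from the linked thread: instead of calling resize_ in place, it builds a fresh, larger layer and copies the trained values into it, which achieves the same thing.

import torch

old = torch.nn.Linear(4, 4)
# ... train old for a few epochs ...

# build a larger layer and copy the trained values into its first rows
new = torch.nn.Linear(4, 5)
with torch.no_grad():
    new.weight[:4] = old.weight
    new.bias[:4] = old.bias

# the extra row keeps the default initialisation of the new layer;
# the optimizer still has to be (re)built over new.parameters() afterwards
optimizer = torch.optim.SGD(new.parameters(), lr=0.01)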