Hey @davidweb,
I'm not 100% sure on this, but as soon as you call `remove()`, the zeroed weights will get updated again.
Have you tried the following:
- Just pass `model.parameters()` to the optimizer, as usual.
- If I understand you correctly, you don't want the Linear modules to be updated during finetuning, right? In that case, continue training the full model, but freeze the layers that you don't want updated; see for example this link.
I think this should do what you want. As long as you do not remove the pruning, the pruned weights won't be updated, and freezing disables updates for the layers you choose. Let me know whether this fixes your issue.
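To make the two points concrete, here is a minimal sketch combining both: pruning one layer and freezing another, then passing `model.parameters()` to the optimizer as usual. The two-layer model and the hyperparameters are hypothetical, just for illustration:

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

torch.manual_seed(0)

# Hypothetical two-layer model, just for illustration
model = nn.Sequential(nn.Linear(4, 8), nn.Linear(8, 2))

# Prune 50% of the first Linear's weights; the mask keeps them at zero
prune.l1_unstructured(model[0], name="weight", amount=0.5)

# Freeze the second Linear so it is not updated during finetuning
for p in model[1].parameters():
    p.requires_grad = False
frozen_before = model[1].weight.clone()

# Pass model.parameters() to the optimizer as usual;
# frozen parameters get no gradient and are simply skipped
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

model(torch.randn(3, 4)).sum().backward()
optimizer.step()
model(torch.randn(3, 4))  # recompute the effective (masked) weight

# Pruned entries stay zero as long as prune.remove() is not called
assert (model[0].weight[model[0].weight_mask == 0] == 0).all()
# The frozen layer is unchanged after the optimizer step
assert torch.equal(model[1].weight, frozen_before)
```

The key detail is that pruning reparametrizes the layer as `weight = weight_orig * weight_mask`, so the masked entries receive zero gradient and stay at zero; only calling `prune.remove()` would bake the mask in and let those entries train again.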