Hi all,
I’m working on a project where I’d like to freeze some weights and only update the others during training. I’d like to use a hook registered with register_forward_pre_hook to achieve this, since I’m currently learning about hooks. Below is my code:
def hook(self, model, inputs):
    with torch.no_grad():
        model.weight = model.weight * self.sparse_mask[self.type[model]]

def register_hook(self, module):
    self.handle = module.register_forward_pre_hook(self.hook)
However, I get this error:

TypeError: cannot assign 'torch.FloatTensor' as parameter 'weight' (torch.nn.Parameter or None expected)
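From the message, I think the problem is that model.weight * self.sparse_mask[self.type[model]] produces a plain torch.Tensor, and nn.Module refuses to assign a plain tensor to an attribute that is registered as an nn.Parameter. Would mutating the parameter in place be the right fix? A minimal sketch of what I have in mind, i.e. replacing the body of my hook (assuming the mask is a tensor with the same shape as model.weight):

def hook(self, model, inputs):
    # Multiply in place so model.weight stays an nn.Parameter
    # instead of being replaced by a plain tensor.
    with torch.no_grad():
        model.weight.mul_(self.sparse_mask[self.type[model]])

With a binary mask this would re-zero the frozen entries before every forward pass, although I guess the optimizer could still change them between steps until the hook runs again.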
I’ve lost a day trying to fix this problem and was wondering if someone could help me. Any ideas are welcome!
Thanks!