Hi, I'm new to PyTorch. I'm using group lasso and want to zero out the small values of the weight matrix after backprop. But when I tried to do so, the gradient of the zeroing operation seemed to cancel itself out. I checked some posts on the PyTorch forum and found that people manually set `requires_grad` to `False`.
So the problem is: I want to zero out small values in the weight matrix, but I also want the weight matrix to keep updating normally through backprop, which means I can't set `requires_grad` to `False`, right? I just want to leave out the gradient for the zeroing operation. Is there a way to do that? Or do you have a better solution?
Here is the code for the zero out operation:
for param in model.parameters():
    # mask of entries whose magnitude is below the pruning threshold
    cond = torch.abs(param.data) < threshold
    weight_ = param.data.clone()  # copy so the original tensor is untouched
    weight_[cond] = 0
    param.data = weight_
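While searching I also came across `torch.no_grad()`, which (as I understand it) keeps operations out of the autograd graph entirely. Here is a minimal runnable sketch of what I think that would look like; the `Linear` model and `threshold` here are just placeholders, and I'm not sure this is the right approach:

```python
import torch

torch.manual_seed(0)

# toy stand-in for my real model: a single weight matrix
model = torch.nn.Linear(4, 4, bias=False)
threshold = 0.1

# ... normally loss.backward() and optimizer.step() would run here ...

# zero out small weights in-place; no_grad means autograd never
# records this masking, so it can't affect later gradients
with torch.no_grad():
    for param in model.parameters():
        param[param.abs() < threshold] = 0.0

# the parameters still require grad, so backprop should update them normally
print(all(p.requires_grad for p in model.parameters()))
```

Would this do what I want, or does mutating the parameters in-place like this cause problems for the optimizer?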
Thanks in advance. Help is greatly appreciated.