cannot assign 'torch.cuda.FloatTensor' as parameter 'mask_weight' (torch.nn.Parameter or None expected)
One more question/observation:
If I use self.mask_weight.data = self.mask_weight.data * temp, there is no error; however, self.mask_weight will not be optimized during training.
Is there any good syntax for doing this?
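For context, the assignment error can be reproduced with a minimal module (a sketch; SMConv2d here is a hypothetical stand-in with a made-up shape, not the original code): multiplying a parameter by a scalar returns a plain tensor, and nn.Module refuses to assign a plain tensor over a registered parameter.

```python
import torch
import torch.nn as nn

class SMConv2d(nn.Module):
    """Minimal stand-in: a module with one learnable mask."""
    def __init__(self):
        super().__init__()
        self.mask_weight = nn.Parameter(torch.ones(3))

m = SMConv2d()
temp = 0.5

err_msg = ""
try:
    # mask_weight * temp is a plain torch.Tensor, not an nn.Parameter,
    # so attribute assignment raises the reported TypeError.
    m.mask_weight = m.mask_weight * temp
except TypeError as e:
    err_msg = str(e)

print(err_msg)
```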
An nn.Parameter is something that the network learns (by doing SGD, for instance). This is probably why an error is reported when you try to set its value yourself.
Do you want the network to learn SMConv2d.mask_weight on its own, or do you want to be able to set it yourself? I don’t think you can have it both ways (but I may be wrong).
I don’t know for sure, but I think this would not be possible. I think (again, I am not sure) that if you want a parameter to be optimized during training, then you cannot also change it in arbitrary ways outside of the training loop.
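For what it’s worth, one pattern that is often suggested for this kind of rescaling (a sketch, assuming the update happens between optimizer steps; SMConv2d is again a hypothetical stand-in) is an in-place update inside torch.no_grad(). Because nothing is reassigned, the attribute stays the same nn.Parameter object, no TypeError is raised, and the optimizer continues to update it:

```python
import torch
import torch.nn as nn

class SMConv2d(nn.Module):
    """Hypothetical stand-in for the module in the question."""
    def __init__(self):
        super().__init__()
        self.mask_weight = nn.Parameter(torch.ones(3))

m = SMConv2d()
opt = torch.optim.SGD(m.parameters(), lr=0.1)
temp = 0.5

# In-place multiply under no_grad: no reassignment, so no TypeError,
# and the parameter object held by the optimizer is unchanged.
with torch.no_grad():
    m.mask_weight.mul_(temp)

# The parameter is still trainable afterwards:
loss = m.mask_weight.sum()
loss.backward()
opt.step()

print(m.mask_weight.requires_grad)  # True
print(m.mask_weight)
```

The in-place op (mul_) is the key difference from self.mask_weight = self.mask_weight * temp, which tries to replace the parameter with a plain tensor.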