nn.Parameter re-define question

Hi,
Please check my sampler code as follows.
My questions are:

  1. With the line self.mask = self.mask_weight * temp, do self.mask_weight and self.mask share the same memory? Will they be updated in sync?
  2. Why am I not able to write self.mask_weight = self.mask_weight * temp to re-assign the parameter?
  3. Does this mean that an nn.Parameter can only be defined once, and cannot be re-defined?
import torch
import torch.nn as nn
import torch.nn.functional as F

class SMConv2d(nn.Module):
    def __init__(self, in_channels, out_channels, kernel_size, padding=1, stride=1):
        super(SMConv2d, self).__init__()
        # store the conv arguments used later in forward()
        self.padding = padding
        self.stride = stride

        self.mask_weight = nn.Parameter(torch.Tensor(out_channels, in_channels, kernel_size, kernel_size))
        nn.init.constant_(self.mask_weight, 1)

    def compute_mask(self, temp):
        self.mask = self.mask_weight * temp
        # self.mask_weight = self.mask_weight * temp

        return self.mask

    def forward(self, x, temp=1):
        masked_weight = self.compute_mask(temp)
        out = F.conv2d(x, masked_weight, stride=self.stride, padding=self.padding)
        return out
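Regarding question 1, a quick check (a minimal sketch, not the module above) suggests the answer is no: multiplying by temp allocates a new tensor, so self.mask and self.mask_weight do not share storage. They are, however, connected through autograd, so gradients computed from the mask still flow back into the parameter:

```python
import torch
import torch.nn as nn

w = nn.Parameter(torch.ones(2, 2))
mask = w * 3  # the multiplication allocates a NEW tensor

# Different storage, so the two are not kept "in sync" by shared memory
print(mask.data_ptr() == w.data_ptr())  # False

# ...but mask stays connected to w in the autograd graph, so a loss
# computed on mask still produces gradients for w
mask.sum().backward()
print(w.grad)  # all entries are 3
```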

What is the error you got when you tried to do this?

Hi,
@gphilip

I got the following error:

cannot assign 'torch.cuda.FloatTensor' as parameter 'mask_weight' (torch.nn.Parameter or None expected)

One more question/observation:
If I use the code self.mask_weight.data = self.mask_weight.data * temp, there is no error;
however, self.mask_weight will not be optimized during training.
Is there any good syntax to do this?
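One possible pattern (a sketch under the assumption that you want to scale the parameter outside of training): do the update in place under torch.no_grad(), so the attribute stays the same nn.Parameter object and the optimizer keeps tracking it:

```python
import torch
import torch.nn as nn

w = nn.Parameter(torch.ones(3))
opt = torch.optim.SGD([w], lr=0.1)

# An in-place update under no_grad keeps w registered as the same
# nn.Parameter, so it is still seen and updated by the optimizer
with torch.no_grad():
    w.mul_(2.0)  # w is now all 2s

loss = (w ** 2).sum()   # grad of loss w.r.t. w is 2*w = 4
loss.backward()
opt.step()              # w = 2.0 - 0.1 * 4.0 = 1.6
print(w)
```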

An nn.Parameter is something that the network learns (by doing SGD, for instance). This is probably why an error is reported when you try to set its value yourself.

Do you want the network to learn SMConv2d.mask_weight on its own, or do you want to be able to set it yourself? I don’t think you can have it both ways (but I may be wrong).

I don’t know for sure, but I think this would not be possible. I think (again, I am not sure) that if you want some parameter to be optimized by the training, then you cannot also change it in arbitrary ways outside of the training.
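To illustrate the error and one accepted alternative (a minimal sketch with a toy module, not the original SMConv2d): assigning a plain tensor to an attribute that already holds an nn.Parameter raises a TypeError, while re-wrapping the result in nn.Parameter is accepted:

```python
import torch
import torch.nn as nn

class M(nn.Module):
    def __init__(self):
        super().__init__()
        self.mask_weight = nn.Parameter(torch.ones(2))

m = M()

# Assigning a plain tensor over an existing nn.Parameter attribute
# raises TypeError: nn.Module.__setattr__ expects Parameter or None here
try:
    m.mask_weight = m.mask_weight * 2
except TypeError as e:
    print(e)

# Re-wrapping the result in nn.Parameter is accepted; note this creates
# a brand-new Parameter object (an existing optimizer would still hold
# a reference to the old one)
m.mask_weight = nn.Parameter(m.mask_weight.detach() * 2)
print(m.mask_weight)  # all entries are 2
```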

@gphilip
Thank you!

I agree with what you said. However, since you said “but I may be wrong”, I would like to confirm and make things clear.

According to what you said, an nn.Parameter can be defined, initialized, and referred to, but cannot be re-assigned.

Is there any official explanation/documentation about my question?
Are you an official PyTorch developer?

I am not.

Did you try asking Google?

I have roughly searched Google; however, it didn’t answer such a specific question.