How to softmax Weights of torch.nn.Parameter

Hey guys,

I was wondering, how do I softmax the weights of a torch Parameter? I want to weight my variables A and B using softmaxed weights, as shown in the code below.

class RandomClass(torch.nn.Module):
    def __init__(self):
        ...
        self._weights = torch.nn.Parameter(0.5 * torch.ones(2), requires_grad=True)
        ...

    def forward(self, A, B):
        # --- incorrect code to softmax parameter - tried this, doesn't work ---
        self._weights.data = torch.nn.functional.softmax(self._weights.data)
        # --- incorrect code to softmax parameter - tried this, doesn't work ---

        argument_scores = (self._weights[0] * A) + (self._weights[1] * B)

Hi,

You should recompute the softmaxed value in the forward pass, without trying to overwrite the original Parameter:

class RandomClass(torch.nn.Module):
    def __init__(self):
        ...
        self._weights = torch.nn.Parameter(0.5 * torch.ones(2), requires_grad=True)
        ...

    def forward(self, A, B):
        # Apply softmax on the fly; the Parameter itself stays untouched.
        softmaxed_weights = torch.nn.functional.softmax(self._weights, dim=0)

        argument_scores = (softmaxed_weights[0] * A) + (softmaxed_weights[1] * B)

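For reference, here is a minimal, self-contained sketch of the same pattern; the class name TwoTermMix and the input shapes are made up for illustration, but it shows that gradients flow back to the raw parameter even though only the softmaxed values are used:

import torch

class TwoTermMix(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # Raw, unconstrained parameter; softmax is applied on the fly in forward.
        self._weights = torch.nn.Parameter(0.5 * torch.ones(2))

    def forward(self, A, B):
        w = torch.nn.functional.softmax(self._weights, dim=0)
        return (w[0] * A) + (w[1] * B)

model = TwoTermMix()
A, B = torch.randn(3), torch.randn(3)   # assumed example inputs
out = model(A, B)
out.sum().backward()
print(model._weights.grad)   # gradients reach the raw parameter
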
Also note that you should avoid using .data in general, as it bypasses autograd and will most likely lead to subtle bugs.
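
If you ever do need to change a Parameter in place (for example, a manual re-initialization outside of training), a safer pattern than assigning to .data is to do the update under torch.no_grad(); a small sketch, with the values purely illustrative:

import torch

p = torch.nn.Parameter(0.5 * torch.ones(2))

# The in-place update is explicitly excluded from autograd,
# instead of silently bypassing it via .data.
with torch.no_grad():
    p.copy_(torch.tensor([0.2, 0.8]))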

It worked, thank you! :smile: Good point on the .data, will keep that in mind.