Hey guys,
I was wondering: how do I softmax the weights of a torch Parameter? I want to weight my variables A and B using softmaxed weights, as shown in the code below.
class RandomClass(torch.nn.Module):
    def __init__(self):
        ...
        self._weights = torch.nn.Parameter(0.5 * torch.ones(2), requires_grad=True)
        ...

    def forward(self):
        # --- incorrect code to softmax parameter - tried this, doesn't work ---
        self._weights.data = torch.nn.functional.softmax(self._weights.data, dim=0)
        # --- incorrect code to softmax parameter - tried this, doesn't work ---
        argument_scores = (self._weights[0] * A) + (self._weights[1] * B)
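For context, here is one working approach I'd expect based on how autograd handles Parameters: rather than overwriting `self._weights.data` (which bypasses the graph, so the raw parameter never receives meaningful gradients), keep the parameter unconstrained and apply the softmax to a local tensor inside `forward`. The sketch below assumes A and B are passed in as forward arguments; the class and attribute names mirror the snippet above.

```python
import torch
import torch.nn.functional as F

class RandomClass(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # raw, unconstrained weights; the softmax constraint is applied in forward
        self._weights = torch.nn.Parameter(0.5 * torch.ones(2))

    def forward(self, A, B):
        # softmax the parameter tensor itself (not .data) so gradients
        # flow back through the softmax to the raw parameter
        w = F.softmax(self._weights, dim=0)
        return w[0] * A + w[1] * B

model = RandomClass()
out = model(torch.tensor(1.0), torch.tensor(3.0))
out.backward()
print(model._weights.grad)  # non-None: gradients reach the raw parameter
```

With equal raw weights the softmax gives [0.5, 0.5], so the example output is 0.5*1.0 + 0.5*3.0 = 2.0, and after an optimizer step the softmaxed mixture shifts toward whichever input the loss favors.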