Hi,
Suppose we have a vocabulary V = V_global + V_instance_specific. V_global is fixed and V_instance_specific changes for each instance. Now, I want to get a softmax over V for each instance and use it in the loss function.
How can I do that?
You can always dynamically concatenate the logit arrays of sizes V_global and V_instance_specific into a single array of size V_global + V_instance_specific, and then apply F.log_softmax() [https://pytorch.org/docs/stable/nn.html#torch.nn.functional.softmax] followed by NLLLoss().
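A minimal sketch of what that could look like, assuming the model already produces separate logit tensors for the global and instance-specific parts of the vocabulary (the sizes and the target index here are made up for illustration):

```python
import torch
import torch.nn.functional as F

# Hypothetical sizes: 1000 global tokens plus a per-instance vocabulary of 37 tokens.
global_logits = torch.randn(1, 1000, requires_grad=True)    # logits over V_global (fixed size)
instance_logits = torch.randn(1, 37, requires_grad=True)    # logits over V_instance_specific (varies per instance)

# Concatenate to get logits over the full vocabulary V for this instance.
logits = torch.cat([global_logits, instance_logits], dim=1)  # shape: (1, 1037)
log_probs = F.log_softmax(logits, dim=1)

# Target index into the concatenated vocabulary: global indices come first,
# instance-specific indices are offset by the size of V_global.
target = torch.tensor([1005])

loss = F.nll_loss(log_probs, target)
loss.backward()
```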
Thanks for your response.
But the size of V_instance_specific is not fixed. NLLLoss() typically operates on a fixed-length vector, doesn't it?
I think NLLLoss() just picks out the negative log-likelihood of the target class (equivalently, a dot product between the given log-likelihood values and a one-hot encoding of the desired class), so it can work with inputs of any length dynamically.
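For example, here is a small sketch (with made-up vocabulary sizes) showing that the class dimension passed to NLLLoss() can differ from one instance to the next:

```python
import torch
import torch.nn.functional as F

criterion = torch.nn.NLLLoss()

# Two instances with different total vocabulary sizes; the loss just picks
# out -log p(target) for each, so the class dimension may change per call.
for num_classes, target_idx in [(1010, 3), (1042, 1030)]:
    logits = torch.randn(1, num_classes, requires_grad=True)
    log_probs = F.log_softmax(logits, dim=1)
    loss = criterion(log_probs, torch.tensor([target_idx]))
    loss.backward()
```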