Forward/backward temperature softmax implementation

Hello,

Is there a forward/backward implementation of the temperature softmax function?

softmax(z)_i = e^(z_i / T) / sum_j e^(z_j / T)

Thank you

Hi,

I don’t think there is a special one, but you can use the regular softmax:

import torch

t = 0.1                    # temperature
inp = torch.rand(2, 10)    # a batch of logits

# divide the logits by the temperature, then apply the regular softmax
out = torch.softmax(inp / t, dim=1)
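
In case it helps, here is a minimal hand-written sketch of the same computation, just to show it matches the formula in the question (the helper name temperature_softmax and the max-subtraction for numerical stability are only illustrative):

def temperature_softmax(z, T):
    # scale the logits by the inverse temperature
    scaled = z / T
    # subtract the row-wise max for numerical stability (does not change the result)
    scaled = scaled - scaled.max(dim=1, keepdim=True).values
    exp = torch.exp(scaled)
    # normalize so each row sums to 1
    return exp / exp.sum(dim=1, keepdim=True)

print(torch.allclose(temperature_softmax(inp, t), out))  # True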

Hi, @albanD,

I am not quite sure, but do we need to do inp = inp / t first (i.e. assign the scaled result back) in this case for the later gradient calculation? Or which way is better?

Thanks,
Nick

Hi,

Autograd will differentiate the whole computation, so you can arrange it whichever way you want; the gradient will be computed the same either way. 🙂
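
To make that concrete, a quick sketch (the variable names are just for illustration) showing that both arrangements produce the same gradient:

import torch

t = 0.1
inp = torch.rand(2, 10)
w = torch.randn(2, 10)   # arbitrary weights, only to get a non-trivial scalar loss

# variant 1: divide inside the softmax call
a = inp.clone().requires_grad_(True)
(torch.softmax(a / t, dim=1) * w).sum().backward()

# variant 2: "assign back" the scaled tensor first, then call softmax
b = inp.clone().requires_grad_(True)
b_scaled = b / t
(torch.softmax(b_scaled, dim=1) * w).sum().backward()

print(torch.allclose(a.grad, b.grad))  # True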
