Can I call softmax on all tensor elements?

Currently you need to specify the softmax dimension via dim. This is nice, and it will complain if you don’t set dim. But what are the dimensions -1 and -2 used for?

I expected that dim=-1 would apply softmax over the whole tensor. Is this possible?

It would be interesting to know the same for log_softmax as well.

I don’t know about doing it with softmax directly, but you can always reshape and go back.

Thanks, reshaping the tensor definitely works, but I just wanted to check if this is the only way.

I personally prefer explicit code (as long as it doesn’t add a huge overhead), and in this case it is pretty readable:

data = data.view(-1).softmax(0).view(*data.shape)

(Note that data.shape on the right-hand side is evaluated before the assignment rebinds data, so it still refers to the original shape.)

It is even simpler if you don’t care about recovering the previous dimensions:

data = data.view(-1).softmax(0)
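The same flatten-and-restore pattern should answer the log_softmax part of your question too, since Tensor.log_softmax takes a dim argument just like softmax (a sketch under the same assumptions):

data = data.view(-1).log_softmax(0).view(*data.shape)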

Also, dim=-1 refers to the last dimension, which is equivalent to dim=data.ndimension()-1, and dim=-2 refers to the second-to-last dimension.
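To make that concrete, here is a small sketch (the 2x3 shape is just illustrative):

import torch

data = torch.randn(2, 3)

# dim=-1 normalizes along the last dimension, so each row sums to 1
print(data.softmax(dim=-1).sum(dim=-1))  # ~tensor([1., 1.])

# dim=-2 normalizes along the second-to-last dimension, so each column sums to 1
print(data.softmax(dim=-2).sum(dim=-2))  # ~tensor([1., 1., 1.])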

✔️ for the last-dimension tip. I forgot that.