I have a tensor:
A = torch.randn(B,C,X,Y,Z)
I would like to perform a softmax activation over the channels C.
What I hope to achieve is that, at every spatial position (x, y, z), the values sum to one across the channels C.
I have the softmax function, which operates over some dimension. In my case, I imagine I should use dim=1, since that is the channel dimension:
sm = torch.nn.Softmax(dim=1)
Will this achieve the intended result?
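For reference, here is a quick sanity check I put together (the tensor sizes are just illustrative):

```python
import torch

B, C, X, Y, Z = 2, 4, 3, 3, 3  # illustrative sizes
A = torch.randn(B, C, X, Y, Z)

sm = torch.nn.Softmax(dim=1)  # softmax over the channel dimension
out = sm(A)

# Each (b, x, y, z) position should now sum to 1 across the C channels.
print(torch.allclose(out.sum(dim=1), torch.ones(B, X, Y, Z)))
```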