# Softmax Across 3D tensor

I have a tensor:

`A = torch.randn(B,C,X,Y,Z)`

I would like to perform a softmax activation over the channels `C`.

What I hope to achieve is that, at every spatial position, the values sum to one along the channel dimension `C`.

I have the softmax function, which operates over some dimension. In my case, I imagine I would use `dim=1`, since that is the channel dimension.

```python
sm = torch.nn.Softmax(dim=1)
```

Will this achieve the intended result?

Hi,

Yes, it will.
For instance, if you have a tensor `x` of shape `[1, 3, 2, 2, 2]`, then to validate the softmax value at position `[0, 0, 0, 0, 0]` you need the following calculation:

```python
torch.exp(x[0, 0, 0, 0, 0]) / (torch.exp(x[0, 0, 0, 0, 0]) + torch.exp(x[0, 1, 0, 0, 0]) + torch.exp(x[0, 2, 0, 0, 0]))
```
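As a quick sanity check, here is a minimal runnable sketch (the shape values below are arbitrary placeholders) that verifies both properties: the channel sums equal one everywhere, and the manual formula matches the output of `Softmax(dim=1)`:

```python
import torch

torch.manual_seed(0)
B, C, X, Y, Z = 2, 3, 4, 4, 4  # arbitrary example sizes
A = torch.randn(B, C, X, Y, Z)

sm = torch.nn.Softmax(dim=1)
out = sm(A)

# Summing over the channel dimension gives 1 at every (b, x, y, z) location.
channel_sums = out.sum(dim=1)
print(torch.allclose(channel_sums, torch.ones_like(channel_sums)))  # True

# Manual check at one position, matching the formula above.
manual = torch.exp(A[0, 0, 0, 0, 0]) / torch.exp(A[0, :, 0, 0, 0]).sum()
print(torch.allclose(manual, out[0, 0, 0, 0, 0]))  # True
```

Note that the output shape is unchanged (`[B, C, X, Y, Z]`); only the values along `dim=1` are normalized.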

Best