I want to apply softmax to each channel of a tensor, and I was expecting the sum of the elements in each channel to be one, but that is not what I get.
This post shows how to do it for a tensor, but only in a batch-wise manner.
Can someone help me understand what I should do to apply softmax to each channel so that the sum within each channel is 1?
import torch
from torch.autograd import Variable
import torch.nn.functional as F

# A small example: batch of 1, 2 channels, 3x3 spatial size.
A = Variable(torch.rand(1, 2, 3, 3))
print(A)

# Sum of all elements after softmax over the batch dimension (dim=0).
print(F.softmax(A, dim=0).sum())
# Sum of all elements after softmax over the channel dimension (dim=1).
print(F.softmax(A, dim=1).sum())
Variable containing:
(0 ,0 ,.,.) =
  0.5912  0.3723  0.0399
  0.6684  0.8080  0.6185
  0.1265  0.2973  0.5427

(0 ,1 ,.,.) =
  0.3595  0.4951  0.2176
  0.0471  0.8907  0.7543
  0.0262  0.8329  0.6792
[torch.FloatTensor of size 1x2x3x3]

Variable containing:
 18
[torch.FloatTensor of size 1]

Variable containing:
 9
[torch.FloatTensor of size 1]
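
To clarify what I am after: I want the nine entries of each channel to be normalised among themselves, so that every channel sums to 1. The snippet below is only a rough sketch of the behaviour I expect; flattening the spatial dimensions with view and applying softmax along that flattened dimension is just my own guess at one possible way, not something I know to be the right approach:

import torch
from torch.autograd import Variable
import torch.nn.functional as F

A = Variable(torch.rand(1, 2, 3, 3))
n, c, h, w = A.size()

# Flatten each 3x3 channel into a row of 9 values, apply softmax along that
# row, then restore the original 1x2x3x3 shape.
per_channel = F.softmax(A.view(n, c, h * w), dim=2).view(n, c, h, w)

# Each channel should now sum to 1, so this should print a 1x2 tensor of ones.
print(per_channel.view(n, c, -1).sum(dim=2))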