Softmax over H * W

I want to take the softmax over H * W, but the official Softmax2d applies the softmax over the channel dimension C at each spatial location, not over the spatial dimensions. So I did it with torch.sum, like this (strictly speaking this divides by the spatial sum without exponentiating first, so it is a plain normalization rather than a true softmax):

>>> a = torch.rand(2, 1, 3, 3)
>>> a

(0 ,0 ,.,.) =
  0.2937  0.7227  0.8050
  0.1798  0.3219  0.9238
  0.1541  0.4280  0.3620

(1 ,0 ,.,.) =
  0.1013  0.3864  0.6033
  0.2719  0.9204  0.4946
  0.2204  0.7962  0.8410
[torch.FloatTensor of size 2x1x3x3]

>>> aa = a.div(torch.sum(torch.sum(a, dim=2, keepdim=True), dim=3, keepdim=True))
>>> aa

(0 ,0 ,.,.) =
  0.0701  0.1724  0.1921
  0.0429  0.0768  0.2204
  0.0368  0.1021  0.0864

(1 ,0 ,.,.) =
  0.0219  0.0834  0.1301
  0.0586  0.1986  0.1067
  0.0475  0.1718  0.1814
[torch.FloatTensor of size 2x1x3x3]

Is there a more efficient or concise way to do this?

How about using .view to feed it into a stock softmax?

Best regards

Thomas

Could you please elaborate?

torch.softmax(x.reshape(b, c, h * w), -1).view(b, c, h, w)
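In case it helps, a minimal self-contained sketch of that one-liner (x is the input tensor, and b, c, h, w are just taken from its shape):

>>> import torch
>>> x = torch.rand(2, 1, 3, 3)
>>> b, c, h, w = x.shape
>>> # flatten H and W into one axis, softmax over it, then restore the shape
>>> y = torch.softmax(x.reshape(b, c, h * w), dim=-1).view(b, c, h, w)
>>> y.sum(dim=(2, 3))  # each 3x3 spatial map sums to 1 (up to float precision)

Equivalently, torch.softmax(x.flatten(2), dim=-1).view_as(x) does the same thing without spelling out the shape.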
