How do I apply the softmax function channel-wise over a convolutional layer's output? Basically, I want to add a keypoint block after the 4th convolutional block in ResNet-18.

ResNet-18 consists of four sequential convolution blocks, and the output of the fully-connected (FC) layer following the last convolution block is used as the global feature. The output feature map of convolution block-l is denoted by X_l ∈ R^{C×W×H}.
Then I have to add a local branch, called the keypoint block, which has an architecture similar to a convolution block, to localize the distribution of key points. I have to apply a channel-wise softmax to the output feature map of the keypoint block to estimate the density of a key point over the different image locations. This softmax output is used as a channel-wise keypoint mask M_l, which lets me take the element-wise product of X_l and M_l. The resulting local feature f_l of block-l is then obtained by a channel-wise summation over locations. But I do not know how to do this in PyTorch. Can anyone please help me?
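
To make the shapes concrete, here is roughly what I have so far; the softmax line is the part I am unsure about (all names and shapes below are just placeholders of my own):

```python
import torch
import torch.nn.functional as F

N, C, H, W = 2, 512, 7, 7
x_l = torch.randn(N, C, H, W)  # output feature map X_l of conv block-l
k = torch.randn(N, C, H, W)    # output of the keypoint block (same shape)

# channel-wise softmax -> keypoint mask M_l; is dim=1 the right dimension?
m_l = F.softmax(k, dim=1)

# element-wise product, then sum over locations -> local feature f_l, (N, C)
f_l = (x_l * m_l).sum(dim=(2, 3))
print(f_l.shape)  # torch.Size([2, 512])
```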

If you hadn’t tagged ptrblck, I would have recommended using .reshape before and after the softmax.

Best regards

Thomas

P.S.: More seriously, it’s not recommended to tag people for general questions. You never know if someone who you didn’t tag would have great advice and is discouraged by not being on your list.


Thanks, and sorry. I have edited my question and removed the tag. This is the first time I have asked a question here. Sorry again.

Will F.softmax(x, 1) do the job for a channel-wise softmax?

Let’s say you have an N × C × H × W tensor.
If you mean channel-wise as in “for each pixel, a probability distribution over the channels”, then `F.softmax(x, 1)` is for you.
If you want “for each channel, a probability distribution over the pixels”, you should use `F.softmax(x.reshape(x.size(0), x.size(1), -1), 2).view_as(x)` instead. That is, you reshape to merge the last two dimensions, take the softmax over the merged dimension, and reshape back to match the input (for that last step you know `view_as` will work, because the softmax output is contiguous).
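
For example, a quick check with a made-up tensor shows which dimension sums to one in each case, and how the mask would then be used for your local feature:

```python
import torch
import torch.nn.functional as F

x = torch.randn(2, 4, 3, 3)  # N, C, H, W

# (a) for each pixel, a distribution over the channels
p_channels = F.softmax(x, dim=1)
print(p_channels.sum(dim=1))      # all ones, shape (2, 3, 3)

# (b) for each channel, a distribution over the pixels
p_pixels = F.softmax(x.reshape(x.size(0), x.size(1), -1), dim=2).view_as(x)
print(p_pixels.sum(dim=(2, 3)))   # all ones, shape (2, 4)

# if I understand your description correctly, the keypoint mask needs (b);
# the local feature is then a channel-wise sum over locations, shape (2, 4)
f_l = (x * p_pixels).sum(dim=(2, 3))
```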

Best regards

Thomas


Thank you so much; now I get it.