Why is there no global pooling in the PyTorch framework?

Hello! Why is there no global pooling in the PyTorch framework?
I noticed there are the usual pooling methods like MaxPool or AvgPool,
but no global pooling. Why?

And if you don't mind, how does global pooling actually work? I saw it in a paper about CNNs, but I'm a bit confused about its underlying mechanism. :hushed:

Use the torch.mean or torch.max operator.
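For example, here is a minimal sketch of global pooling as a plain tensor reduction over the spatial dimensions (the shapes are just placeholders):

```python
import torch

# a batch of CNN feature maps: (batch, channels, height, width)
x = torch.randn(8, 64, 7, 7)

# global average pooling: average over the spatial dims (H, W),
# leaving one value per channel -> shape (8, 64)
gap = x.mean(dim=(2, 3))

# global max pooling: take the maximum over the spatial dims
gmp = torch.amax(x, dim=(2, 3))

print(gap.shape)  # torch.Size([8, 64])
print(gmp.shape)  # torch.Size([8, 64])
```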

As @smth says, you can use torch.mean for global average pooling. Another way is torch.nn.AvgPool2d(kernel_size=feature_size): you find the spatial size of the input feature map and apply pooling with a kernel as big as the whole feature map.
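For instance, a short sketch of that approach (the 7x7 feature size is just an assumption for illustration):

```python
import torch
import torch.nn as nn

x = torch.randn(8, 64, 7, 7)  # assume 7x7 feature maps

# kernel covers the entire feature map, so each channel is
# reduced to a single 1x1 output
pool = nn.AvgPool2d(kernel_size=(x.size(2), x.size(3)))
out = pool(x)         # shape (8, 64, 1, 1)
out = out.flatten(1)  # shape (8, 64), matching x.mean(dim=(2, 3))
```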

Thanks @smth! :+1:

Thanks @vmirly! :+1: