Kernel size of max pooling


(Carsten Ditzel) #1

I want to maxpool2d over an input of shape 1x512x60x80, i.e. I want to end up with a tensor of shape 1x512x1x1. Is it correct to use

   x = torch::max_pool2d(x,{60, 80}); 

and to define the non-square kernel size with the initializer-list syntax?

thanks in advance


(Juan F Montesinos) #2

You can just use adaptive max pooling, which takes the desired output size instead of a kernel size.
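For the 1x512x60x80 case above, a minimal sketch using the Python API (the C++ frontend exposes the same operation as `torch::adaptive_max_pool2d`):

```python
import torch
import torch.nn.functional as F

# Input of shape (N, C, H, W) = (1, 512, 60, 80)
x = torch.randn(1, 512, 60, 80)

# Adaptive max pooling takes the *output* size directly;
# the kernel size and stride are derived from the input shape,
# so nothing needs to be hard-coded to 60 and 80.
y = F.adaptive_max_pool2d(x, output_size=(1, 1))

print(y.shape)  # torch.Size([1, 512, 1, 1])
```

This also keeps working if the spatial size of the input changes, which a hard-coded `{60, 80}` kernel would not.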


(Carsten Ditzel) #3

Are you referring to this?

https://pytorch.org/docs/stable/_modules/torch/nn/modules/pooling.html

I am wondering why no error or exception is raised when I apply standard max pooling to a rectangular input with a square kernel that does not fit…
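One likely explanation, sketched here with the Python API: the stride defaults to the kernel size and the output size is computed in floor mode, so any input region the kernel cannot cover is silently dropped rather than triggering an error (an error is only raised when the computed output size would be zero or negative).

```python
import torch
import torch.nn.functional as F

x = torch.randn(1, 512, 60, 80)

# A square 60x60 kernel with the default stride (= kernel size):
# output width = floor((80 - 60) / 60) + 1 = 1, so the call succeeds,
# but the last 20 columns of the input are silently ignored.
y = F.max_pool2d(x, kernel_size=60)
print(y.shape)  # torch.Size([1, 512, 1, 1])

# The non-square kernel from the original question covers the
# whole 60x80 input exactly, so nothing is dropped.
z = F.max_pool2d(x, kernel_size=(60, 80))
print(z.shape)  # torch.Size([1, 512, 1, 1])
```

Both calls return the same shape, but only the `(60, 80)` kernel actually sees every input element.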