How to perform more flexible global pooling?

I know how to perform pooling across batches with different scales (the images within one batch must share the same scale, but the scale can differ between batches).
But if I want to perform the pooling operation within a batch whose images have different scales, how can I do that?

See nn.AdaptiveMaxPool2d: http://pytorch.org/docs/nn.html#adaptivemaxpool2d
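In case it helps, here is a minimal sketch of how adaptive pooling copes with batches at different scales (the tensor sizes below are made-up examples):

```python
import torch
import torch.nn as nn

# Adaptive pooling produces a fixed output size regardless of the input's
# spatial dimensions, so batches at different scales all map to the same shape.
pool = nn.AdaptiveMaxPool2d(output_size=(1, 1))  # global max pooling

small_batch = torch.randn(4, 64, 17, 23)   # one batch at one scale
large_batch = torch.randn(4, 64, 120, 95)  # another batch, different scale

print(pool(small_batch).shape)  # torch.Size([4, 64, 1, 1])
print(pool(large_batch).shape)  # torch.Size([4, 64, 1, 1])
```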

Thanks!
I have tried it, but it requires that all input images in the same batch have the same shape.
I want to keep the images' original shapes without any scaling, so the images in a batch have different shapes, and there seems to be no corresponding solution.

You can call the functional avg_pool2d and specify the kernel size per call, extracting it from the input tensor's size, i.e. F.avg_pool2d(x, kernel_size=(x.size(2), x.size(3))).
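For example, here is a rough sketch of that idea, assuming each differently-sized image is processed as its own 1-image batch and the results are stacked (the image sizes are arbitrary):

```python
import torch
import torch.nn.functional as F

def global_avg_pool(x):
    # The kernel size is read off the input itself,
    # so any spatial size works.
    return F.avg_pool2d(x, kernel_size=(x.size(2), x.size(3)))

# Images of different shapes cannot share one tensor, so pool them
# one at a time (each as a 1-image batch) and stack the results.
images = [torch.randn(1, 64, 17, 23),
          torch.randn(1, 64, 120, 95),
          torch.randn(1, 64, 50, 50)]

pooled = torch.cat([global_avg_pool(img) for img in images])
print(pooled.shape)  # torch.Size([3, 64, 1, 1])
```

After pooling, every image yields a tensor of the same shape, so the outputs can be concatenated into a single batch for the rest of the network.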