# Tensor global max pooling and average

Hi, I'm a beginner with PyTorch.
I have 16 tensors (each of size 14×14). How can I apply global max pooling to each of them and then average every 4 resulting values, returning an output tensor of size 4×1?
Thanks.

You could use an adaptive pooling layer first and then calculate the average using a `view` on the result:

```python
import torch
import torch.nn.functional as F

# 16 maps of size 14x14; add a batch dim so the channel dim can be sliced
x = torch.randn(16, 14, 14).unsqueeze(0)  # [1, 16, 14, 14]

# Global max pooling: one value per map
out = F.adaptive_max_pool2d(x, output_size=1)  # [1, 16, 1, 1]

# Calculate result manually to compare results
out_manual = torch.stack([out[:, i:i+4].mean() for i in range(0, 16, 4)])

out = out.view(out.size(0), out.size(1)//4, -1)
out = out.mean(2)

print(torch.allclose(out_manual, out))
> True
```
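For completeness, the same two steps can be wrapped in a small module; a minimal sketch (the class name and `group_size` argument are illustrative, not from the original thread):

```python
import torch
import torch.nn as nn

class GlobalMaxThenGroupAvg(nn.Module):
    """Global max pool each channel, then average groups of channels."""
    def __init__(self, group_size=4):
        super().__init__()
        self.pool = nn.AdaptiveMaxPool2d(output_size=1)
        self.group_size = group_size

    def forward(self, x):                 # x: [N, C, H, W]
        out = self.pool(x)                # [N, C, 1, 1]
        out = out.view(out.size(0), out.size(1) // self.group_size, -1)
        return out.mean(2)                # [N, C // group_size]

m = GlobalMaxThenGroupAvg()
x = torch.randn(1, 16, 14, 14)
print(m(x).shape)  # torch.Size([1, 4])
```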
Thanks, I figured it out.


@ptrblck Do you know if `F.adaptive_avg_pool*` defaults to simply calling `torch.mean` when the output size is 1? (I want to use `adaptive_avg_pool*` for convenience, but I'm a bit hesitant because of potential overhead.)

Yes, it should dispatch into `mean`.
If I remember correctly, this call was missing for `memory_format=torch.channels_last` workloads and was added before 1.7, so I would recommend using the latest PyTorch version (as always).
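To illustrate the equivalence, a quick check that `adaptive_avg_pool2d` with `output_size=1` matches a plain `mean` over the spatial dimensions:

```python
import torch
import torch.nn.functional as F

x = torch.randn(2, 3, 24, 24)

pooled = F.adaptive_avg_pool2d(x, output_size=1)  # [2, 3, 1, 1]
manual = x.mean(dim=(2, 3), keepdim=True)         # same values

print(torch.allclose(pooled, manual))
```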

Hi @ptrblck, is this the same as TensorFlow's `GlobalMaxPooling`?

Yes, I think that's the case. Based on the TF docs, it seems you might need to add the `keepdims=True` argument to get the same results and shape.

I see. What if I want a 2D tensor as a result instead of one with the same number of dimensions (3D in my case)?
Afaik, TF's `GlobalMaxPooling` outputs a 2D tensor if `keepdims=False`, which is the default.

You could `squeeze` the spatial dimensions via:

```python
import torch
import torch.nn as nn

pool = nn.AdaptiveMaxPool2d(output_size=1)

x = torch.randn(2, 3, 24, 24)
out = pool(x)
print(out.shape)
> torch.Size([2, 3, 1, 1])
out = out.squeeze(3).squeeze(2)
print(out.shape)
> torch.Size([2, 3])
```
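As an aside, the same 2D result can be obtained in one step with `torch.amax` over the spatial dimensions (available in PyTorch 1.7+); a minimal sketch:

```python
import torch

x = torch.randn(2, 3, 24, 24)

# amax reduces over the given dims; with the default keepdim=False,
# the spatial dims are dropped, yielding [2, 3] directly
out = x.amax(dim=(2, 3))
print(out.shape)  # torch.Size([2, 3])
```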