torch.nn.MaxPool3d returns junk indices

MaxPool3d(return_indices=True) returns garbage index values, on both CPU and GPU.

I mentioned this in a previous post, but it’s not clear to me that the problem is being addressed.

Here is an example:

import torch
import torch.nn as nn
from torch.autograd import Variable

pool3d = nn.MaxPool3d(kernel_size=2, stride=2, return_indices=True)
img3d = Variable(torch.rand(1, 1, 4, 4, 4))
out, indices = pool3d(img3d)

The indices tensor looks like this:

Variable containing:
(0 ,0 ,0 ,.,.) = 
  4.6117e+18  4.6117e+18
  4.2950e+09  2.3224e+18

(0 ,0 ,1 ,.,.) = 
  4.2950e+09  4.2950e+09
 -8.9845e+18  4.2950e+09
[torch.LongTensor of size 1x1x2x2x2]

The input volume has 4 × 4 × 4 = 64 elements, so every index should lie in the range [0, 63]. These values can't be right.
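For reference, the indices returned by MaxPool3d are flat offsets into the D × H × W spatial volume of each channel. A minimal sanity check, assuming a PyTorch build where this bug has been fixed, would confirm every index falls in the valid range:

```python
import torch
import torch.nn as nn

# 2x2x2 pooling over a 1x1x4x4x4 input produces a 1x1x2x2x2 output.
pool3d = nn.MaxPool3d(kernel_size=2, stride=2, return_indices=True)
img3d = torch.rand(1, 1, 4, 4, 4)
out, indices = pool3d(img3d)

# Each index should address one of the 4 * 4 * 4 = 64 spatial elements.
assert indices.min().item() >= 0
assert indices.max().item() <= 63

# The indices should recover the pooled maxima from the flattened input.
flat = img3d.view(1, 1, -1)
recovered = flat.gather(2, indices.view(1, 1, -1)).view_as(out)
assert torch.equal(recovered, out)
```

On a broken build, the first two assertions fail with huge or negative index values like the ones shown above.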

This sounds like a bug. Can you open an issue on the PyTorch GitHub repository?

Okay. I opened an issue here: