A bug in torch.nn.ConstantPad1d?

Today I used torch.nn.ConstantPad1d as follows:

import torch

# padding is documented as an int or a 2-tuple, but a 4-tuple is accepted
layer = torch.nn.ConstantPad1d((2, 2, 2, 2), 0)

data = torch.randn(1, 3, 224)

result = layer(data)

print(result.shape)  # torch.Size([1, 7, 228]) -- the channel dim got padded too

I think this should be rejected as a misuse: ConstantPad1d is a 1-d padding layer, yet a 4-tuple is accepted without error and the channel dimension gets padded as well.

Why does it work?

This is because all of the ConstantPadNd modules call torch.nn.functional.pad under the hood without checking the length of the padding, and torch.nn.functional.pad doesn’t have any information about what the nn.Module’s dimensionality is. This might be sorta in the “it’s a feature not a bug” category, as it is not technically doing an “invalid” computation:
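That delegation can be sketched in plain Python. This is a hypothetical sketch, not the actual PyTorch source: the class and function names are made up, and a stub stands in for torch.nn.functional.pad. The point it illustrates is that the module stores whatever padding tuple it is given and forwards it unchanged, with no check of its length against the module's dimensionality.

```python
calls = []

def functional_pad_stub(x, pad, mode="constant", value=0):
    """Stand-in for torch.nn.functional.pad; just records what it was given."""
    calls.append((pad, mode, value))
    return x

class ConstantPad1dSketch:
    """Hypothetical stand-in for torch.nn.ConstantPad1d."""

    def __init__(self, padding, value):
        self.padding = padding  # a 4-tuple is stored as-is, never validated
        self.value = value

    def __call__(self, x):
        # delegation: whatever `padding` holds goes straight through
        return functional_pad_stub(x, self.padding, value=self.value)

layer = ConstantPad1dSketch((2, 2, 2, 2), 0)  # a 4-tuple slips through
layer("tensor")
print(calls)  # the stub received the full 4-tuple unchanged
```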

>>> import torch
>>> a = torch.randn(1, 3, 224)
>>> torch.nn.functional.pad(a, (2,2,2,2)).shape
torch.Size([1, 7, 228])
>>> torch.nn.functional.pad(a, (2,2)).shape
torch.Size([1, 3, 228])
>>> torch.nn.functional.pad(a, (2,2,3,3)).shape
torch.Size([1, 9, 228])
>>> torch.nn.functional.pad(a, (2,2,3,3,6,6)).shape
torch.Size([13, 9, 228])
>>> torch.nn.functional.pad(a, (2,2,3,3,6,6,8,8)).shape
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
RuntimeError: Padding length too large
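The pattern in the session above, where each extra pair of pad values widens one more trailing dimension until there are more pairs than dimensions, can be reproduced with a small pure-Python helper (a hypothetical name, operating on shapes only, assuming mode='constant'):

```python
def padded_shape(shape, pad):
    """Compute the output shape of constant padding.

    `pad` is read in pairs from the last dimension backward:
    (last_left, last_right, second_to_last_left, second_to_last_right, ...).
    """
    if len(pad) // 2 > len(shape):
        # mirrors the error seen in the REPL session above
        raise RuntimeError("Padding length too large")
    out = list(shape)
    for i in range(len(pad) // 2):
        # pair i pads the (i+1)-th dimension from the end
        out[-(i + 1)] += pad[2 * i] + pad[2 * i + 1]
    return tuple(out)

print(padded_shape((1, 3, 224), (2, 2)))              # (1, 3, 228)
print(padded_shape((1, 3, 224), (2, 2, 3, 3)))        # (1, 9, 228)
print(padded_shape((1, 3, 224), (2, 2, 3, 3, 6, 6)))  # (13, 9, 228)
```

So a 4-tuple passed to ConstantPad1d is not rejected; it simply pads two trailing dimensions instead of one.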

Thank you for your reply :grin:. So it seems torch.nn.functional.pad has no dimension limit with mode='constant', while other modes, like reflect or replicate, are dimensionally constrained. :rofl: