I just tested this myself, and it is strange, because the docs say that padding is applied before the actual convolution.
Also, torch.nn.ConstantPad2d accepts 4 values to specify the padding. As a workaround you could use the following:
class CustomConv(torch.nn.Module):
    def __init__(self, in_channels, out_channels, kernel_size, stride=1, padding=0, **kwargs):
        super().__init__()
        # force Conv2d's own padding to 0 and pad explicitly beforehand
        self.conv = torch.nn.Conv2d(in_channels, out_channels, kernel_size, stride, padding=0, **kwargs)
        self.pad = torch.nn.ConstantPad2d(padding, 0)

    def forward(self, input_tensor):
        return self.conv(self.pad(input_tensor))
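As a quick sketch of how this could be used with asymmetric padding (ConstantPad2d takes the values in (left, right, top, bottom) order), something like this should work; the shapes here are just illustrative:

```python
import torch

class CustomConv(torch.nn.Module):
    def __init__(self, in_channels, out_channels, kernel_size, stride=1, padding=0, **kwargs):
        super().__init__()
        self.conv = torch.nn.Conv2d(in_channels, out_channels, kernel_size, stride, padding=0, **kwargs)
        self.pad = torch.nn.ConstantPad2d(padding, 0)

    def forward(self, input_tensor):
        return self.conv(self.pad(input_tensor))

# pad left=1, right=2, top=0, bottom=3, then apply a 3x3 conv
conv = CustomConv(3, 8, kernel_size=3, padding=(1, 2, 0, 3))
x = torch.randn(1, 3, 10, 10)
out = conv(x)
# padded input is 13x13 (H: 10+0+3, W: 10+1+2), so the 3x3 conv yields 11x11
print(out.shape)  # torch.Size([1, 8, 11, 11])
```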
I’ll try to find out why this is not supported…