'Same' convolution in pytorch

Hi,

Is there a simple way to do a ‘same’ convolution in PyTorch, such that the output spatial dimensions equal those of the input?

Best


Use the padding parameter. The formula for the output size is given in the shape section at the bottom of the torch.nn.Conv2d documentation. So, e.g., padding=k//2 for an odd kernel size k with the default stride and dilation.
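For example, with stride 1 and dilation 1 (a minimal sketch; the channel counts and input size are arbitrary):

```python
import torch
import torch.nn as nn

# With stride=1 and dilation=1, padding=k//2 preserves the spatial size
# for an odd kernel size k (here k=3, so padding=1).
k = 3
conv = nn.Conv2d(in_channels=16, out_channels=32, kernel_size=k, padding=k // 2)

x = torch.randn(1, 16, 28, 28)
out = conv(x)
print(out.shape)  # torch.Size([1, 32, 28, 28]) -- same 28x28 spatial size
```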

Best regards

Thomas


There are formulas for this. You can see a worked-out example here:

also here is a reference for arithmetic of convolutions:


Though I agree that this is so standard that PyTorch should just have a layer or an option that calculates all this once for the user. It seems silly and repetitive that we have to think about this every time we want to use a same convolution. It would be nice if the code were already ready and provided for us, especially for standard things like this.
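For reference, here is how the arithmetic works out (a small sketch based on the shape formula in the Conv2d docs; the helper functions are mine, not part of PyTorch):

```python
# Output size per spatial dimension, from the torch.nn.Conv2d docs:
#   out = floor((in + 2*padding - dilation*(kernel - 1) - 1) / stride) + 1
# Setting out == in with stride=1 gives padding = dilation*(kernel - 1)/2,
# which is an integer only when dilation*(kernel - 1) is even.

def conv_out_size(in_size, kernel, stride=1, padding=0, dilation=1):
    return (in_size + 2 * padding - dilation * (kernel - 1) - 1) // stride + 1

def same_padding(kernel, dilation=1):
    # Exact only when dilation*(kernel - 1) is even; assumes stride=1.
    return dilation * (kernel - 1) // 2

print(conv_out_size(28, kernel=3, padding=same_padding(3)))  # 28
print(conv_out_size(28, kernel=5, padding=same_padding(5)))  # 28
```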



I’ve ported weights from several Google AI TensorFlow models (EfficientNet, MixNet, MnasNet, etc.). These weights require this type of padding, so I created a factory of sorts that lets you select between a PyTorch-style symmetric padding that comes close to ‘SAME’, and an exact match of TF ‘SAME’ with asymmetric padding when needed for compatibility.

Note that the TensorFlow-compatible padding has a runtime penalty, since I implemented it in a way that is dynamic based on the input size (as opposed to requiring the input size to be passed in at model creation time). This penalty is not worth paying if you don’t need compatibility with TF weights.

Current version of this lives here with the mixed kernel convolutions from MixNet: https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/conv2d_helpers.py

My impl does work well with several TF models, so I assume it’s reasonably close to correct. There may be some issues/scenarios I haven’t tested well w.r.t. dilations, etc. Any feedback is welcome. There is a lot of discussion on this out there in various forums and issue trackers. There are mistakes in many of those too 🙂
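For anyone curious, the gist of the dynamic approach is roughly this (a simplified sketch of the idea, not the linked implementation; pad_same is my own name):

```python
import math
import torch
import torch.nn.functional as F

def pad_same(x, kernel, stride, dilation=1):
    # Compute TF-style 'SAME' padding dynamically from the input size.
    # TF pads asymmetrically (extra pixel on the bottom/right) when the
    # total padding is odd, which F.pad can express but Conv2d's
    # symmetric padding argument cannot.
    ih, iw = x.shape[-2:]
    pad_h = max((math.ceil(ih / stride) - 1) * stride + (kernel - 1) * dilation + 1 - ih, 0)
    pad_w = max((math.ceil(iw / stride) - 1) * stride + (kernel - 1) * dilation + 1 - iw, 0)
    return F.pad(x, [pad_w // 2, pad_w - pad_w // 2, pad_h // 2, pad_h - pad_h // 2])

# Pad first, then convolve with padding=0.
conv = torch.nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=0, bias=False)
x = torch.randn(1, 16, 224, 224)
out = conv(pad_same(x, kernel=3, stride=2))
print(out.shape)  # torch.Size([1, 32, 112, 112]) -- ceil(224 / 2)
```

Computing the pads in the forward pass like this is what incurs the runtime cost mentioned above.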


Hello. Your link no longer works.