How to pad one side in PyTorch

By default PyTorch only supports padding on both sides, but, for example, I have a feature map of shape 1x512x37x56 (NCHW) and I want to pad on one side only to get 1x512x38x57. How can I do it?

1 Like

EDIT
Hmm, it seems a lot of padding classes have been added since I last looked, so this is not needed anymore.
This page has all the padding you might want.
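
For the shape from the original question, one of those built-in classes can be used directly. A minimal sketch (the tensor name is illustrative):

import torch
import torch.nn as nn

# nn.ZeroPad2d takes (left, right, top, bottom), so padding only the
# right and bottom adds one column and one row on a single side each.
pad = nn.ZeroPad2d((0, 1, 0, 1))

x = torch.randn(1, 512, 37, 56)   # NCHW feature from the question
print(pad(x).shape)               # torch.Size([1, 512, 38, 57])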


Padding, whilst copying the values of the tensor, is doable with the functional interface of PyTorch.
You can read more about the different padding modes here.

import torch.nn.functional as F

# Pad last 2 dimensions of tensor with (0, 1) -> Adds extra column/row to the right and bottom, whilst copying the values of the current last column/row
padded_tensor = F.pad(input_tensor, (0,1,0,1), mode='replicate')
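
Applied to the shapes from the original question, a quick check (the tensor name is illustrative):

import torch
import torch.nn.functional as F

x = torch.randn(1, 512, 37, 56)               # NCHW
y = F.pad(x, (0, 1, 0, 1), mode='replicate')  # pad right and bottom only
print(y.shape)                                # torch.Size([1, 512, 38, 57])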

You can wrap this functional interface in a module:

import torch
import torch.nn.functional as F

class CustomPad(torch.nn.Module):
    def __init__(self, padding):
        super().__init__()
        self.padding = padding

    def forward(self, x):
        return F.pad(x, self.padding, mode='replicate')
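
Usage would then look like this (the tensor name is illustrative):

pad = CustomPad((0, 1, 0, 1))
x = torch.randn(1, 512, 37, 56)
print(pad(x).shape)   # torch.Size([1, 512, 38, 57])
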
3 Likes

Thank you! I also found that nn.ConstantPad2d can do the job.
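
For the one-sided case from the question, that would look roughly like this (a sketch; the fill value 0 is just an example):

import torch
import torch.nn as nn

# (left, right, top, bottom), filled with the constant value 0
pad = nn.ConstantPad2d((0, 1, 0, 1), 0)

x = torch.randn(1, 512, 37, 56)
print(pad(x).shape)   # torch.Size([1, 512, 38, 57])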

4 Likes

Here is an example of how I have used padding! Please check the forward() method.

class Up(nn.Module):
    def __init__(self, in_ch, out_ch, bilinear=True):
        super(Up, self).__init__()
        # bilinear selects the upsampling algorithm used by torch.nn.Upsample.
        # Check more here: https://s0pytorch0org.icopy.site/docs/0.4.0/_modules/torch/nn/modules/upsampling.html
        if bilinear:
            # align_corners (bool, optional): if True, the corner pixels of the input
            # and output tensors are aligned, thus preserving the values at
            # those pixels. This only has an effect when `mode` is `linear`,
            # `bilinear`, or `trilinear`. Default: False
            self.up = nn.Upsample(scale_factor=2, mode='bilinear', align_corners=True)

            # Bilinear vs ConvTranspose2d:
            # ConvTranspose2d is a convolution and has trainable kernels,
            # while Upsample is a simple interpolation (bilinear, nearest, etc.)
        else:
            self.up = nn.ConvTranspose2d(in_ch // 2, in_ch // 2, 2, stride=2)
        # Note: self.conv (used in forward) is assumed to be defined here as well,
        # e.g. a double-convolution block as in the usual U-Net implementation.

    def forward(self, x1, x2=None):
        x1 = self.up(x1)

        # x1 and x2 tensor shape = [batch, channel, H, W]
        if x2 is not None:
            diffY = x2.size()[2] - x1.size()[2]
            diffX = x2.size()[3] - x1.size()[3]

            # add padding -- PADDING ADDED HERE
            # F.pad takes (left, right, top, bottom) for the last two dims,
            # so x1 is padded to match x2's spatial size before concatenation.
            x1 = F.pad(x1, (diffX // 2, diffX - diffX // 2,
                            diffY // 2, diffY - diffY // 2))

            x = torch.cat([x2, x1], dim=1)
        else:
            x = x1
        x = self.conv(x)
        return x
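
The padding step in isolation behaves like this (shapes are illustrative):

import torch
import torch.nn.functional as F

x1 = torch.randn(1, 512, 37, 56)
x2 = torch.randn(1, 512, 38, 57)

diffY = x2.size(2) - x1.size(2)   # height difference = 1
diffX = x2.size(3) - x1.size(3)   # width difference = 1

x1 = F.pad(x1, (diffX // 2, diffX - diffX // 2,
                diffY // 2, diffY - diffY // 2))
print(x1.shape)   # torch.Size([1, 512, 38, 57]) -- now matches x2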

1 Like