Backward (forward) difference on a tensor

Is there an efficient way to compute forward (f_{i+1,j} - f_{i,j}) or backward (f_{i,j} - f_{i-1,j}) differences on a tensor, for example x = torch.rand(Bs, Nc, Nx, Ny), where Bs is the batch size, Nc the number of channels, and Nx, Ny the dimensions in the x and y directions?

You can either express these as convolutions with the stencils as weights (and the number of groups equal to the number of channels) or do the slicing yourself, e.g. dx = x[:, :, 1:] - x[:, :, :-1]. This gives you the differences in general; to make them forward or backward, you would (zero-?) pad at the beginning or the end.
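For the convolution variant, a minimal sketch (a [-1, 1] forward-difference stencil applied per channel via grouped convolution; the tensor sizes here are just illustrative):

import torch
import torch.nn.functional as F

x = torch.rand(2, 3, 4, 5)  # (Bs, Nc, Nx, Ny)
Nc = x.shape[1]

# One copy of the stencil per channel, so groups=Nc applies it channel-wise.
w = x.new_tensor([-1.0, 1.0]).view(1, 1, 1, 2).repeat(Nc, 1, 1, 1)

# Zero-pad one column on the right so the output keeps the input size;
# each output entry is x[..., j+1] - x[..., j].
dx = F.conv2d(F.pad(x, (0, 1, 0, 0)), w, groups=Nc)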

Best regards

Thomas

Thanks Thomas,
Here is the code, which hopefully does the job.

# Compute forward differences in the x and y directions, using zero-padding
# at the trailing boundary.
import torch
import torch.nn.functional as F

x = torch.rand(2, 3, 2, 2)
p2d1 = (0, 0, 0, 1)  # zero pad in the y direction (one row at the bottom)
p2d2 = (0, 1, 0, 0)  # zero pad in the x direction (one column on the right)

diff1 = x[:, :, 1:].contiguous()           # drop the first row
diff1 = F.pad(diff1, p2d1, 'constant', 0)  # append a row of zeros at the bottom
dy = diff1 - x                             # forward difference in the y direction

diff2 = x[:, :, :, 1:].contiguous()        # drop the first column
diff2 = F.pad(diff2, p2d2, 'constant', 0)  # append a column of zeros on the right
dx = diff2 - x                             # forward difference in the x direction
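
The backward differences follow the same pattern with the zero pad at the leading edge instead; a minimal sketch reusing x and F from above:

# Backward differences: shift by padding a zero row (top) / column (left),
# so each entry is f_{i,j} - f_{i-1,j} (resp. f_{i,j} - f_{i,j-1}).
dy_b = x - F.pad(x[:, :, :-1], (0, 0, 1, 0), 'constant', 0)
dx_b = x - F.pad(x[:, :, :, :-1], (1, 0, 0, 0), 'constant', 0)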