Best way to implement dilation (without convolution)?

I am trying to make a function/module that takes an array of some shape and surrounds every pixel with a given amount of zero padding:

a = torch.arange(1, 5).expand(4, -1)
print(a)

b = dilate(a)
print(b)

Output:

tensor([[1, 2, 3, 4],
        [1, 2, 3, 4],
        [1, 2, 3, 4],
        [1, 2, 3, 4]])

tensor([[0, 0, 0, 0, 0, 0, 0, 0, 0],
        [0, 1, 0, 2, 0, 3, 0, 4, 0],
        [0, 0, 0, 0, 0, 0, 0, 0, 0],
        [0, 1, 0, 2, 0, 3, 0, 4, 0],
        [0, 0, 0, 0, 0, 0, 0, 0, 0],
        [0, 1, 0, 2, 0, 3, 0, 4, 0],
        [0, 0, 0, 0, 0, 0, 0, 0, 0],
        [0, 1, 0, 2, 0, 3, 0, 4, 0],
        [0, 0, 0, 0, 0, 0, 0, 0, 0])

I don’t believe this functionality exists in PyTorch. How would I best implement this without “for” loops?

Hi, I don’t know whether there is a better way to do this, but you could use a transposed convolution (torch.nn.functional.conv_transpose2d) with a 1x1 kernel containing a single 1. Via the stride you can then choose how many zeros you want in between your pixels. For the padding at the edges, you can use the pad function. So you could do something like this:

import torch
import matplotlib.pyplot as plt


def pad_zeros_in_between(input_, num_zeros_in_between=1):
    # 1x1 kernel containing a single 1; with stride > 1, the transposed
    # convolution inserts (stride - 1) zeros between neighbouring pixels.
    weight = torch.ones((1, 1, 1, 1))
    out = torch.nn.functional.conv_transpose2d(input_, weight, stride=num_zeros_in_between + 1)
    # Add the zero border around the edges.
    out = torch.nn.functional.pad(out, [num_zeros_in_between] * 4)

    return out


# Build a 4x4 example and add batch and channel dimensions,
# since conv_transpose2d expects NCHW input.
input_ = torch.linspace(start=1, end=4, steps=4)
input_ = torch.stack([input_ for i in range(4)])
input_ = torch.unsqueeze(torch.unsqueeze(input_, dim=0), dim=0)

out = pad_zeros_in_between(input_, num_zeros_in_between=1)

print(input_)
print(out)

plt.figure()
plt.imshow(torch.squeeze(input_))

plt.figure()
plt.imshow(torch.squeeze(out))
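If you want to avoid convolution entirely, as the title asks, the same result can be had with strided slice assignment: allocate a zero tensor of the dilated size and write the input into every (n+1)-th row and column, offset by n. A minimal sketch for a 2-D input (the name `dilate` and the `n` parameter are just placeholders here):

```python
import torch

def dilate(x, n=1):
    # Insert n zeros between pixels and pad n zeros around the border,
    # without any convolution call.
    h, w = x.shape
    out = torch.zeros(h * (n + 1) + n, w * (n + 1) + n, dtype=x.dtype)
    # Write x into every (n+1)-th position, starting at offset n.
    out[n::n + 1, n::n + 1] = x
    return out

a = torch.arange(1, 5).expand(4, -1)
print(dilate(a))  # 9x9 tensor matching the output in the question
```

This keeps the input dtype (the conv_transpose approach needs floating-point input) and generalizes to batched input by adding leading `...` to the slice.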

Hehe I guess that is one way to do it!

I am kind of trying to “re-make” the conv_transpose, but using the actual function in my “new” definition for this part won’t hurt :slight_smile:

Well done, thanks!