How to change the dilation of a convolutional layer in training phase

(王智寬) #1

I would like to change the dilation of a conv layer during training. In other words, I'd like to share weights between two conv layers that use different dilation. Take the following pseudocode for example:

init:
    conv1 = 3x3 kernel with dilation (1,1)
    conv1_dilated = conv1 with dilation (2,2)
forward(X1):
    R1 = conv1(X1)
    R2 = conv1_dilated(X1)
    return R1, R2

where conv1 and conv1_dilated share the same kernel size and weight values but use different dilation.
Could anyone tell me how to implement this?

(Ikki Kishida) #2

This is the code:

import torch
from torch.nn import Conv2d
import torch.nn.functional as F

# A standard conv layer: in_channels=3, out_channels=10, kernel_size=3, stride=1, padding=1
conv = Conv2d(3, 10, 3, 1, 1)
x = torch.rand(1, 3, 10, 10)

# R1: the layer applied as usual, with its default dilation (1, 1)
R1 = conv(x)

# R2: the same weight and bias applied through the functional API with dilation (2, 2)
R2 = F.conv2d(x, weight=conv.weight, bias=conv.bias, stride=conv.stride,
              padding=conv.padding, dilation=(2, 2), groups=conv.groups)
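
If you want to package this in the spirit of the original pseudocode, here is a minimal sketch of a module that applies one shared set of parameters twice. The class name SharedDilationConv and the padding choices are my own, not from the thread; with a 3x3 kernel, padding=2 on the dilated pass keeps the same spatial size as the dilation (1, 1) pass with padding=1.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedDilationConv(nn.Module):
    """Applies one set of conv weights twice: with dilation (1, 1) and with (2, 2)."""
    def __init__(self, in_channels, out_channels):
        super().__init__()
        # Single parameter set shared by both passes
        self.conv = nn.Conv2d(in_channels, out_channels, kernel_size=3, stride=1, padding=1)

    def forward(self, x):
        # Normal pass: dilation (1, 1), padding 1 preserves the spatial size
        r1 = self.conv(x)
        # Dilated pass: same weight and bias, dilation (2, 2); padding 2 also preserves size
        r2 = F.conv2d(x, weight=self.conv.weight, bias=self.conv.bias,
                      stride=1, padding=2, dilation=(2, 2), groups=self.conv.groups)
        return r1, r2

m = SharedDilationConv(3, 10)
x = torch.rand(1, 3, 10, 10)
r1, r2 = m(x)
print(r1.shape, r2.shape)  # both torch.Size([1, 10, 10, 10])

Since both passes read from the same weight tensor, gradients from R1 and R2 both accumulate on that single parameter set during backprop, which is exactly the weight sharing asked about.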