I would like to change the dilation during training. In other words, I'd like to share weights between two conv layers that use different dilations. Take the following pseudocode for example:

```
init:
    conv1 = 3x3 kernel with dilation (1, 1)
    conv1_dilat = conv1 with dilation (2, 2)
forward(X1):
    R1 = conv1(X1)
    R2 = conv1_dilat(X1)
    return R1, R2
```

where conv1 and conv1_dilat have the same kernel size and weight values but different dilations.
Could anyone tell me how to implement this?

nutszebra
(Ikki Kishida)
May 14, 2019, 10:36am
#2
This is the code:

```
import torch
from torch.nn import Conv2d
import torch.nn.functional as F

# in_channels=3, out_channels=10, kernel_size=3, stride=1, padding=1
conv = Conv2d(3, 10, 3, 1, 1)
x = torch.rand(1, 3, 10, 10)

R1 = conv(x)
# Reuse the same weight and bias, but with a different dilation.
# Note: with the padding unchanged, R2 is spatially smaller than R1
# (8x8 vs. 10x10 here).
R2 = F.conv2d(x, weight=conv.weight, bias=conv.bias, stride=conv.stride,
              padding=conv.padding, dilation=(2, 2), groups=conv.groups)
```
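To use this pattern inside a model, the functional call can be wrapped in a module. A minimal sketch (the class name and the channel sizes here are illustrative; `padding=2` is chosen for the dilated branch so both outputs keep the same spatial size):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SharedDilationConv(nn.Module):
    """One Conv2d whose weights are reused with a second dilation."""
    def __init__(self):
        super().__init__()
        self.conv1 = nn.Conv2d(3, 10, 3, stride=1, padding=1)

    def forward(self, x):
        r1 = self.conv1(x)
        # Same weight tensor, different dilation: gradients from both
        # branches accumulate in conv1's parameters.
        r2 = F.conv2d(x, self.conv1.weight, self.conv1.bias,
                      stride=1, padding=2, dilation=2)
        return r1, r2

model = SharedDilationConv()
x = torch.rand(1, 3, 10, 10)
r1, r2 = model(x)
print(r1.shape, r2.shape)  # both torch.Size([1, 10, 10, 10])
```

Because both branches read from the same parameter tensors, a single optimizer step updates the shared kernel using the gradients from both outputs.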


Hi all!

I have one question. Is there any function to replace the dilation rate of a Conv2d?

You could change it directly by assigning a new value to it:

```
import torch
import torch.nn as nn

# in=1, out=6, kernel=3, stride=1, padding=1, dilation=1
conv = nn.Conv2d(1, 6, 3, 1, 1, 1)
x = torch.randn(1, 1, 24, 24)

out = conv(x)
print(out.shape)  # torch.Size([1, 6, 24, 24])

# dilation is a plain attribute, so it can simply be reassigned
conv.dilation = (2, 2)
out = conv(x)
print(out.shape)  # torch.Size([1, 6, 22, 22])
```
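One caveat: with the padding left at 1, raising the dilation shrinks the output (24 to 22 above). If the spatial size should stay fixed, the padding has to be adjusted along with the dilation. A small sketch, assuming stride 1 and an odd square kernel (the helper name is made up):

```python
import torch
import torch.nn as nn

def set_dilation(conv, d):
    """Illustrative helper: change the dilation of a stride-1,
    odd-kernel Conv2d and re-pad it so the output size is unchanged."""
    k = conv.kernel_size[0]
    conv.dilation = (d, d)
    # effective kernel extent is d*(k-1)+1, so this padding preserves size
    conv.padding = (d * (k - 1) // 2,) * 2

conv = nn.Conv2d(1, 6, 3, 1, 1)
x = torch.randn(1, 1, 24, 24)
print(conv(x).shape)  # torch.Size([1, 6, 24, 24])

set_dilation(conv, 2)
print(conv(x).shape)  # still torch.Size([1, 6, 24, 24])
```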
