The outputs of Dropout are scaled by 1 / (1 - p) during training so that the layer acts as the identity during testing. Is it the same case for Dropout2d?
Yes, that seems to be the case, as seen here:
import torch
import torch.nn as nn

drop = nn.Dropout2d()  # default p=0.5
x = torch.ones(2, 2, 2, 2)
# In training mode, whole channels are zeroed; kept channels are scaled by 1/(1-p) = 2
out = drop(x)
print(out)
> tensor([[[[0., 0.],
            [0., 0.]],
           [[0., 0.],
            [0., 0.]]],
          [[[2., 2.],
            [2., 2.]],
           [[0., 0.],
            [0., 0.]]]])
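As a quick sanity check (a minimal sketch reusing drop and x from above), switching the module to evaluation mode should turn it into the identity, i.e. no channels are zeroed and no 1/(1-p) rescaling is applied:

# In eval mode, Dropout2d is a no-op, so the output equals the input
drop.eval()
out_eval = drop(x)
print(torch.equal(out_eval, x))  # True: identity at test time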
Thanks. So stupid of me not to run some code myself.