This question might be naive, but it has puzzled me for a long while. Suppose we wrap a convolution function in a custom function and pass the parameters in, like this:
import torch
import torch.nn as nn
import torch.nn.functional as F

my_w = nn.Parameter(torch.randn(1, 1, 3, 3))
my_b = nn.Parameter(torch.randn(1))  # bias must be 1-D, one value per output channel

def conv1(x, my_w, my_b):
    x = F.conv2d(x, weight=my_w, bias=my_b, stride=1, padding=1)
    return x
Can my_w and my_b get updated during training? Since they are global parameters, they do not really live inside this function.
If they can, why? If they can't, how do I fix it (without using the 'global' keyword)?
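For what it's worth, here is a minimal, self-contained sketch of how one could check whether gradients actually reach these parameters (the input shape and the stride/padding values are just placeholders I picked for the test):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

my_w = nn.Parameter(torch.randn(1, 1, 3, 3))
my_b = nn.Parameter(torch.randn(1))  # 1-D bias, one value per output channel

def conv1(x, my_w, my_b):
    # Wrapper around the functional conv; the parameters are passed in explicitly
    return F.conv2d(x, weight=my_w, bias=my_b, stride=1, padding=1)

x = torch.randn(2, 1, 5, 5)      # dummy batch
out = conv1(x, my_w, my_b)
out.sum().backward()             # dummy scalar loss

# If autograd tracks the tensors themselves (not the scope they were defined in),
# .grad should now be populated on both parameters.
print(my_w.grad is not None)
print(my_b.grad is not None)
```

If both prints show `True`, the gradients do flow into the globally defined parameters; whether an optimizer then updates them is a separate matter of passing them to the optimizer explicitly.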
Thank you guys!