Hello,

I have declared two nn.Parameter() variables with requires_grad=True, and I pass them to a separate function that is called from the forward method of the class where the variables are declared. lparam and rparam are not getting updated during training.

My question is: am I doing this the right way? If not, how should it be done?

Here is the code example:
class LG(BaseNetwork):
    def __init__(self, opt):
        super().__init__()
        self.opt = opt
        self.lparam = nn.Parameter(torch.zeros(1), requires_grad=True).cuda(device=opt.gpu_ids[0])
        self.rparam = nn.Parameter(torch.zeros(1), requires_grad=True).cuda(device=opt.gpu_ids[0])

    def foo(self, a, b, k=1.0, lparam=0, rparam=0):
        t = bar(a, b, k=k, lparam=lparam, rparam=rparam)
        return t

    def forward(self, a, b):
        x = self.foo(a, b, k=self.opt.k, lparam=self.lparam, rparam=self.rparam)
        return x
BaseNetwork just contains initialization functions and inherits from nn.Module. bar is defined as:
def bar(a, b, k=1.0, lparam=0, rparam=0):
    return n(a) * (b.std() * (k * lparam)) + (b.mean() * (k * rparam))
When I print the named parameters, lparam and rparam do not show up.
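For reference, here is a minimal, CPU-runnable sketch of what I suspect is going on (the class name Demo is just for illustration, and .double() stands in for .cuda() so the snippet runs without a GPU): calling .cuda() — or any other converting op — on an nn.Parameter returns a new plain Tensor, so the attribute assigned to the module is no longer a Parameter and never appears in named_parameters():

```python
import torch
import torch.nn as nn

class Demo(nn.Module):
    def __init__(self):
        super().__init__()
        # Registered: the attribute assigned is the nn.Parameter itself.
        self.good = nn.Parameter(torch.zeros(1))
        # NOT registered: .double() (like .cuda()) returns a new plain
        # Tensor, so the Parameter is lost before the assignment happens.
        self.bad = nn.Parameter(torch.zeros(1)).double()
        # A possible fix: create the data on the target device first and
        # wrap it afterwards, keeping the outermost object an nn.Parameter,
        # e.g. (hypothetical, mirroring the code above):
        # self.lparam = nn.Parameter(torch.zeros(1, device=f"cuda:{opt.gpu_ids[0]}"))

m = Demo()
print([name for name, _ in m.named_parameters()])  # -> ['good']
print(isinstance(m.bad, nn.Parameter))             # -> False
```

(As an aside, requires_grad=True is already the default for nn.Parameter, so it can be omitted.)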