The value I want to optimize is `L` in the following code. When I execute this code in the console, the SGD optimizer accepts it perfectly fine, but when the same code is part of my class, I get the error "can't optimize a non-leaf variable".
L_1 = Parameter(torch.randn(dim, dim), requires_grad=True)
L_1.data = torch.tril(L_1.data)
log_diag = Parameter(torch.diag(L_1.data), requires_grad=True)
log_diag.data = torch.exp(log_diag.data)
mask = Parameter(torch.diag(torch.ones_like(log_diag.data)))
self.L = Parameter((mask.data * torch.diag(log_diag.data) + (1. - mask.data) * L_1.data).cuda(), requires_grad=True)
optim = torch.optim.SGD([{'params': custom_model.L, 'lr': 1e-3}])  # gives error when part of the class; hence the self. keyword
optim = torch.optim.SGD([{'params': L, 'lr': 1e-3}])  # optimizer accepts it fine
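For reference, here is a minimal standalone sketch of the leaf vs. non-leaf distinction that this error message refers to (the tensor names are made up for illustration; a `Parameter` constructed directly is a leaf, while a tensor produced by an operation on a tensor that requires grad is not):

```python
import torch
from torch.nn import Parameter

# A Parameter created directly is a leaf tensor, so the optimizer accepts it.
leaf = Parameter(torch.randn(3, 3))
print(leaf.is_leaf)  # True
torch.optim.SGD([leaf], lr=1e-3)  # accepted

# A tensor that is the *result* of an operation on a grad-requiring tensor
# is non-leaf, and SGD rejects it with a ValueError.
non_leaf = torch.tril(torch.randn(3, 3, requires_grad=True))
print(non_leaf.is_leaf)  # False
try:
    torch.optim.SGD([non_leaf], lr=1e-3)
except ValueError as e:
    print(e)  # can't optimize a non-leaf Tensor
```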
Do you have any ideas?