Create leaf variable from tensor

I have created a custom nn.Module(). In the initializer I want to initialize a leaf variable with a tensor. However, when optimizing, I get the following error: ValueError: can't optimize a non-leaf Variable.
My code looks as follows:

x = torch.zeros(<some format>).cuda()
x[:, :, :, int(np.floor(self.number_of_params/2))] = 1
self.y = Variable(x, requires_grad=True)

I also tried using x.clone(), as well as creating a Variable out of x and using x.data when creating y.
Why is y not a leaf variable?

If this is done inside an nn.Module, you should create y as self.y = nn.Parameter(x), which is basically a Variable that requires grad. The module is then aware that this is a learnable parameter.
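A minimal sketch of the suggested fix (the module name and tensor shape here are made up for illustration, since the original `<some format>` was not given):

```python
import torch
import torch.nn as nn

class Center(nn.Module):
    # Hypothetical module: a learnable tensor initialized with a
    # custom pattern, wrapped in nn.Parameter as suggested above.
    def __init__(self, number_of_params=5):
        super().__init__()
        x = torch.zeros(1, 1, 1, number_of_params)
        x[:, :, :, number_of_params // 2] = 1
        # nn.Parameter registers x as a learnable leaf tensor of the module
        self.y = nn.Parameter(x)

m = Center()
# self.y now shows up in m.parameters(), so an optimizer will see it
print([p.shape for p in m.parameters()])
```
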


Thanks! This works. I see that your way is better, but why does the other way cause problems? Would it also cause problems if it were not inside an nn.Module()?

This won’t work outside of an nn.Module, but if you do not use nn.Parameter inside one, the tensor will not appear when you call .parameters(), for example. This may have caused your issue.
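To illustrate the difference, here is a small sketch (the module names are made up) contrasting a registered nn.Parameter with a plain tensor attribute:

```python
import torch
import torch.nn as nn

class WithParam(nn.Module):
    def __init__(self):
        super().__init__()
        # nn.Parameter attributes are registered with the module
        self.y = nn.Parameter(torch.zeros(3))

class WithPlainTensor(nn.Module):
    def __init__(self):
        super().__init__()
        # a plain tensor attribute requires grad but is NOT registered
        self.y = torch.zeros(3, requires_grad=True)

print(len(list(WithParam().parameters())))        # 1
print(len(list(WithPlainTensor().parameters())))  # 0
```

So an optimizer constructed from .parameters() would silently skip the plain tensor.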

The Module is just a submodule of my whole network. I joined the parameters with self.y using itertools.chain when passing the parameters to the optimizer. This is ugly, but it was just to test an idea quickly. So I am still confused why my original approach did not work.
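For reference, the "join the parameters" workaround described above can be sketched like this, with itertools.chain combining the module's parameters and a manually created leaf tensor (the network and shapes here are placeholders, not the poster's actual model):

```python
import itertools
import torch
import torch.nn as nn

# hypothetical submodule plus an extra leaf tensor created by hand
net = nn.Linear(4, 2)
extra = torch.zeros(4, requires_grad=True)  # a leaf: created directly, not computed

# itertools.chain merges both parameter sources into one iterable
opt = torch.optim.SGD(itertools.chain(net.parameters(), [extra]), lr=0.1)

loss = net(torch.ones(1, 4)).sum() + extra.sum()
loss.backward()
opt.step()  # both net's parameters and extra are updated
```

This works, but nn.Parameter avoids the manual bookkeeping entirely.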