Let's say I have a PyTorch variable as follows:
p = Variable(torch.randn(800, 3), requires_grad=True)
Let's also say that I have a loss function L, but I would like to backpropagate L with respect to a slice of p (call it q; the exact indexing doesn't matter here) rather than with respect to p itself.
However, when I go ahead and execute the following statements,
p.is_leaf and q.is_leaf, the former is True and the latter is False. But now, since q is not a leaf, how do I backpropagate and update q? Will q = q.detach() work?
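Concretely, the situation looks like this (a minimal sketch; the slice q = p[:100] is just a stand-in for my actual indexing):

```python
import torch
from torch.autograd import Variable  # old-style API, as in the snippet above

p = Variable(torch.randn(800, 3), requires_grad=True)
q = p[:100]             # stand-in for my actual slicing

print(p.is_leaf)        # True  -- p was created directly by the user
print(q.is_leaf)        # False -- q is the result of an indexing operation
print(q.requires_grad)  # True  -- q is still attached to the autograd graph
```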
Just a side note: you don't need Variables anymore, you can just do
p = torch.randn(800, 3, requires_grad=True).
The simplest thing you can do here is just backprop onto p and get the gradient as p.grad. Will that work for you?
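For example, something like this (a minimal sketch; the loss and the slice are placeholders):

```python
import torch

p = torch.randn(800, 3, requires_grad=True)

q = p[:100]                # placeholder slice used in the forward pass
L = (q ** 2).sum()         # placeholder loss

L.backward()

# p.grad has the full (800, 3) shape; rows that never entered the
# forward pass simply receive zero gradient.
print(p.grad.shape)              # torch.Size([800, 3])
print(p.grad[100:].abs().sum())  # tensor(0.)
```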
Thanks for the reply. I don't think that would work for me, as the constraints are such that the forward propagation involves only the slice q rather than the entire p matrix, and hence p is not directly involved in the forward pass. Do you think there is a way out?
I would need the updated value of q rather than the gradient of L w.r.t. p.
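For concreteness, this is the kind of thing I am after (a minimal sketch of what "updating" would mean here; the loss, slice, and learning rate are all placeholders):

```python
import torch

p = torch.randn(800, 3, requires_grad=True)
opt = torch.optim.SGD([p], lr=0.1)   # placeholder learning rate

q = p[:100]                          # only the slice enters the forward pass
L = (q ** 2).sum()                   # placeholder loss

opt.zero_grad()
L.backward()                         # fills p.grad (zero outside the slice)
opt.step()                           # updates p in place; with plain SGD the
                                     # rows outside the slice keep their
                                     # values, since their gradient is zero

updated_q = p[:100]                  # the updated values I actually need
```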