Quick question on leaf tensors

Let's say I have a PyTorch Variable as follows:

p = Variable(torch.randn(800, 3), requires_grad=True)

Let's also say that I have a loss function L, but I would like to backpropagate L with respect to p[0] rather than with respect to the whole of p.

However, when I go ahead and check the following:

p.is_leaf and p[0].is_leaf

the former is True and the latter is False. But now, since p[0] is not a leaf, how do I backpropagate and update p[0]? Will p[0] = p[0].detach() work?
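
For reference, here is a minimal repro of the check above, written with a plain tensor instead of a Variable (the leaf flags come out the same either way):

import torch

# p is created directly by the user, so it is a leaf.
p = torch.randn(800, 3, requires_grad=True)

print(p.is_leaf)     # True
print(p[0].is_leaf)  # False: p[0] is produced by indexing p inside the autograd graph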

Hi,

Just a side note: you don't need Variables anymore; you can just do p = torch.randn(800, 3, requires_grad=True).

The simplest thing you can do here is to just backprop on p and read the gradient as p.grad[0]. Would that work for you?
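
Roughly something like this, with a made-up loss standing in for your L, just to illustrate:

import torch

p = torch.randn(800, 3, requires_grad=True)

# Placeholder loss; substitute your actual loss computation here.
loss = (p ** 2).sum()
loss.backward()

# Gradient of the loss with respect to the first row of p:
print(p.grad[0])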

Hi Alban,

Thanks for the reply. I don't think that would work for me: the constraints are such that the forward propagation involves only p[0] rather than the entire p matrix, so p itself does not appear directly in the forward pass. Do you think there is a way out?

I would need the updated value of p[0], rather than just the gradient of L with respect to p[0].
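
To make the constraint concrete, the setup looks roughly like this (with a placeholder loss standing in for the real L):

import torch

p = torch.randn(800, 3, requires_grad=True)

# Only the first row of p enters the forward pass.
x = p[0]                # non-leaf: the result of indexing the leaf p
loss = (x ** 2).sum()   # placeholder standing in for the actual loss L
loss.backward()

# The gradient still accumulates on the leaf p (only row 0 is non-zero here),
# but what I need is an updated value of p[0] itself, not just this gradient.
print(p.grad[0])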