torch.autograd.grad and tensor slicing - PyTorch 0.4.1

Here is some code in PyTorch 0.4.1:

import torch
from torch.autograd import grad

x = torch.ones(10, 2, dtype=torch.float, requires_grad=True)
w = torch.rand(10, 2, dtype=torch.float, requires_grad=True)
y = x * w
y0 = y[:, 0]
x0 = x[:, 0]
gy0 = torch.ones(10, dtype=torch.float)

# test1:
# this returns (None,)
# g0 = grad(y0, x0, gy0, allow_unused=True)

# test2:
# this works, g0 is not None
g0 = grad(y0, x, gy0)
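
As a side note (my addition, a sketch of the expected values): the gradient test2 returns is with respect to the whole x, and only column 0 is non-zero, since y0 = x[:, 0] * w[:, 0] does not depend on x[:, 1].

# Inspecting test2's result: column 0 of the gradient is w[:, 0],
# column 1 is all zeros.
print(torch.allclose(g0[0][:, 0], w[:, 0]))  # True
print(g0[0][:, 1].abs().sum().item())        # 0.0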

If I remove allow_unused=True in test1, this error shows up:

RuntimeError: One of the differentiated Tensors appears to not have been used in the graph. Set allow_unused=True if this is the desired behavior

OK… so it is not a bug…

Here is the graph:

x0 <- x -> x*w -> y -> y0
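
One way to see this in code (a minimal sketch, my addition): inspect the grad_fn attributes. x0 is produced from x by a select/indexing operation, so it is a separate branch hanging off x and never lies on the path from x to y0.

# x is a leaf tensor, so it has no grad_fn; x0 and y0 are both the
# results of operations on other tensors, so they carry backward nodes.
print(x.grad_fn)   # None - x is a leaf
print(x0.grad_fn)  # e.g. <SelectBackward ...> - x0 is computed FROM x
print(y0.grad_fn)  # e.g. <SelectBackward ...> - y0 comes from y = x * w
# autograd can walk from y0 back to x, but x0 is not on that path.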

Yes, exactly!
There is no link between x0 and y0 in that case.
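
If a gradient with respect to the slice is what you want, one workaround (a sketch, my addition, assuming the same shapes as above) is to slice before computing, so that x0 actually participates in the graph:

import torch
from torch.autograd import grad

x = torch.ones(10, 2, dtype=torch.float, requires_grad=True)
w = torch.rand(10, 2, dtype=torch.float, requires_grad=True)

x0 = x[:, 0]        # slice first...
y0 = x0 * w[:, 0]   # ...then build y0 from x0
gy0 = torch.ones(10, dtype=torch.float)

# x0 is now part of y0's graph, so grad works without allow_unused
g0 = grad(y0, x0, gy0)
print(g0[0])  # equals w[:, 0]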
