Please help me with this function's backward() calculation

Hello, I have a relatively complex function on which I want to call backward():

import torch
from torch.autograd import Variable

a = torch.zeros([3, 3])
a[0, 0] = 1.0
a[1, 2] = 1.0
a[2, 1] = 1.0
b1 = torch.tensor([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0], [3.0, 4.0, 5.0]])
b2 = torch.tensor([[3.0, 2.0, 1.0], [4.0, 3.0, 2.0], [5.0, 4.0, 3.0]])
c1 = torch.tensor([[1.0], [2.0], [3.0]])
c2 = torch.tensor([[4.0], [5.0], [6.0]])

c1 = Variable(c1, requires_grad=True)
c2 = Variable(c2, requires_grad=True)

b = c1.mul(b1) + c2.mul(b2)
b = Variable(b, requires_grad=True)
c = torch.div(a, b) - 1
c = Variable(c, requires_grad=True)
ref = torch.tensor([[1.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
c.backward(ref, retain_graph=True)
d = b.grad

I want to calculate the gradients of b and c1, but I always get d = None. Please help me with this problem.

Thank you

That is because the "replace" (item assignment) operations

a = torch.zeros([3, 3])
a[0, 0] = 1.0
a[1, 2] = 1.0
a[2, 1] = 1.0

are not differentiable.
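As a side note, here is a minimal sketch of how autograd reacts to such in-place writes. In the snippet above, a itself does not require grad, so the assignments are accepted; the error below only appears once the leaf tensor requires grad:

import torch

a = torch.zeros(3, 3, requires_grad=True)
# In-place assignment on a leaf tensor that requires grad is rejected:
# RuntimeError: a leaf Variable that requires grad is being used in an in-place operation.
a[0, 0] = 1.0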

Hi, thank you for your reply.

I have changed the code to remove

c = Variable(c, requires_grad=True)

Now I get the right b.grad, but c1.grad is still None.

Thank you, the problem is solved.

Only leaf Tensors will have their grad populated during a call to backward(). To get grad populated for non-leaf Tensors, you can use retain_grad().
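For reference, here is a minimal corrected sketch of the original snippet. Both Variable re-wraps are dropped (re-wrapping creates new leaves detached from the graph, which is why the gradients went missing), and retain_grad() is called on the non-leaf b so its grad is kept:

import torch

a = torch.zeros([3, 3])
a[0, 0] = 1.0
a[1, 2] = 1.0
a[2, 1] = 1.0
b1 = torch.tensor([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0], [3.0, 4.0, 5.0]])
b2 = torch.tensor([[3.0, 2.0, 1.0], [4.0, 3.0, 2.0], [5.0, 4.0, 3.0]])
c1 = torch.tensor([[1.0], [2.0], [3.0]], requires_grad=True)  # leaf tensor
c2 = torch.tensor([[4.0], [5.0], [6.0]], requires_grad=True)  # leaf tensor

b = c1.mul(b1) + c2.mul(b2)  # non-leaf: produced by an operation
b.retain_grad()              # keep b.grad after backward()
c = torch.div(a, b) - 1      # no re-wrapping, so the graph stays intact

ref = torch.tensor([[1.0, 0.0, 0.0], [1.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
c.backward(ref)

print(b.grad)   # populated thanks to retain_grad()
print(c1.grad)  # populated because c1 is a leaf with requires_grad=True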