Using grad to calculate derivatives of two outputs w.r.t. the same input

Hi,

I’m trying to use autograd to get the derivatives of two output values (y_a and y_b) w.r.t. the same input vector (x). I implemented this in two ways, which give me different results, and I don’t understand why. Only the second method gives me the correct results.

Method 1: applying x.requires_grad_(True) only once

x.requires_grad_(True)
y_a = … # Calculation of y_a
y_a.backward(retain_graph=True)
dy_a_dx = x.grad
y_b = … # Calculation of y_b
y_b.backward(retain_graph=True)
dy_b_dx = x.grad

Method 2: applying x.requires_grad_(True) twice

x.requires_grad_(True)
y_a = … # Calculation of y_a
y_a.backward()
dy_a_dx = x.grad
x.requires_grad_(True)
y_b = … # Calculation of y_b
y_b.backward()
dy_b_dx = x.grad

Thanks for the help!

Make sure to zero the gradients (if x.grad is not None: x.grad.zero_()) before calculating the new y. backward() accumulates gradients into x.grad, so in Method 1 your second read of x.grad actually holds dy_a_dx + dy_b_dx. (Also note that if x.grad: raises an error for a multi-element tensor, which is why you should check against None instead.)
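Here is a minimal sketch of the fixed pattern; the formulas for y_a and y_b below are just placeholders for your actual calculations:

import torch

x = torch.randn(3, requires_grad=True)

y_a = (x ** 2).sum()  # placeholder for the real calculation of y_a
y_a.backward(retain_graph=True)  # retain_graph only needed if y_b reuses y_a's graph
dy_a_dx = x.grad.clone()  # clone, because x.grad is modified in place below

if x.grad is not None:  # zero the accumulated gradient before the next backward
    x.grad.zero_()

y_b = (x ** 3).sum()  # placeholder for the real calculation of y_b
y_b.backward()
dy_b_dx = x.grad.clone()

Alternatively, torch.autograd.grad(y_a, x, retain_graph=True)[0] returns the gradient directly without writing into x.grad at all, which sidesteps the accumulation issue.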

Some more info: