How to get gradients that still have requires_grad=True

Thanks for your reply.

I actually want the gradient of wp only with respect to phi, so this worked for me:

import torch as T
from torch.nn import Parameter

w = Parameter(T.tensor([2.2]))
phi = Parameter(T.tensor([1.5]))
wp = w * phi

# d(wp)/d(phi) = w; create_graph=True keeps the result attached to the graph
grd = T.autograd.grad(wp, phi, create_graph=True)[0]
print(grd)

# differentiate grd (= w) with respect to w, so w.grad becomes 1
grd.backward()
print(w.grad)

output:

tensor([2.2000], grad_fn=<MulBackward0>)
tensor([1.])
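
As a quick sanity check on the thread title, the gradient returned by autograd.grad with create_graph=True is itself part of the graph, so it can be differentiated again; a minimal, self-contained sketch:

import torch as T
from torch.nn import Parameter

w = Parameter(T.tensor([2.2]))
phi = Parameter(T.tensor([1.5]))

# with create_graph=True the returned gradient "still has requires_grad True"
grd = T.autograd.grad(w * phi, phi, create_graph=True)[0]
print(grd.requires_grad)   # True
print(grd.grad_fn)         # a MulBackward0 node, so grd can be differentiated again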

Using a modified version of the last method:

import torch as T
from torch.nn import Parameter

w = Parameter(T.tensor([2.2]))
phi = Parameter(T.tensor([1.5]))
wp = w * phi

# create_graph=True fills w.grad and phi.grad and keeps them attached to the graph
wp.backward(create_graph=True)
grd = phi.grad
print(grd)

# backward again; the new gradient is accumulated into the existing w.grad
grd.backward()
print(w.grad)

output:

tensor([2.2000], grad_fn=<CopyBackwards>)
tensor([2.5000], grad_fn=<CopyBackwards>)

I don't know what is going on with the last method.
Also, I found a quote that advises against using .grad in such cases.
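
If it is gradient accumulation (wp.backward() already puts 1.5 into w.grad, and grd.backward() adds another 1.0 on top, giving 2.5), then clearing w.grad in between should reproduce the autograd.grad result. A minimal sketch of that, assuming this reading is right:

import torch as T
from torch.nn import Parameter

w = Parameter(T.tensor([2.2]))
phi = Parameter(T.tensor([1.5]))
wp = w * phi

wp.backward(create_graph=True)   # w.grad = 1.5, phi.grad = 2.2 (both still in the graph)
grd = phi.grad

w.grad = None                    # drop the first-order gradient before reusing w.grad
grd.backward()                   # d(grd)/d(w) = 1
print(w.grad)                    # tensor([1.]) -- same as the autograd.grad version

If that reading is right, the first method (torch.autograd.grad) simply avoids the issue because it never reads the intermediate gradient out of .grad.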