Chainer.grad() how to implement this function in Pytorch

I am migrating code from the Chainer framework to PyTorch and came across the following line:

chainer.grad([loss_func(F.clipped_relu(X2,z=1.0),Yp)], [X2], set_grad=True, retain_grad=True)
I tried looking through the PyTorch documentation but was not able to find an equivalent function for the above code.

https://docs.chainer.org/en/stable/reference/generated/chainer.grad.html
The link above explains what this function does in the Chainer framework. I would appreciate it if someone could guide me on how to migrate this Chainer code to PyTorch.

I think torch.autograd.grad would be the corresponding function.
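A minimal sketch of a possible translation, assuming `X2` and `Yp` are tensors and using a mean-squared-error stand-in for `loss_func` (the actual shapes and loss in your code will differ). Chainer's `F.clipped_relu(x, z=1.0)` clamps values into `[0, 1]`, which `torch.clamp` can reproduce; `torch.autograd.grad` returns the gradients directly, and assigning to `.grad` mimics Chainer's `set_grad=True`:

```python
import torch
import torch.nn.functional as F

# Hypothetical tensors standing in for the originals
X2 = torch.randn(4, 3, requires_grad=True)
Yp = torch.randn(4, 3)

# Chainer's F.clipped_relu(X2, z=1.0) == clamp into [0, 1]
out = X2.clamp(min=0.0, max=1.0)
loss = F.mse_loss(out, Yp)  # stand-in for loss_func

# torch.autograd.grad returns the gradients as a tuple instead of
# accumulating them into .grad; retain_graph keeps the graph alive,
# loosely corresponding to Chainer's retain_grad=True
(grad_X2,) = torch.autograd.grad(loss, X2, retain_graph=True)

# set_grad=True in Chainer stores the gradient on the variable; to mimic:
X2.grad = grad_X2
```

Note that `torch.autograd.grad` only returns gradients for the inputs you pass; if you instead want gradients accumulated on all leaf tensors, `loss.backward()` is the more idiomatic route.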
