Understand grad_fn


I am trying to understand grad_fn with a simple example, but I could not make sense of it. Can anyone explain the reason for the output? Thanks.

x = torch.tensor(2., requires_grad=True)
a = torch.tensor([1.5], requires_grad=True)
y = x**5

tensor(160., grad_fn=<MulBackward0>)
tensor([120.], grad_fn=<MulBackward0>)


.grad_fn is an internal part of autograd; you should not use it directly.
You can look at it (but not call it!) to check which low-level Function created a given Tensor, but that’s it. In this case, you can print(y.grad_fn) to check that y was created by a pow operation.
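As a minimal sketch of what is meant by "look at it, not call it" (assuming the same x and y as in the question above, and a standard PyTorch install):

```python
import torch

# Same setup as in the question: y is produced from x by a pow operation.
x = torch.tensor(2., requires_grad=True)
y = x**5

# y's grad_fn records the operation that created it: a PowBackward0 node.
print(y)                          # tensor(32., grad_fn=<PowBackward0>)
print(type(y.grad_fn).__name__)   # PowBackward0

# Leaf tensors created directly by the user have no grad_fn.
print(x.grad_fn)                  # None
```

So grad_fn only tells you *which* operation produced a tensor; it is not itself the gradient.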

Hi Alban,

Thanks for the comments. I thought it might be something useful, maybe the gradient of the function y = x**5. But I cannot make sense of the output! If calling the function is not useful, I will not worry too much about it. Thanks.
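For what it's worth, the gradient the question seems to be after does not come from grad_fn at all, but from calling .backward() and reading .grad. A minimal sketch, assuming the same y = x**5:

```python
import torch

x = torch.tensor(2., requires_grad=True)
y = x**5        # y = 32, recorded by autograd as a pow operation

# backward() traverses the graph of grad_fn nodes and fills in x.grad
# with dy/dx = 5 * x**4 = 5 * 16 = 80.
y.backward()
print(x.grad)   # tensor(80.)
```

grad_fn is just the bookkeeping autograd uses to make this traversal possible; the actual gradient values land in .grad after backward().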