Variables vs. Tensors

This topic has taken me a lot of time, and I still don't understand the difference between PyTorch Variables and Tensors. All I know is that they are almost the same, the difference being that Tensors don't have the concept of "gradients" whereas Variables do.

But first, I don't understand what the term "gradient" means.

Could you explain it to me, please?

Hi, AFAIK, `Variable` is deprecated, and `Tensor` now has gradients. Check https://pytorch.org/tutorials/beginner/blitz/autograd_tutorial.html#sphx-glr-beginner-blitz-autograd-tutorial-py or the other official documents.
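A minimal sketch of what this merge means in practice, assuming PyTorch >= 0.4 (where `Variable` was folded into `Tensor`): a plain tensor created with `requires_grad=True` is tracked by autograd directly, with no `Variable` wrapper needed.

```python
import torch

# requires_grad=True tells autograd to record operations on this tensor
x = torch.ones(2, 2, requires_grad=True)

# y depends on x, so autograd builds a computation graph
y = (x * 3).sum()

# backward() computes d(y)/d(x) and stores it in x.grad
y.backward()
print(x.grad)  # each element is 3.0, since d(3*x)/dx = 3
```

In older code you would see `from torch.autograd import Variable` and `Variable(torch.ones(2, 2))`; that wrapper still exists for backward compatibility but just returns a tensor.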


Thanks for replying.

But could you explain what the term "gradient" means, please?

If you want to train a neural network, you have to perform backpropagation. Backprop relies on updating each parameter in the direction of the derivative of the loss value w.r.t. that parameter.

So the "gradient" here means exactly this derivative.
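To make the idea concrete, here is a hedged toy sketch (not from the thread): a single parameter `w` is repeatedly nudged against its gradient to minimize a squared-error loss. The target value and learning rate are made up for illustration.

```python
import torch

# Toy problem: find w such that w * 2 ≈ 10, i.e. minimize (w*2 - 10)**2
w = torch.tensor(1.0, requires_grad=True)
lr = 0.1  # learning rate (arbitrary choice for this example)

for _ in range(50):
    loss = (w * 2 - 10) ** 2
    loss.backward()            # d(loss)/d(w) is accumulated into w.grad
    with torch.no_grad():
        w -= lr * w.grad       # step in the direction that decreases the loss
        w.grad.zero_()         # clear the gradient before the next iteration

print(w.item())  # converges close to 5.0
```

This loop is what an optimizer like `torch.optim.SGD` does for you for every parameter of a network: compute the gradient of the loss, then move each parameter a small step against it.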
