What is the difference between Tensors and Variables in PyTorch?

I was looking at the docs and saw that there are both Variables and Tensors, but I don’t quite understand why we need two structures that do what looks to me like essentially the same thing. The docs even say:

Variable API is nearly the same as regular Tensor API (with the exception of a couple in-place methods, that would overwrite inputs required for gradient computation). In most cases Tensors can be safely replaced with Variables and the code will remain to work just fine.

As far as I can tell from the docs, Variables are part of the autograd package, so they can take part in gradient computation and the like. The only clear distinction I can see is having torch tensors as “placeholders” for data, but Variables can already sort of do that by setting requires_grad=False. So it just seems odd. Why not have only Tensors and nothing else, and avoid making the API more convoluted and confusing than it needs to be (even the docs for Variable aren’t complete, because they tell you to go read the Tensor API)? I am just giving my guesses; I assume there is probably some reason the code is organized the way it is, and I would like to understand it a bit better so I can wrap my head around PyTorch (coming from a Matlab/TensorFlow background).


Think of tensors and variables as the same thing. Variables are just wrappers for tensors so that you can easily auto-compute the gradients.

So if a tensor were Batman…
A Variable would be Batman but with his utility belt on… :grin:

Batman can do the job, but with his utility belt on he’s got cool gadgets to use 😉


torch tensors are actually the data.

variables wrap tensors and construct a chain of operations between the tensors, so that the gradients can flow back.

So, e.g., you create Variable a, and then add 1 to it to get b. There’s now a link stored between a and b, in the creator property of b. Then when you call .backward() on b, the gradient backpropagates, via the function in b.creator, into a.
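Here is a minimal sketch of that flow. Note that the creator attribute was later renamed grad_fn, which is the name used below so the snippet runs on current PyTorch:

    import torch
    from torch.autograd import Variable

    a = Variable(torch.ones(1), requires_grad=True)
    b = a + 1  # autograd stores a link from b back to a

    print(b.grad_fn)  # the recorded operation (the old b.creator)

    b.backward()   # the gradient flows back through that link
    print(a.grad)  # tensor([1.])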

tensors don’t have the concept of gradients, creator, etc. They purely store data, and handle operations on that data, like adding, multiplying and so on.
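A quick illustration with current PyTorch, where this data-only behaviour is what you get when requires_grad is left False:

    import torch

    x = torch.ones(3)  # a plain data tensor, requires_grad=False
    y = x * 2 + 1      # ordinary arithmetic, nothing is recorded
    print(y.grad_fn)   # None -- there is no graph to flow back through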


But why is it not possible to forward a tensor through a Sequential model? (Only a Variable works for me.)


@oblum that’s because we didn’t write the code for Tensors; we wrote it for Variable.


The Variable class is a wrapper over torch Tensors (nd-arrays in torch) that supports nearly all operations defined on tensors. PyTorch requires that any input tensor to be forward-propagated be wrapped in a Variable; this enables automatic backpropagation by simply calling the backward() method on the result.
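For instance, the usual pre-0.4 pattern looked like this (the model and layer sizes are just made up for illustration):

    import torch
    import torch.nn as nn
    from torch.autograd import Variable

    model = nn.Sequential(nn.Linear(10, 5), nn.ReLU(), nn.Linear(5, 1))

    # wrap the raw tensor so autograd can record the forward pass
    x = Variable(torch.randn(2, 10))
    out = model(x)

    # backward() walks the recorded graph back to the parameters
    out.sum().backward()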


But as of now, the Variable API has been deprecated; see
https://pytorch.org/docs/stable/autograd.html#variable-deprecated

Hence, we should not bother wrapping tensors in Variable anymore, right?
Can you confirm? Thanks!


Correct. (this text is to get past the spam filter).
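You can check this directly; in recent releases the Variable constructor just hands back an ordinary Tensor:

    import torch
    from torch.autograd import Variable

    # Variable is now a no-op shim around Tensor
    v = Variable(torch.ones(2), requires_grad=True)
    print(type(v))  # <class 'torch.Tensor'>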


Can’t tensors have a concept of gradients now? Like:

    t = torch.randn(3, 5, requires_grad=True)

Ah, Variable is now deprecated and tensors can have gradients too…
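For example, this works directly on a tensor, no wrapper needed:

    import torch

    # gradients now live directly on the tensor
    t = torch.randn(3, 5, requires_grad=True)
    loss = (t * 2).sum()
    loss.backward()
    print(t.grad)  # a 3x5 tensor filled with 2.0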
