Operation between Tensor and Variable

Suppose I do this:

import torch
from torch.autograd import Variable

x = Variable(torch.rand(3, 5))
y = torch.rand(3, 5)    # a plain Tensor, not a Variable
print(x + y)            # this raises an error

It gives an error. I can use Variable.data if I want the output to be of Tensor type (see the sketch after the list below). But I want the sum to be a Variable. How can I do this?

  1. This example is trivial. In real code things get more complicated (for example, doing an operation between a Variable and a Tensor that has come through register_buffer).
  2. I am new to PyTorch. So is there a good reference that explains all these things?
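For reference, this is what I mean by going through .data (a minimal sketch with the pre-0.4 Variable API; names match the snippet above):

import torch
from torch.autograd import Variable

x = Variable(torch.rand(3, 5))
y = torch.rand(3, 5)

s = x.data + y              # .data exposes the underlying Tensor, so this is Tensor + Tensor
print(torch.is_tensor(s))   # True: the sum is a plain Tensor, not a Variable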

Why not wrap your y into a Variable object?
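For example (a minimal sketch against the old pre-0.4 Variable API; variable names follow the question above):

import torch
from torch.autograd import Variable

x = Variable(torch.rand(3, 5))
y = torch.rand(3, 5)            # a plain Tensor

z = x + Variable(y)             # wrap y so the addition is Variable + Variable
print(isinstance(z, Variable))  # True: the sum is a Variable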

Thanks. Do I have to wrap y every time I use it?
Also, this requires the gradient with respect to y to be calculated during backward. Since y is a plain tensor (like running_var in BatchNorm), I don't want the gradient to flow through it.

The requires_grad attribute of a Variable is False by default. This means that the gradient will not flow through y:

In [2]: y =  torch.autograd.Variable(torch.Tensor(4))
In [3]: y.requires_grad
Out[3]: False
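
To make the "will not flow" point concrete, here is a small sketch (again the old Variable API; the shapes are just illustrative):

import torch
from torch.autograd import Variable

x = Variable(torch.rand(3, 5), requires_grad=True)
y = Variable(torch.rand(3, 5))   # requires_grad defaults to False

z = (x + y).sum()
z.backward()

print(x.grad)    # a (3, 5) gradient of ones
print(y.grad)    # None, since no gradient was computed for y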

I also have a question about this. Why is it designed like this?