Getting data from tensor

Trying to get data from a tensor with

  tensor.data[0]

and I get

AttributeError: 'DoubleTensor' object has no attribute 'data'

Any idea why this is?

Tensors don’t have a data attribute (Variables do). Just use tensor[0].

(Variable is a wrapper around tensor that supports automatic differentiation. Variable.data is the underlying tensor)
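
A minimal sketch of the difference, assuming a pre-0.4 PyTorch where Variable and tensor are still separate types:

  import torch
  from torch.autograd import Variable  # pre-0.4 autograd wrapper

  t = torch.FloatTensor([1.0, 2.0])
  print(t[0])          # index the tensor directly; t.data does not exist

  v = Variable(t)      # wrap the tensor for automatic differentiation
  print(type(v.data))  # torch.FloatTensor -- the underlying tensor
  print(v.data[0])     # .data works here, because v is a Variable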

Now I realize what the problem is: one of my objects is a Variable, the other is a tensor.

<class 'torch.autograd.variable.Variable'>
<class 'torch.FloatTensor'>

How can I convert from the former to the latter (autograd Variable to FloatTensor)?

EDIT:
Oh, now I understand: to go from a Variable to a tensor, you just use

 variable.data

Extra info, FWIW: from 0.5 onwards you will get an error if you try tensor[0] on a 0-dim tensor:

UserWarning: invalid index of a 0-dim tensor. This will be an error in PyTorch 0.5. Use tensor.item() to convert a 0-dim tensor to a Python number
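
For example, a minimal sketch on PyTorch 0.4+:

  import torch

  loss = torch.tensor(3.0)  # a 0-dim tensor
  # loss[0]                 # warns on 0.4, errors from 0.5 onwards
  print(loss.item())        # 3.0 -- a plain Python float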

I think in PyTorch 0.4, tensor and tensor.data are the same type now.

In [6]: type(x)
Out[6]: torch.Tensor

In [7]: type(x.data)
Out[7]: torch.Tensor

In [8]: x.__class__
Out[8]: torch.Tensor

In [9]: x.data.__class__
Out[9]: torch.Tensor

.data should be used carefully, as it detaches the tensor from the computation graph and might lead to wrong results.
It still has similar semantics to the previous versions.

It’s safer to use tensor.detach() instead.

Thanks! I’d like to ask one more question: what’s the meaning of ‘might lead to…’?

So .data is not exactly the same as .detach()?

Yes, that’s correct.
Both share the underlying data of the tensor and have requires_grad=False.
While using x.data is unrelated to the computation graph, x.detach() will have its in-place changes reported by autograd if x is needed in backward, and will raise an error if necessary.
There is an example in the Migration Guide in the “What about .data?” section.
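
Roughly, the example there boils down to this sketch (adapted from the guide):

  import torch

  # With .detach(): autograd tracks the in-place change and errors out.
  a = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
  out = a.sigmoid()
  out.detach().zero_()    # in-place change is recorded by autograd
  # out.sum().backward()  # -> RuntimeError: sigmoid needs out for backward

  # With .data: the same change goes unnoticed, so backward() silently
  # computes a wrong gradient (zeros here instead of out * (1 - out)).
  b = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
  out = b.sigmoid()
  out.data.zero_()
  out.sum().backward()
  print(b.grad)           # tensor([0., 0., 0.]) -- wrong, but no error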

Thanks, @ptrblck!

So to summarize: both are used to detach a tensor from the computation graph and return a tensor that shares the same data. The difference is that x.detach() adds a constraint: if the data is changed in-place, the backward pass will raise an error instead of running silently.

So why do we still need x.data? Is it just for historical reasons?

It’s still used, e.g. in optimizers, to update the parameters. Although it’s not recommended, there are still valid use cases for .data.
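
For illustration, a sketch of a hand-rolled SGD step via .data (newer code would typically wrap the update in torch.no_grad() instead):

  import torch

  model = torch.nn.Linear(4, 2)
  lr = 0.1

  loss = model(torch.randn(8, 4)).pow(2).mean()
  loss.backward()

  # Mutating p.data keeps the update itself out of the computation
  # graph, so autograd never records the optimizer step.
  for p in model.parameters():
      p.data -= lr * p.grad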
