When I declare:
x = Variable(torch.ones(2), requires_grad=False)
And then do: x[0]
I still get a tensor of size 1. Indexing further with x[0][0] leads to the same tensor. Is there a way to get the scalar element that is x[0]?
Hi @Mika_S,
You can use x.data to access the underlying storage tensor. As a result, what you are looking for is x.data[0].
Awesome. That's what I needed.
Keep in mind that doing this will not work with autograd, and no gradient will be backpropagated through this number.
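A minimal sketch of why this matters (assuming a recent PyTorch where Variable has been merged into Tensor): reading a value through .data bypasses autograd, so nothing done with that value can contribute to gradients.

```python
import torch

# A tensor that participates in autograd
x = torch.ones(2, requires_grad=True)

# Gradients flow through ordinary tensor ops
y = (x * 3).sum()
y.backward()
print(x.grad)  # gradient of y w.r.t. x

# Reading via .data detaches from the graph:
# the result carries no gradient history
v = x.data[0]
print(v.requires_grad)  # False
```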
You can use the torch.Tensor.item method to get the desired result,
e.g.
x.item()
Note: this will only work if x is of type torch.Tensor and contains a single element.
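Putting it together, a short runnable sketch (assuming a recent PyTorch, where indexing a 1-D tensor yields a 0-dim tensor rather than a size-1 tensor):

```python
import torch

x = torch.ones(2, requires_grad=False)

# Indexing still returns a tensor, not a Python number
print(x[0])

# .item() extracts the underlying Python scalar
val = x[0].item()
print(val)        # 1.0
print(type(val))  # <class 'float'>
```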