When I declare:
x = Variable(torch.ones(2), requires_grad=False)
And then do: x
I still get a tensor of size 1. Indexing x further leads to the same tensor. Is there a way to get the scalar element that is in x?
You can use x.data to access the underlying storage tensor (indexing a Variable always returns another Variable). So what you are looking for is x.data.
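A minimal sketch of the suggestion above. Note that on recent PyTorch versions Variable is merged into Tensor, and `.item()` is the direct way to pull out a Python scalar; on the old API, indexing `x.data` served the same purpose:

```python
import torch
from torch.autograd import Variable  # deprecated wrapper; kept for compatibility

x = Variable(torch.ones(2), requires_grad=False)

# Indexing the Variable returns another tensor-like object, not a number:
first = x[0]

# .data gives the underlying tensor, detached from autograd;
# .item() then converts the single element to a plain Python float:
scalar = x.data[0].item()
print(scalar)  # 1.0
```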
Awesome, that's what I needed.
Keep in mind that accessing .data bypasses autograd: no gradient will be backpropagated through this value.
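To illustrate the caveat, here is a small sketch (tensors are made up for the example). Anything computed from `.data` is detached from the graph, so calling backward through it would fail:

```python
import torch

y = torch.ones(2, requires_grad=True)

# Computing through .data detaches from the autograd graph:
loss = (y.data * 3).sum()
print(loss.requires_grad)  # False -> loss.backward() would raise an error

# Computing directly on y keeps the graph intact:
tracked = (y * 3).sum()
print(tracked.requires_grad)  # True
```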