# Getting new value of calculated tensor

Suppose that I created some computational graph like this

```
a = torch.autograd.Variable(torch.ones((2, 2)), requires_grad=True)
b = a * 2
```

then I update a values like this

```
a.data = a.data * 2
```

My question is: how can I get the new values of `b` (which should now be `[[4,4],[4,4]]`) without “creating” a new `b` (e.g. running `b = a*2`)?
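For context, here is a minimal sketch (using plain tensors rather than `Variable`, assuming a recent PyTorch) showing the behavior in question: autograd records the operation that produced `b`, but it does not re-evaluate `b` when `a` changes afterwards.

```python
import torch

a = torch.ones((2, 2), requires_grad=True)
b = a * 2                 # b is [[2,2],[2,2]] at this point

with torch.no_grad():     # update a without recording the update
    a.mul_(2)             # a is now [[2,2],[2,2]]

print(b)                  # still [[2,2],[2,2]] -- b is not recomputed
```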

`Variable`s are deprecated since PyTorch `0.4`, and using the `.data` attribute is not recommended: it will be removed in the future and might yield side effects.
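For reference, a minimal sketch of the post-`0.4` idiom: plain tensors carry the autograd state directly, and in-place updates to a leaf tensor go through `torch.no_grad()` rather than `.data`.

```python
import torch

# Plain tensors replace Variable since PyTorch 0.4
a = torch.ones((2, 2), requires_grad=True)
b = a * 2

# Preferred over `a.data = a.data * 2`:
with torch.no_grad():
    a.mul_(2)
```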

That being said, could you explain your use case a bit more and what you are trying to achieve?

Let’s say that my `b` is:

```
b = 2 * a
```

but to determine that `b = 2*a` and not `b = 3*a` or some other coefficient, I had to look it up in a giant table. After that I define some loss function and optimize `a` with Adam. After `a` is updated, I want to find the new value of `b` without doing the lookup again.
You could argue that I just need to save my coefficient in some variable and use that, but my calculation is really complex, and I would spend a lot of time figuring out how to rewrite everything in “tensor” form.

Edit: I am hoping that `b = 2*a` is already saved somewhere in the graph, so I can just “evaluate” it again.
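A caveat on that hope: PyTorch builds a fresh graph on every forward pass and does not keep a re-runnable expression around, so the usual workaround is to do the lookup once, keep the resulting coefficient, and wrap the forward computation in a function that can be called again after each optimizer step. A hedged sketch, where `lookup_coefficient` is a hypothetical stand-in for the giant-table lookup:

```python
import torch

def lookup_coefficient(a):
    # Hypothetical stand-in for the expensive giant-table lookup.
    return 2.0

a = torch.ones((2, 2), requires_grad=True)
coeff = lookup_coefficient(a)     # expensive: done only once

def compute_b():
    # Cheap re-evaluation: reuses the saved coefficient, not the lookup.
    return coeff * a

b = compute_b()                   # [[2,2],[2,2]]

with torch.no_grad():
    a.mul_(2)                     # stands in for an optimizer step updating a

b = compute_b()                   # [[4,4],[4,4]] without redoing the lookup
```

This keeps the lookup cost out of the training loop while letting `b` track the current value of `a`.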