I'm new to PyTorch. I see some code that always goes through .data to update a tensor in place, for example the soft update of the target network in DQN:

target_param.data.copy_(tau*local_param.data + (1.0-tau)*target_param.data)
but I also notice that other code updates tensors directly inside a torch.no_grad() block, such as this weight update:

with torch.no_grad():
    w -= w.grad
What's the difference between these two ways of updating a tensor in place?
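To make the comparison concrete, here is a minimal runnable sketch of both styles on toy tensors (the shapes, the value of tau, and the dummy loss are just illustrative, not from any real training loop):

```python
import torch

# Style 1: soft update through .data, as in the DQN target-network code.
tau = 0.1
target_param = torch.ones(3, requires_grad=True)
local_param = torch.zeros(3, requires_grad=True)
target_param.data.copy_(tau * local_param.data + (1.0 - tau) * target_param.data)
# target_param now holds 0.9 in each slot; autograd did not record this copy.

# Style 2: in-place update inside torch.no_grad().
w = torch.ones(3, requires_grad=True)
w.sum().backward()   # dummy loss, fills w.grad with ones
with torch.no_grad():
    w -= w.grad      # in-place subtraction, also invisible to autograd
# w now holds 0.0 in each slot and still has requires_grad=True.
```

Both snippets mutate the tensor without building a graph node for the update, which is why I'm unsure when one form is preferred over the other.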