I sometimes find myself in a situation where I want to detach a variable from the computational graph but keep requires_grad=True. Currently I’m doing torch.tensor(oldtensor.detach(), requires_grad=True), but this seems clumsy and (I think) copies the data, which is unnecessary.
Would it be possible to add a “requires_grad” argument to detach to specify that the detached variable should still require grad (although it is now a leaf)?
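To make the request concrete, here is a minimal sketch of what I do now versus what I'm asking for. The names x, h, and h_leaf are just placeholders, and the detach(requires_grad=True) call at the end is the hypothetical API I'm proposing, not something that exists today:

```python
import torch

x = torch.randn(3, requires_grad=True)
h = x * 2  # h is a non-leaf tensor inside the graph

# Current workaround: detach from the graph, then re-wrap to get a new
# leaf with requires_grad=True -- but torch.tensor() copies the data.
h_leaf = torch.tensor(h.detach(), requires_grad=True)

# What the proposed API would look like (hypothetical, not in PyTorch today):
# h_leaf = h.detach(requires_grad=True)
```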