Requires_grad argument to detach?

I sometimes find myself in a situation where I want to detach a variable from the computational graph but keep requires_grad=True. Currently I’m doing torch.tensor(oldtensor.detach(), requires_grad=True), but this seems clumsy and (I think) copies the data, which is unnecessary.

Would it be possible to add a “requires_grad” argument to detach to specify that the detached variable should still require grad (although it is now a leaf)?
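For context, here is a minimal sketch of my current workaround (the tensor itself is just a placeholder example):

```python
import torch

oldtensor = torch.randn(3, requires_grad=True)

# current workaround: detach, then rebuild a new leaf tensor that requires grad;
# torch.tensor() copies the underlying data, which is the part that feels unnecessary
newtensor = torch.tensor(oldtensor.detach(), requires_grad=True)

print(newtensor.is_leaf)        # True
print(newtensor.requires_grad)  # True
```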

The idiomatic way to do this is oldtensor.detach().requires_grad_(). That’s even a bit shorter than a keyword argument.
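A quick sketch of that, using a placeholder tensor:

```python
import torch

oldtensor = torch.randn(3, requires_grad=True)

# detach() returns a new tensor that shares storage with oldtensor (no copy),
# and requires_grad_() enables requires_grad in place on the new leaf
newtensor = oldtensor.detach().requires_grad_()

print(newtensor.is_leaf)        # True
print(newtensor.requires_grad)  # True
```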

Best regards

Thomas


Ah nice, thanks, didn’t know about that.