Requires_grad directly after .detach()

```python
foo = foo.detach()        # cut foo out of the current autograd graph
foo.requires_grad = True  # turn the detached result into a fresh leaf tensor
```

I came across this and am trying to understand what it is doing. Does this actually make sense? Does it mean that `foo` is disconnected from the current graph and reconnected "at a different location" once it is used again?
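For concreteness, here is a minimal sketch of what I think happens (the tensors `x`, `y`, and `z` are placeholders I made up, not part of the original code):

```python
import torch

x = torch.tensor([2.0], requires_grad=True)
y = x * 3                 # y lives in x's graph
foo = y.detach()          # foo shares data with y but has no grad history
foo.requires_grad = True  # foo becomes a new leaf tensor
z = (foo ** 2).sum()      # a new graph is built, rooted at foo
z.backward()

print(foo.grad)           # tensor([12.]) -- d(foo**2)/d(foo) = 2 * foo
print(x.grad)             # None -- backprop stopped at the detach point
```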

Is there a cleaner way to go about this?
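(One candidate I am aware of is chaining the in-place `requires_grad_()` after `detach()`, though I am not sure whether it was avoided here for a reason:)

```python
foo = foo.detach().requires_grad_()  # equivalent one-liner, if I am not mistaken
```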

Answering my own question: the snippet sits inside a loop, at the end of each iteration. So I assume its purpose is to back-propagate through only one iteration of the loop rather than through all iterations at once.
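In other words, something like the following truncated pattern (a hedged sketch; the gradient-descent setup, `target`, and the learning rate are made up for illustration):

```python
import torch

foo = torch.randn(3, requires_grad=True)
target = torch.zeros(3)

for step in range(100):
    loss = ((foo - target) ** 2).sum()
    loss.backward()                # graph only spans this iteration

    # out-of-place update: the new foo is a non-leaf in a fresh graph
    foo = foo - 0.1 * foo.grad

    # the snippet in question: cut the history accumulated so far,
    # then mark foo as a new leaf so the next backward() reaches it
    foo = foo.detach()
    foo.requires_grad = True
```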