Theano.clone feature for advanced HMC

I just want to say that I have not really used PyTorch for more than very simple stuff. I'm a hardcore old Theano user and, for about a year now, a TensorFlow user. At the moment I'm trying to implement HMC for neural networks. Theano had a great feature, theano.clone, which essentially lets you reconstruct the same functional behaviour while replacing some of the variables in the graph. Unfortunately TensorFlow lacks this, and you can track a fair number of issues raised on GitHub about it in relation to HMC. Since I'm doing this from scratch anyway (I have a NumPy implementation, and currently I use TensorFlow only for the forward and gradient computation, with the rest in NumPy), I was wondering whether PyTorch might be a better option.
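For what it's worth, my current understanding (happy to be corrected) is that because PyTorch rebuilds the graph on every forward pass, the theano.clone use case largely reduces to calling the same function again with substituted tensors. A minimal sketch of that idea, with made-up shapes and a toy function:

```python
import torch

# PyTorch builds the graph dynamically, so "clone with replacements"
# is just a second forward call with the substituted tensors.
w = torch.randn(3, requires_grad=True)
x = torch.randn(3)

def f(weight, inp):
    # toy scalar function standing in for a real log-density
    return (weight * inp).sum()

loss1 = f(w, x)
loss1.backward()                  # gradients w.r.t. w

w2 = w.detach() - 0.1 * w.grad    # a "replaced" variable built from the computation
w2.requires_grad_()
loss2 = f(w2, x)                  # same functional behaviour, new leaf variable
loss2.backward()                  # fresh graph, gradients w.r.t. w2
```

The old graph's buffers are released by `backward()` (unless `retain_graph=True`), so memory should not accumulate across iterations as long as you drop references to old outputs.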
However, this requires being able to either "replace" or "reassign" the parameters of an nn.Module with variables that come out of the computation, and also to release the memory of the old graphs (with NUTS you may need to compute a few tens of iterations, and you don't want to keep all that memory around). Additionally, even if you reassign a variable, does PyTorch know that the next time I call backward it needs to recompute everything? I did not find a good answer on how this works and was hoping someone with a good understanding of the library could weigh in.
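To make the question concrete, here is the kind of pattern I mean, sketched with an assumed toy target density and an illustrative step size (not a real HMC implementation): overwrite a module's parameters in place with tensors produced by a leapfrog-style step, then just call forward again.

```python
import torch
import torch.nn as nn

net = nn.Linear(4, 1)
x = torch.randn(8, 4)

def log_prob(model, inp):
    # placeholder target, not a real posterior
    return -(model(inp) ** 2).sum()

lp = log_prob(net, x)
lp.backward()

with torch.no_grad():
    for p in net.parameters():
        p.copy_(p + 0.01 * p.grad)   # leapfrog-style update, illustrative step size
        p.grad = None                # drop stale gradients before the next pass

lp2 = log_prob(net, x)               # forward rebuilds the graph from scratch,
lp2.backward()                       # so backward sees the updated parameters
```

If I understand correctly, the graph for `lp` is freed once its buffers are consumed by `backward()` and the reference goes out of scope, which would address the NUTS memory concern; I'd appreciate confirmation that this is the intended usage.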

Related issues: What is the recommended way to re-assign/update values in a variable (or tensor)?; Partially reset a Variable; in-place vs. new Variable