Encountering an in-place operation error in delayed online learning

There are several things going wrong here. :slight_smile:
One source of error is that the model needs the unmodified data_in, but autograd's versioning does not track regions of data_in separately: any in-place write to data_in bumps the version counter shared by all of its views, including the slice that was saved for backward. Depending on your setup, you could either use a (nested) list of tensors or generate the inputs outside the loop.
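
As a minimal sketch of that idea (with a toy model and a hypothetical make_input helper standing in for however you actually produce the data; none of these names come from your code):

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 1)   # toy model, purely illustrative
E, T = 3, 5               # assumed loop sizes

def make_input(e, t):
    # stand-in for whatever generates your inputs
    return torch.randn(4)

# Option 1: a (nested) list of separate tensors -- each entry has its own
# storage and version counter, so writing to one entry cannot invalidate
# a tensor that was already saved for backward.
inputs = [[make_input(e, t) for t in range(T)] for e in range(E)]

# Option 2: generate each input freshly inside the loop instead of slicing
# a big pre-allocated data_in buffer that gets overwritten in place later.
for e in range(E):
    for t in range(T):
        out = model(inputs[e][t])   # the model always sees an unmodified tensor
```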

If you fix this ad hoc with model(data_in[e,t].clone()), you will hit another error from backpropagation through the weights, triggered when you backprop inside the loop. I would venture that the retain_graph=True is an error here (it almost universally is, unless you know and can articulate the precise reason why it is not), and you would want to structure your computation to separate the forward and backward passes.
This is my general advice: Don’t keep stuff from the previous training (= forward + backward + optimization) step unless you have a good reason. Personally, I really try to articulate with some precision why things need to be carried over.
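
As a hedged sketch of what that could look like (toy model, optimizer, and loss standing in for yours), each step below is self-contained: a fresh forward, one backward with the default retain_graph=False, and an optimizer step, with nothing carried over to the next iteration:

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 1)                                    # illustrative model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)   # illustrative optimizer
E, T = 3, 5

inputs = [[torch.randn(4) for _ in range(T)] for _ in range(E)]
targets = [[torch.randn(1) for _ in range(T)] for _ in range(E)]

for e in range(E):
    for t in range(T):
        optimizer.zero_grad()
        out = model(inputs[e][t])                          # forward for this step only
        loss = nn.functional.mse_loss(out, targets[e][t])
        loss.backward()    # default retain_graph=False: the graph is freed right here
        optimizer.step()   # weights change only after backward, so no saved graph
                           # still references the old parameter values
```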

Best regards

Thomas
