RuntimeError: Trying to backward through the graph second time, but the buffers have already been freed — when batching the translation tutorial

Hi,
I have been trying to work through the translation tutorial http://pytorch.org/tutorials/intermediate/seq2seq_translation_tutorial.html but with batching my code is here https://github.com/vijendra-rana/Random/blob/master/translation_with_batch.py#L226. Now for this I have created some fake_data and I am trying to work through that.

When I run the program I get this error:

RuntimeError: Trying to backward through the graph second time, but the buffers have already been freed. Please specify retain_variables=True when calling backward for the first time.

I am not able to figure out how I am running loss.backward() twice — I know we cannot run it twice (to me it looks like only once) without retain_variables=True. Also, with retain_variables=True I get a similar error.

Thanks in advance for help.
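For context, here is a minimal reproduction of the error message itself (not your tutorial code — just the two-backward pattern it describes; in recent PyTorch versions the flag is spelled retain_graph rather than retain_variables):

```python
import torch

x = torch.randn(3, requires_grad=True)
y = (x ** 2).sum()

y.backward()        # first backward frees the graph's intermediate buffers
try:
    y.backward()    # second backward through the same (freed) graph
except RuntimeError as e:
    print("second backward failed:", type(e).__name__)

# Retaining the graph on the first call allows a second backward:
z = (x ** 2).sum()
z.backward(retain_graph=True)
z.backward()        # works now; gradients accumulate into x.grad
print("second backward with retain_graph succeeded")
```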

A common way to run into this issue is if you inadvertently do something like total_loss = total_loss + loss, say for reporting purposes, instead of total_loss = total_loss + loss.data[0].
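A minimal sketch of that failure mode (hypothetical tensors, not the tutorial's model; note that loss.data[0] is the pre-0.4 idiom — in current PyTorch use loss.item() or loss.detach() instead):

```python
import torch

w = torch.randn(3, requires_grad=True)

# Buggy pattern: accumulating the loss *tensor* keeps every
# iteration's graph attached to total_loss.
total_loss = 0.0
for step in range(2):
    loss = (w * 2).sum()
    loss.backward()                  # frees this iteration's graph
    total_loss = total_loss + loss   # BUG: total_loss still references the freed graph

try:
    total_loss.backward()            # backward through the freed graphs -> RuntimeError
except RuntimeError as e:
    print("got the error:", type(e).__name__)

# Fix: accumulate a plain Python number for reporting, so no graph is kept.
w.grad = None
total_loss = 0.0
for step in range(2):
    loss = (w * 2).sum()
    loss.backward()
    total_loss += loss.item()        # detached scalar, nothing to backward through
print("total_loss:", total_loss)
```

The same applies to any bookkeeping variable: anything you keep across iterations for logging should be detached from the graph, otherwise a later backward (direct or indirect) walks back through buffers that were already freed.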
