Hi, I am getting the following error. I tried to solve it by setting `inplace=False`, but that did not fix it. Thank you for your help.
---------------------------------------------------------------------------
RuntimeError Traceback (most recent call last)
<ipython-input-19-274cc5d71acf> in <module>
1 torch.autograd.set_detect_anomaly(True)
----> 2 run(ARGS)
3
<ipython-input-16-636128776031> in run(args)
26
27 elif args.train:
---> 28 trainLoss,discLoss,gLoss, valLoss,discValLoss,gValLoss = run_epoch(args, model, discriminator)
29 # return trainLoss,discLoss,gLoss, valLoss,discValLoss,gValLoss
30
<ipython-input-9-ff031d02de6a> in run_epoch(args, model, discriminator)
84
85
---> 86 ret_f, ret, disc = run_on_batch(model,discriminator,data,mask,decay,rdecay, args, optimizer,optimizer_d,epoch)#,bmi_norm)
87 print(ret_f)
88 RLoss=RLoss+ret['loss'].item()
~/Work/deep-learning-based-packet-imputation/BiGAN/biGan/bgan_i_ganOrig.ipynb in run_on_batch(model, discriminator, data, mask, decay, rdecay, args, optimizer, optimizer_d, epoch)
~/.local/lib/python3.8/site-packages/torch/tensor.py in backward(self, gradient, retain_graph, create_graph, inputs)
243 create_graph=create_graph,
244 inputs=inputs)
--> 245 torch.autograd.backward(self, gradient, retain_graph, create_graph, inputs=inputs)
246
247 def register_hook(self, hook):
~/.local/lib/python3.8/site-packages/torch/autograd/__init__.py in backward(tensors, grad_tensors, retain_graph, create_graph, grad_variables, inputs)
143 retain_graph = create_graph
144
--> 145 Variable._execution_engine.run_backward(
146 tensors, grad_tensors_, retain_graph, create_graph, inputs,
147 allow_unreachable=True, accumulate_grad=True) # allow_unreachable flag
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.cuda.FloatTensor [5, 1]], which is output 0 of TBackward, is at version 2; expected version 1 instead. Hint: the backtrace further above shows the operation that failed to compute its gradient. The variable in question was changed in there or anywhere later. Good luck!
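For context, here is a minimal sketch of what I think is happening (an assumed toy setup, not my actual model): in GAN training loops this error typically appears when `optimizer_d.step()` updates the discriminator weights in place *between* the generator forward pass and `g_loss.backward()`. The saved weight tensor (the "output 0 of TBackward", i.e. the transposed weight used to compute the input gradient) is then at version 2 when autograd expected version 1. Recomputing the discriminator forward pass after the discriminator step avoids the stale saved tensor:

```python
import torch

torch.manual_seed(0)
gen = torch.nn.Linear(3, 3)
disc = torch.nn.Linear(3, 1)
opt_g = torch.optim.SGD(gen.parameters(), lr=0.1)
opt_d = torch.optim.SGD(disc.parameters(), lr=0.1)
x = torch.randn(5, 3)

# Broken ordering: the generator loss is built BEFORE opt_d.step(),
# but backward() runs AFTER it, so the saved disc weights are stale.
fake = gen(x)
g_loss = -disc(fake).mean()           # graph saves disc.weight (version 1)

d_loss = disc(fake.detach()).mean()   # discriminator update on detached fakes
opt_d.zero_grad()
d_loss.backward()
opt_d.step()                          # in-place weight update -> version 2

reproduced = False
try:
    g_loss.backward()                 # RuntimeError: modified by an inplace operation
except RuntimeError:
    reproduced = True

# Fix: recompute the discriminator forward pass AFTER opt_d.step(),
# so the generator's graph saves the current weight version.
g_loss = -disc(gen(x)).mean()
opt_g.zero_grad()
g_loss.backward()                     # succeeds
opt_g.step()
```

If this matches your case, the cure is to reorder the batch so the generator's `backward()` happens before the discriminator's `optimizer.step()`, or to re-run `disc(fake)` after the step as above; `inplace=False` on activation layers does not help because the in-place mutation is the optimizer's weight update, not a ReLU.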