Actually, it looks like that worked for simple cases, but now that I'm running a more complex model it gives me:
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation
I ran with torch.autograd.set_detect_anomaly(True) and it does look like it's the copy operation; specifically, I'm wrapping a batch_norm call in an in-place copy_, as you can see in this block of the stack trace.
File "/home/my_layer.py", line 431, in forward
mainOut.copy_( self.batchNormLayer(mainOut))
File "/home/me/anaconda3/envs/fastai/lib/python3.7/site-packages/torch/nn/modules/module.py", line 541, in __call__
result = self.forward(*input, **kwargs)
File "/home/me/anaconda3/envs/fastai/lib/python3.7/site-packages/torch/nn/modules/batchnorm.py", line 81, in forward
exponential_average_factor, self.eps)
File "/home/me/anaconda3/envs/fastai/lib/python3.7/site-packages/torch/nn/functional.py", line 1670, in batch_norm
training, momentum, eps, torch.backends.cudnn.enabled
Traceback (most recent call last):
File "main.py", line 617, in <module>
main()
File "main.py", line 144, in main
main_worker(args.gpu, ngpus_per_node, args)
File "main.py", line 319, in main_worker
train(train_loader, model, criterion, optimizer, scheduler, epoch, args)
File "main.py", line 431, in train
loss.backward()
File "/home/me/anaconda3/envs/fastai/lib/python3.7/site-packages/torch/tensor.py", line 166, in backward
torch.autograd.backward(self, gradient, retain_graph, create_graph)
File "/home/me/anaconda3/envs/fastai/lib/python3.7/site-packages/torch/autograd/__init__.py", line 99, in backward
allow_unreachable=True) # allow_unreachable flag
RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.cuda.FloatTensor [64, 512, 7, 7]], which is output 0 of CudnnConvolutionBackward, is at version 3; expected version 0 instead. Hint: the backtrace further above shows the operation that failed to compute its gradient. The variable in question was changed in there or anywhere later. Good luck!
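If it helps to see the pattern in isolation, here is a minimal sketch I think reproduces the same failure (toy names, not the actual my_layer.py code): copy_ writes into a tensor that an earlier op saved for its backward pass, which bumps that tensor's version counter, and backward() then rejects it. Rebinding the name instead of copying in place avoids the problem.

```python
import torch
import torch.nn as nn

bn = nn.BatchNorm1d(4)
x = torch.randn(8, 4, requires_grad=True)

# Broken: exp() saves its output for backward; copy_ then mutates
# that same tensor in place, so its version counter no longer matches.
out = x.exp()
out.copy_(bn(out))
try:
    out.sum().backward()
except RuntimeError as e:
    print("in-place version fails:", type(e).__name__)

# Fixed: assign the batch-norm result to a new tensor instead of
# overwriting the saved one. Autograd's saved activation is untouched.
x2 = torch.randn(8, 4, requires_grad=True)
out2 = x2.exp()
out2 = bn(out2)          # out-of-place, just rebind the name
out2.sum().backward()
print("out-of-place version ok, grad shape:", tuple(x2.grad.shape))
```

So in the forward above, replacing mainOut.copy_(self.batchNormLayer(mainOut)) with mainOut = self.batchNormLayer(mainOut) should be equivalent in value but safe for autograd, assuming nothing else relies on mainOut keeping the same storage.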