Illegal memory access caused by clamp?

(Wei Wang) #1

I have created my own autograd function with a custom forward and backward. The error message says the failure occurs in the backward pass at a clamp operation. Is the tensor.clamp(min=1.e-5) operation forbidden in the backward pass?

THCudaCheck FAIL file=/opt/conda/conda-bld/pytorch_1549635019666/work/aten/src/THC/generated/…/THCReduceAll.cuh line=317 error=77 : an illegal memory access was encountered
Traceback (most recent call last):
File "", line 347, in
File "", line 209, in train
File "/home/anaconda3/lib/python3.7/site-packages/torch/", line 102, in backward
torch.autograd.backward(self, gradient, retain_graph, create_graph)
File "/home/anaconda3/lib/python3.7/site-packages/torch/autograd/", line 90, in backward
allow_unreachable=True) # allow_unreachable flag
File "/home/anaconda3/lib/python3.7/site-packages/torch/autograd/", line 76, in apply
return self._forward_cls.backward(self, *args)
File "/home/pycharm/pytorch-cifar/", line 96, in backward
denominator = torch.norm(
File "/home/anaconda3/lib/python3.7/site-packages/torch/", line 713, in norm
return torch._C._VariableFunctions.frobenius_norm(input)
RuntimeError: cuda runtime error (77) : an illegal memory access was encountered at /opt/conda/conda-bld/pytorch_1549635019666/work/aten/src/THC/generated/…/THCReduceAll.cuh:317

(Thomas V) #2

Most likely, you're seeing an earlier error: CUDA kernel launches are asynchronous, so the failure is only reported at a later call such as this norm. Run with blocking launches (CUDA_LAUNCH_BLOCKING=1) to get the actual location of the error.

Best regards
