Torch.exp() is modified by an inplace operation

My torch version is 1.5.1. I found some strange behavior of torch.exp():

import torch
x = torch.ones(2, 1, requires_grad=True)  # leaf tensor
x = torch.exp(x)                          # x is now the output of exp, not the leaf
x[0] = 0                                  # in-place modification of that output
out = x.mean()
out.backward()                            # raises the error below

After running this code, I got a runtime error:

RuntimeError: one of the variables needed for gradient computation has been modified by an inplace operation: [torch.FloatTensor [2, 1]], which is output 0 of ExpBackward, is at version 1; expected version 0 instead. Hint: enable anomaly detection to find the operation that failed to compute its gradient, with torch.autograd.set_detect_anomaly(True).
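
Following the hint in the message, enabling anomaly detection should make the error also report where the failing operation was called in the forward pass (a minimal sketch using the reproducer above):

import torch
torch.autograd.set_detect_anomaly(True)  # record forward tracebacks for autograd errors
x = torch.ones(2, 1, requires_grad=True)
x = torch.exp(x)
x[0] = 0
out = x.mean()
out.backward()  # should now additionally point at the exp call in the forward pass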

However, if I replace torch.exp() with torch.log() or torch.sin() in the code, it works fine. Can anyone help me find the solution? Thank you very much.

Edit: One thing I forgot to mention: if I add x = x + 0 before x[0] = 0, the code also works.
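
For reference, the full working variant with that extra line:

import torch
x = torch.ones(2, 1, requires_grad=True)
x = torch.exp(x)
x = x + 0   # out-of-place op: the name x now refers to a new tensor
x[0] = 0    # modifies the new tensor, not the output of exp
out = x.mean()
out.backward()  # runs without error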

So I covered some of that along with many “little mysteries” in a course recently (slides 28-35). From the summary slide:

There are two main situations where inplace operations cannot be used:

  • When operating on a leaf tensor: the leaf tensor would be moved into the graph, which would be bad (see the sketch right after this list).
  • When the operation before the inplace one needs its result to compute the backward. Whether this is the case is not easy to tell from the outside, unfortunately.
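
For the first case, a minimal illustration (the exact error text may vary between versions):

import torch
x = torch.ones(2, 1, requires_grad=True)  # a leaf tensor that requires grad
x.add_(1)  # in-place op directly on the leaf
# RuntimeError: a leaf Variable that requires grad is being used in an in-place operation.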

You are in the second case. What happens is that exp needs its output to compute the backward (the backward multiplies by the output, since d/dx exp(x) = exp(x)), while e.g. log needs its input (the backward divides by the input, since d/dx log(x) = 1/x). This is the “not always easy to tell from the outside” part.
If you do x = x + 0 you create a new tensor (which happens to be numerically identical to the old x), assign that to the name x, and modify the new tensor inplace, so exp finds its saved output unmodified. This is, btw., what the ctx.save_for_backward(...) / ctx.saved_tensors exercise in autograd.Functions is about.
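
To make that concrete, here is a minimal sketch of the two saving patterns as custom autograd.Functions (MyExp and MyLog are made-up names for illustration, not from the slides):

import torch

class MyExp(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        y = torch.exp(x)
        ctx.save_for_backward(y)  # saves the *output*: d/dx exp(x) = exp(x) = y
        return y

    @staticmethod
    def backward(ctx, grad_out):
        y, = ctx.saved_tensors    # version check happens here
        return grad_out * y

class MyLog(torch.autograd.Function):
    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)  # saves the *input*: d/dx log(x) = 1 / x
        return torch.log(x)

    @staticmethod
    def backward(ctx, grad_out):
        x, = ctx.saved_tensors
        return grad_out / x

x = torch.ones(2, 1, requires_grad=True)
y = MyExp.apply(x)
y[0] = 0                  # bumps the version counter of the saved output
y.mean().backward()       # fails just like the built-in exp

Because MyExp saves its output, the version-counter check fires if that output is modified in place before backward; MyLog only saves its input, so its output can be changed freely.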

The PDF of the slides (and a Jupyter notebook to run the code) is on GitHub, but apparently there is not enough interest for a video.

Of course, the authoritative in-depth reference is @ptrblck’s and my imaginary book from which I made the slides for the talk.

Best regards

Thomas

Hello Thomas. Thanks for your detailed response. It is really helpful to me.

+1 for the video if possible :)
