Gradient computation sees a variable modified but it doesn't seem o

Hi iftg!

Most likely, the output of forward() is being modified inplace after
forward() has run.
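
For concreteness, here is a minimal sketch of that pattern. (The module,
its Sigmoid activation, and the caller's code are my own guesses for
illustration, not your actual code.)

```python
import torch

class Net(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.linear = torch.nn.Linear(4, 4)
        self.activation = torch.nn.Sigmoid()

    def forward(self, x):
        x = self.linear(x)
        x = self.activation(x)
        return x

net = Net()
out = net(torch.randn(2, 4))
out += 1.0             # inplace modification of forward()'s output
out.sum().backward()   # RuntimeError: ... modified by an inplace operation
```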

self.activation(x) isn't causing the inplace-modification error, per se.
Rather, the presence of self.activation(x) in the computation graph is
what causes an inplace modification (one that happens with or without
the call to self.activation(x)) to matter.
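
To see why the presence of the activation is what makes the inplace
modification matter: an activation such as Sigmoid saves its output for
the backward pass, while, say, a multiply-by-a-constant does not, so the
same inplace edit is harmless in one case and fatal in the other. A
minimal illustration (again, not your code):

```python
import torch

# Without an activation: multiplying by a constant does not save its
# output for backward, so the inplace edit goes unnoticed.
x = torch.ones(3, requires_grad=True)
y = 2.0 * x
y += 1.0
y.sum().backward()     # works; x.grad is all 2.0

# With an activation: sigmoid saves its output for backward
# (grad = y * (1 - y)), so the same inplace edit breaks the backward pass.
x = torch.ones(3, requires_grad=True)
z = torch.sigmoid(x)
z += 1.0
z.sum().backward()     # RuntimeError: ... modified by an inplace operation
```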

There is a reasonable chance that adding a .clone(), specifically
x = self.activation(x).clone(), will fix your problem.
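
In terms of the sketch above, that would be a one-line change in
forward() (assuming, of course, that your forward() looks roughly like
my guess):

```python
    def forward(self, x):
        x = self.linear(x)
        # .clone() gives downstream inplace operations their own tensor to
        # modify, leaving the output that Sigmoid saved for backward untouched.
        x = self.activation(x).clone()
        return x
```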

Alternatively, you can ask pytorch to sweep this inplace-modification
error under the rug for you.
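
If you do want to go that route, one such mechanism (assuming a
reasonably recent pytorch) is the allow_mutation_on_saved_tensors()
context manager, which clones a saved tensor behind the scenes when it
is about to be mutated, so the backward pass still sees the original
values:

```python
import torch

x = torch.ones(3, requires_grad=True)

# Forward, inplace modification, and backward all run inside the context.
with torch.autograd.graph.allow_mutation_on_saved_tensors():
    y = torch.sigmoid(x)
    y += 1.0             # would normally break the backward pass
    y.sum().backward()   # works here, at the cost of an extra copy

print(x.grad)
```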

Sometimes these errors are a symptom of something incorrect or
sub-optimal in what you are doing. If you want to track down the root
cause, take a look at the debugging techniques in the following post:
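
One technique such debugging posts typically lean on is autograd's
anomaly-detection mode, which adds a second stack trace pointing at the
forward-pass operation whose saved tensor was later modified inplace. A
minimal sketch:

```python
import torch

# Anomaly detection makes the backward-pass error report the forward-pass
# operation (here, sigmoid) whose saved tensor was modified inplace.
with torch.autograd.detect_anomaly():
    x = torch.ones(3, requires_grad=True)
    y = torch.sigmoid(x)
    y += 1.0
    y.sum().backward()   # error now includes the forward trace of sigmoid
```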

Best.

K. Frank