Error with view, no_grad and inplace modify?

I am trying to do this:

with torch.no_grad():
    for param in net.parameters():
        for j in param.flatten():
            print("current j", j)
            j += 1

However I get this error:

RuntimeError: A view was created in no_grad mode and its base or another view of its base has been modified inplace with grad mode enabled. Given that this use case is ambiguous and error-prone, it is forbidden. You can clarify your code by moving both the view and the inplace either both inside the no_grad block (if you don’t want the inplace to be tracked) or both outside (if you want the inplace to be tracked).

Can someone explain what has grad mode enabled? The whole block is under no_grad(). Also, the first iteration runs fine and I get this output:

current j tensor(0.0919, requires_grad=True)
current j

However in the second iteration of the loop I get the error message above. Why is this happening only in the second iteration?

I’m able to reproduce the issue using your posted code and don’t know why the error is raised, since all operations are already inside the no_grad guard.
However, the print statement also seems to be involved, as removing it gets rid of the error:

import torch
import torch.nn as nn

net = nn.Sequential(
    nn.Linear(10, 10),
    nn.ReLU(),
    nn.Linear(10, 10),
    nn.ReLU(),
    nn.Linear(10, 10),
    nn.ReLU(),
)

with torch.no_grad():
    for param in net.parameters():
        for j in param.flatten():
            # print("current j", j)
            j += 1
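As a side note, a workaround that avoids the problem entirely is to skip the per-element views from flatten() and apply the in-place op to the whole parameter tensor instead. This is a minimal sketch (the network architecture here is just an example, not the original poster's model):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
net = nn.Sequential(
    nn.Linear(10, 10),
    nn.ReLU(),
    nn.Linear(10, 10),
)

before = [p.clone() for p in net.parameters()]

with torch.no_grad():
    for param in net.parameters():
        # Operate on the parameter tensor itself rather than on scalar
        # views produced by flatten(); since no view tensors are created
        # inside the no_grad block, the view/in-place ambiguity the error
        # complains about cannot occur.
        param += 1
```

This is also much faster, since it uses one vectorized in-place add per parameter instead of a Python loop over every element.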

@albanD do you know if this is a new regression? Let me know if you want me to create an issue to track it.

Oh wow, this is super fishy… Yes, we should definitely open an issue with this small repro!

Created #99968 to track it.
