Leaf variable was used in an inplace operation

Thanks, this worked for me. I had missed wrapping the parameter update in:

with torch.no_grad():
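
For context, a minimal sketch of the kind of manual update this refers to (the model and learning rate are illustrative, not from the original post):

```python
import torch

# Illustrative model and learning rate; any manual parameter update works the same way.
model = torch.nn.Linear(4, 2)
lr = 0.1

loss = model(torch.randn(8, 4)).sum()
loss.backward()

# Without the no_grad() context, the in-place update below raises
# "a leaf Variable that requires grad is being used in an in-place operation".
with torch.no_grad():
    for p in model.parameters():
        p -= lr * p.grad   # in-place update, hidden from autograd
        p.grad = None      # clear gradients for the next iteration
```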

Replacing

x += 1

with

x = x + 1

will work and achieve the desired effect (at the expense of creating a temporary tensor and an extra assignment).
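
A self-contained repro of the difference (the tensor here is just an example):

```python
import torch

x = torch.ones(3, requires_grad=True)

# x += 1  # would raise: a leaf Variable that requires grad
#         # is being used in an in-place operation

x = x + 1          # out-of-place: builds a new tensor and rebinds the name
print(x.is_leaf)   # False: x is now an intermediate node in the graph
```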


Hi, thank you for the detailed reply. However, there is no more concrete definition of leaf variables than the one you've provided here. In fact, the documentation link you provided says that leaf variables are the tensors that have requires_grad set to False.
This means the documentation is wrong (I checked as well).
Please correct the documentation if I'm right.
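
For reference, a quick check of is_leaf on a few tensors (my reading of the current semantics):

```python
import torch

a = torch.randn(3)                       # requires_grad=False
print(a.is_leaf)                         # True: non-grad tensors are leaves by convention

b = torch.randn(3, requires_grad=True)   # created directly by the user
print(b.is_leaf)                         # True: user-created, despite requires_grad=True

c = b * 2                                # produced by an operation on b
print(c.is_leaf)                         # False: c has a grad_fn
```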

In the pixelCNN code (for example pixelcnn-pytorch/masked_cnn_layer.py at master · axeloh/pixelcnn-pytorch · GitHub) one can multiply a network parameter (which is a leaf node with requires_grad=True) by a mask tensor without getting any error. How is this possible?

You mean the line self.weight.data *= self.mask below?
It uses the long-deprecated .data attribute, which, similar to the recommended with torch.no_grad():, disconnects the modification from autograd. This is analogous to how the optimizer update is not seen by autograd.
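
Concretely, both idioms hide the in-place mask multiplication from autograd; a minimal sketch (the mask pattern below is illustrative, not the actual PixelCNN mask):

```python
import torch
import torch.nn as nn

conv = nn.Conv2d(1, 1, kernel_size=3, padding=1)
mask = torch.ones_like(conv.weight)
mask[..., 2, :] = 0   # example: zero the bottom row of the 3x3 kernel

conv.weight.data *= mask   # deprecated .data: modification bypasses autograd

with torch.no_grad():      # recommended equivalent
    conv.weight *= mask    # in-place on a leaf, but not recorded by autograd
```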

Best regards

Thomas
