Is slicing an in-place operation? How should I do it if I need to modify part of the matrix?

Why is this line an in-place operation? How can I make it not in-place?

perturbed[masks] = perturbed[masks] + (gradient * epsilon)[masks]
perturbed[masks] = perturbed[masks].clamp(0, 255)


RuntimeError: a leaf Variable that requires grad has been used in an in-place operation.

PyTorch is complaining that I am modifying, in place, a variable that requires gradients. However, I could not figure out how to do it any other way. Can someone kindly help?

It depends on whether you want gradients to backpropagate through the update.
It looks like you’re doing a masked gradient step, and such updates are typically not backpropagated through.
In that case, just wrap the two statements in with torch.no_grad():. You could also check whether += and the in-place clamp_ are faster.
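A minimal sketch of that no_grad approach, assuming toy stand-ins for your perturbed, gradient, masks, and epsilon (your real shapes and values will differ):

```python
import torch

# Hypothetical setup mirroring the question: a leaf tensor that
# requires grad, a precomputed gradient, and a boolean mask.
perturbed = (torch.rand(3, 3) * 255).requires_grad_()
gradient = torch.randn(3, 3)
masks = torch.rand(3, 3) > 0.5
epsilon = 0.1

# Inside no_grad, in-place updates to a leaf that requires grad are
# allowed, because autograd is not tracking them (this is exactly how
# optimizers update parameters).
with torch.no_grad():
    perturbed[masks] += (gradient * epsilon)[masks]
    perturbed.clamp_(0, 255)
```

Outside the with block, perturbed still has requires_grad=True and is still a leaf, so the next forward pass records gradients as usual.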
If you want to propagate through to the previous version of perturbed, you would need to do the masking first:

update = torch.zeros_like(perturbed)  # doesn't require grad
update[masks] = (gradient * epsilon)[masks] # because update didn't require grad, this works with autograd
perturbed = (perturbed + update).clamp(0, 255) # we replace the name perturbed with a new version, so no inplace business
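As a quick sanity check that this version really does backpropagate, here is a self-contained toy (the names original, gradient, masks, and epsilon are hypothetical stand-ins):

```python
import torch

original = torch.rand(4, requires_grad=True)   # leaf that requires grad
gradient = torch.randn(4)
masks = torch.tensor([True, False, True, False])
epsilon = 0.1

update = torch.zeros_like(original)            # requires_grad=False by default
update[masks] = (gradient * epsilon)[masks]    # in-place on a no-grad tensor: fine
perturbed = (original + update).clamp(0, 255)  # new tensor, nothing in-place

perturbed.sum().backward()
# Gradients flowed back through the out-of-place ops to the leaf.
assert original.grad is not None
```

Because perturbed is a fresh tensor rather than a mutated leaf, autograd can trace the addition and the out-of-place clamp all the way back to original.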

Best regards