In-place error in autograd function - PyTorch

No, you shouldn’t use the no_grad() guard if you need to compute gradients. This context manager is used during validation or testing to save memory: the intermediate tensors needed for the backward pass are not stored, so gradients cannot be computed for operations executed inside it.
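A minimal sketch of the difference, assuming a toy `torch.nn.Linear` model for illustration:

```python
import torch

model = torch.nn.Linear(4, 2)  # hypothetical toy model
x = torch.randn(3, 4)

# Training-style forward pass: the graph is recorded,
# so the output carries gradient information.
out_train = model(x)
print(out_train.requires_grad)  # True

# Validation-style forward pass inside no_grad():
# no graph is built, saving memory, but backward() is impossible.
with torch.no_grad():
    out_eval = model(x)
print(out_eval.requires_grad)  # False
print(out_eval.grad_fn)        # None
```

Calling `out_eval.sum().backward()` would raise a RuntimeError, since no gradient information was recorded for `out_eval`.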